The Cauchy-Schwarz inequality asserts that on any measure space $(X, \mu)$, and any measurable $f, g: X \to \mathbb{C}$, one has
$$\left|\int_X f \overline{g}\, d\mu\right| \leq \left(\int_X |f|^2\, d\mu\right)^{1/2} \left(\int_X |g|^2\, d\mu\right)^{1/2}.$$
Thus for instance one has
$$\left|\sum_{n=1}^N a_n \overline{b_n}\right| \leq \left(\sum_{n=1}^N |a_n|^2\right)^{1/2} \left(\sum_{n=1}^N |b_n|^2\right)^{1/2}$$
for any complex numbers $a_1,\dots,a_N,b_1,\dots,b_N$.
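As a quick numerical sanity check, the finite-sequence form of the Cauchy-Schwarz inequality can be verified on random data; the vectors below are illustrative test data, not taken from the text:

```python
import random

# Check |sum_n a_n * conj(b_n)| <= (sum |a_n|^2)^{1/2} * (sum |b_n|^2)^{1/2}
# on random complex sequences (hypothetical test data).
random.seed(0)
N = 100
a = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
b = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

lhs = abs(sum(x * y.conjugate() for x, y in zip(a, b)))
rhs = (sum(abs(x) ** 2 for x in a) ** 0.5) * (sum(abs(y) ** 2 for y in b) ** 0.5)
assert lhs <= rhs

# Equality holds when the sequences are parallel, e.g. b = 2a:
b_par = [2 * x for x in a]
lhs_par = abs(sum(x * y.conjugate() for x, y in zip(a, b_par)))
rhs_par = (sum(abs(x) ** 2 for x in a) ** 0.5) * (sum(abs(y) ** 2 for y in b_par) ** 0.5)
assert abs(lhs_par - rhs_par) < 1e-9
```

The second check illustrates the equality case: when one sequence is a scalar multiple of the other, the inequality is attained exactly (up to floating-point error).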
This inequality is useful for decoupling an expression involving two functions $f$ and $g$, replacing that expression with one involving just $f$, and one involving just $g$. Only one of these latter expressions needs to be small (and the other one bounded) in order for the original expression to be small. Thus, one can focus on estimating expressions involving just $f$ (say), effectively eliminating $g$ from view.
Consider the following two ways of measuring the "size" of vectors $x = (x_1,\dots,x_n)$ in $\mathbb{R}^n$. The $\ell^1$-norm of a vector is defined as $\|x\|_1 := \sum_{i=1}^n |x_i|$, and the $\ell^2$-norm is defined as $\|x\|_2 := \left(\sum_{i=1}^n |x_i|^2\right)^{1/2}$.
What is the relationship between these two norms? It follows from the triangle inequality that the $\ell^1$-norm is always at least as large as the $\ell^2$-norm. How much bigger can it be?
The answer can be found through an application of the Cauchy-Schwarz inequality to the sequences $(|x_i|)_{i=1}^n$ and $(1)_{i=1}^n$:
$$\|x\|_1 = \sum_{i=1}^n |x_i| \cdot 1 \leq \left(\sum_{i=1}^n |x_i|^2\right)^{1/2} \left(\sum_{i=1}^n 1^2\right)^{1/2} = \sqrt{n}\, \|x\|_2.$$
Moreover, the factor $\sqrt{n}$ in the above bound is the best possible: to see this, plug in the vector $x = (1, 1, \dots, 1)$.
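The norm comparison above, and the sharpness of the $\sqrt{n}$ factor for the all-ones vector, can be checked numerically; the dimension and random data below are illustrative assumptions:

```python
import math
import random

# Check ||x||_2 <= ||x||_1 <= sqrt(n) * ||x||_2 on a random vector,
# and that the all-ones vector attains the sqrt(n) factor exactly.
random.seed(0)
n = 50
x = [random.gauss(0, 1) for _ in range(n)]

l1 = sum(abs(t) for t in x)
l2 = math.sqrt(sum(t * t for t in x))
assert l2 <= l1 <= math.sqrt(n) * l2  # triangle inequality / Cauchy-Schwarz

ones = [1.0] * n
l1_ones = sum(ones)                            # equals n
l2_ones = math.sqrt(sum(t * t for t in ones))  # equals sqrt(n)
assert abs(l1_ones - math.sqrt(n) * l2_ones) < 1e-9
```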
(Counting 4-cycles in graphs)
(Some instance of the large sieve inequality)
The Cauchy-Schwarz inequality is efficient as long as one expects $f$ and $g$ to behave in a roughly "parallel" manner. If instead they behave in an "orthogonal" manner, then the inequality is quite lossy.
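This lossiness in the orthogonal case can be seen in the simplest possible example, sketched here with two hypothetical orthogonal vectors:

```python
# For orthogonal sequences, the left side of Cauchy-Schwarz vanishes
# while the right side stays bounded away from zero, so the bound
# captures none of the cancellation.
a = [1.0, 0.0]
b = [0.0, 1.0]
lhs = abs(sum(x * y for x, y in zip(a, b)))
rhs = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
assert lhs == 0.0 and rhs == 1.0
```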
Another useful tool for decoupling is the arithmetic mean-geometric mean inequality
$$|ab| \leq \frac{1}{2} |a|^2 + \frac{1}{2} |b|^2,$$
or (slightly more generally)
$$|ab| \leq \frac{\varepsilon}{2} |a|^2 + \frac{1}{2\varepsilon} |b|^2$$
for any complex numbers $a, b$ and any $\varepsilon > 0$, where $\varepsilon$ is a parameter one can optimize in later.
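The weighted form of the arithmetic mean-geometric mean inequality, and the effect of optimizing the parameter $\varepsilon$, can be checked numerically; the particular numbers below are illustrative assumptions:

```python
import random

# Check |ab| <= (eps/2)|a|^2 + (1/(2*eps))|b|^2 for several eps > 0,
# and that the choice eps = |b|/|a| makes the bound exact.
random.seed(0)
a = complex(random.gauss(0, 1), random.gauss(0, 1))
b = complex(random.gauss(0, 1), random.gauss(0, 1))

for eps in (0.1, 0.5, 1.0, 2.0, 10.0):
    bound = (eps / 2) * abs(a) ** 2 + (1 / (2 * eps)) * abs(b) ** 2
    assert abs(a * b) <= bound + 1e-12

# Optimizing eps recovers |ab| exactly: the two terms balance.
eps_opt = abs(b) / abs(a)
best = (eps_opt / 2) * abs(a) ** 2 + (1 / (2 * eps_opt)) * abs(b) ** 2
assert abs(best - abs(a * b)) < 1e-9
```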
(Discussion of converse Cauchy-Schwarz).
See also "A tiny remark about the Cauchy-Schwarz inequality" by Tim Gowers.