In the course of proving some statement, you may want to introduce a parameter (e.g. if you wish to divide into cases according to whether a certain quantity $N$ is small or large, one may introduce a parameter $\lambda$ and divide into the cases $N \leq \lambda$ and $N > \lambda$). But you don't yet know what the "best" choice of this new parameter is. In many cases, what one can do is simply leave the parameter unspecified and continue the argument with this undetermined parameter. At a much later stage of the argument, it may become clearer what metric one should use to decide which choices of parameter are "good" and which are "bad", at which point one can optimize in that metric and find the right choice of parameter.
Suppose one wishes to establish the Cauchy-Schwarz inequality
$$\sum_{i=1}^n a_i b_i \leq \Big(\sum_{i=1}^n a_i^2\Big)^{1/2} \Big(\sum_{i=1}^n b_i^2\Big)^{1/2}$$
for non-negative real numbers $a_1,\ldots,a_n$ and $b_1,\ldots,b_n$, where $n \geq 1$.
A first attempt would be to use the arithmetic mean-geometric mean inequality
$$a_i b_i \leq \frac{1}{2} a_i^2 + \frac{1}{2} b_i^2 \qquad (1)$$
but when one substitutes this in and works everything out, one gets the inequality
$$\sum_{i=1}^n a_i b_i \leq \frac{1}{2} \sum_{i=1}^n a_i^2 + \frac{1}{2} \sum_{i=1}^n b_i^2$$
which is inferior to Cauchy-Schwarz (by another application of the AM-GM inequality!).
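As a quick sanity check (not part of the original argument), one can compare the two bounds numerically on sample data: both the naive AM-GM bound $\frac{1}{2}\sum a_i^2 + \frac{1}{2}\sum b_i^2$ and the Cauchy-Schwarz bound are valid upper bounds for $\sum a_i b_i$, but the naive one is larger. The sample values here are arbitrary.

```python
import math

# Arbitrary sample data for illustration
a = [1.0, 2.0, 3.0]
b = [10.0, 10.0, 10.0]

lhs = sum(x * y for x, y in zip(a, b))                    # sum a_i b_i
naive = 0.5 * sum(x * x for x in a) + 0.5 * sum(y * y for y in b)
cs = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))

# Both are valid upper bounds, but the naive AM-GM bound is weaker (larger)
assert lhs <= cs <= naive
```

When the vectors differ greatly in magnitude (as above), the gap between the two bounds becomes dramatic, which is exactly what the rescaling by $\lambda$ is designed to fix.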
However, one can be cleverer about this by multiplying $a_i$ by an unspecified parameter $\lambda > 0$ and dividing $b_i$ by the same parameter, which when inserted into (1) gives the more general inequality
$$a_i b_i \leq \frac{\lambda^2}{2} a_i^2 + \frac{1}{2\lambda^2} b_i^2. \qquad (2)$$
It is not yet clear what value of $\lambda$ is best (though we already know that $\lambda = 1$ does not always work). But one can forge on regardless. Summing (2) in $i$, one obtains
$$\sum_{i=1}^n a_i b_i \leq \frac{\lambda^2}{2} \sum_{i=1}^n a_i^2 + \frac{1}{2\lambda^2} \sum_{i=1}^n b_i^2. \qquad (3)$$
And now it is clear what to do to optimize in $\lambda$: one should choose $\lambda$ so that the right-hand side is as small as possible. A little calculus (see also the heuristic "To optimize a sum, try making the terms roughly equal in size") then shows that the optimal choice of $\lambda$ is
$$\lambda = \Big(\sum_{i=1}^n b_i^2\Big)^{1/4} \Big/ \Big(\sum_{i=1}^n a_i^2\Big)^{1/4},$$
at least when $\sum_{i=1}^n a_i^2$ and $\sum_{i=1}^n b_i^2$ are both non-zero (but the case when $\sum_{i=1}^n a_i^2 = 0$ or $\sum_{i=1}^n b_i^2 = 0$ can be easily handled separately). Inserting this choice of $\lambda$ into (3) gives the desired inequality.
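The optimization step can be sketched numerically. Writing $A$ for $\sum a_i^2$ and $B$ for $\sum b_i^2$ (names chosen here for illustration), the right-hand side of the summed bound is $F(\lambda) = \frac{\lambda^2}{2} A + \frac{1}{2\lambda^2} B$, which is minimized at $\lambda = (B/A)^{1/4}$, where it collapses to $\sqrt{AB}$, i.e. exactly the Cauchy-Schwarz right-hand side:

```python
import math

# Arbitrary sample data for illustration
a = [1.0, 2.0, 3.0]
b = [4.0, 0.5, 2.0]
A = sum(x * x for x in a)   # sum of a_i^2
B = sum(y * y for y in b)   # sum of b_i^2

def F(lam):
    """Right-hand side of the summed bound, as a function of lambda."""
    return 0.5 * lam**2 * A + 0.5 * B / lam**2

lam_opt = (B / A) ** 0.25   # the optimal choice of lambda

# At the optimum, the two terms are equal and the bound equals sqrt(A*B),
# the Cauchy-Schwarz right-hand side
assert math.isclose(F(lam_opt), math.sqrt(A * B))
# Any other lambda gives a weaker (larger) bound
assert all(F(lam_opt) <= F(lam) + 1e-12 for lam in (0.5, 1.0, 2.0, 5.0))
```

Note that at the optimal $\lambda$ the two terms $\frac{\lambda^2}{2} A$ and $\frac{1}{2\lambda^2} B$ are equal, in accordance with the "make the terms roughly equal in size" heuristic.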
This is a special case of "If you don't know how to make a decision, then don't make it".