
To optimize a sum try making the terms roughly equal in size

This article is a stub. This means that it cannot be considered to contain or lead to any mathematically interesting information.

Quick description

Suppose one wants to find the parameter \lambda that minimizes a quantity f(\lambda) + g(\lambda), where f is a non-negative increasing function of \lambda and g is a non-negative decreasing function of \lambda. One can of course use calculus methods to do this, by finding the value(s) of \lambda where the derivative of f(\lambda)+g(\lambda) vanishes. But a quick and dirty way to find the minimum approximately is just to find the value \lambda_0 where the two functions agree:

 f(\lambda_0) = g(\lambda_0).

Indeed, since f(\lambda)+g(\lambda) \geq f(\lambda_0)+0 = g(\lambda_0) for \lambda \geq \lambda_0, and f(\lambda)+g(\lambda) \geq 0+g(\lambda_0) = g(\lambda_0) for \lambda \leq \lambda_0, the minimum is at least g(\lambda_0); and since f(\lambda_0)+g(\lambda_0) = 2g(\lambda_0), we see that the minimal value of f(\lambda)+g(\lambda) lies between g(\lambda_0) and 2g(\lambda_0).

More generally, once one finds a \lambda_0 where f(\lambda_0) and g(\lambda_0) are comparable in magnitude, this is already enough to compute the minimum of f(\lambda)+g(\lambda) up to multiplicative constants.
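
For instance (an illustrative choice of functions, not part of the general setup), take f(\lambda) = \lambda^2 and g(\lambda) = 1/\lambda for \lambda > 0. The two functions agree at \lambda_0 = 1, where each equals 1, so the heuristic predicts a minimum lying between 1 and 2. Calculus gives the exact minimizer \lambda = 2^{-1/3}, with minimal value 2^{-2/3} + 2^{1/3} \approx 1.89, which indeed lies in this range.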

Prerequisites

Calculus

Example 1

(Optimize expressions such as \lambda^\alpha A + \frac{B}{\lambda^\beta})
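
A sketch of how the heuristic applies here, assuming A, B, \alpha, \beta > 0 (so that the first term is non-negative and increasing in \lambda > 0 while the second is non-negative and decreasing): balancing the two terms means solving

 \lambda^\alpha A = \frac{B}{\lambda^\beta},

which gives \lambda_0 = (B/A)^{1/(\alpha+\beta)}. At this value both terms equal A^{\beta/(\alpha+\beta)} B^{\alpha/(\alpha+\beta)}, so the minimum of the sum lies between this quantity and twice it.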

General discussion

When optimizing a sum f_1(\lambda) + \ldots + f_n(\lambda), where the intermediate f_j(\lambda) are somehow "in between" f_1(\lambda) and f_n(\lambda), the above heuristic is often effective (up to a factor of n, perhaps) if one looks for the \lambda which balances the two extreme terms f_1(\lambda) and f_n(\lambda). (Need an example of this...)
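
One tentative illustration (with a hypothetical choice of terms): for \lambda > 0 consider

 \sum_{j=0}^{n} \Big(\frac{A}{\lambda}\Big)^{n-j} (B\lambda)^{j},

where A, B > 0. The extreme terms are (A/\lambda)^n, which is decreasing, and (B\lambda)^n, which is increasing, and each intermediate term is a weighted geometric mean of these two and so lies between them. Balancing the extremes gives \lambda_0 = \sqrt{A/B}, at which every term equals (AB)^{n/2}; thus the sum at \lambda_0 is (n+1)(AB)^{n/2}, while the two-term argument above shows that the sum is at least (AB)^{n/2} for every \lambda, so the minimum is pinned down up to a factor of n+1.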