Nonlinear equations are ubiquitous in mathematics, and we need practical methods to solve them. Methods vary in their reliability and robustness, their speed, their ease of implementation, and their applicability. For example, the Newton-Raphson method can be applied to systems of nonlinear equations and can converge very fast, while bisection is extremely reliable but much slower, and can only be used for a single scalar equation in one variable.
Bisection is extremely robust but relatively slow for finding a root of $f(x) = 0$ once we have $a < b$ where $f(a)$ and $f(b)$ have opposite signs.
One simply computes the midpoint $m = \tfrac{1}{2}(a+b)$ and determines the sign of $f(m)$. One then either updates $a \leftarrow m$ or $b \leftarrow m$ and repeats the process, depending on the sign of $f(m)$, to ensure that we again have $f(a)$ and $f(b)$ of opposite signs. Of course, if $f(m) = 0$, we stop. Normally we stop when $b - a < \epsilon$, a pre-determined tolerance.
As long as $f$ is continuous, this method will converge thanks to the intermediate value theorem.
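As a concrete illustration, here is a minimal bisection sketch in Python; the function name `bisect`, the example function, the bracket, and the tolerance are choices made for this example rather than anything prescribed above.

```python
def bisect(f, a, b, tol=1e-10):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fm == 0:
            return m          # exact root found
        if fa * fm < 0:       # root lies in [a, m]
            b, fb = m, fm
        else:                 # root lies in [m, b]
            a, fa = m, fm
    return 0.5 * (a + b)

# Example: the positive root of x**2 - 2, i.e. sqrt(2)
root = bisect(lambda x: x * x - 2.0, 1.0, 2.0)
```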
Speed can be improved using so-called bracketing methods such as Dekker's or Brent's method.
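In practice such methods are usually taken from a library rather than hand-coded; for instance, SciPy provides Brent's method as `scipy.optimize.brentq`. A minimal usage sketch on the same toy example (tolerance chosen arbitrarily):

```python
from scipy.optimize import brentq

# Brent's method on the same bracket; for smooth f it typically needs far
# fewer function evaluations than plain bisection.
root = brentq(lambda x: x * x - 2.0, 1.0, 2.0, xtol=1e-12)
```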
The so-called Newton-Raphson method is based on the linear approximation $f(x) \approx f(x_0) + f'(x_0)(x - x_0)$ for $x$ near an initial guess $x_0$, where $f'(x_0)$ denotes the derivative (the Jacobian matrix, for a system). Solving the linearized equation $f(x_0) + f'(x_0)(x - x_0) = 0$ gives a new estimate
$$x_1 = x_0 - f'(x_0)^{-1} f(x_0).$$
This can be repeated to give further improvements:
$$x_{k+1} = x_k - f'(x_k)^{-1} f(x_k), \qquad k = 0, 1, 2, \ldots.$$
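For illustration, a bare-bones Newton-Raphson iteration for a small system, with the Jacobian solved against the residual at each step, might look like the following sketch; the function names, the example system, the starting point, and the tolerances are all assumptions made for this example.

```python
import numpy as np

def newton(f, jac, x0, tol=1e-12, max_iter=50):
    """Solve f(x) = 0 by Newton-Raphson; jac(x) returns the Jacobian matrix."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            return x
        # Newton step: solve f'(x_k) d = -f(x_k), then x_{k+1} = x_k + d
        d = np.linalg.solve(jac(x), -fx)
        x = x + d
    return x

# Example: intersect the circle x^2 + y^2 = 4 with the parabola y = x^2
f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]**2])
jac = lambda v: np.array([[2*v[0], 2*v[1]], [-2*v[0], 1.0]])
sol = newton(f, jac, [1.0, 1.0])
```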
The Newton-Raphson method can be guaranteed to converge provided the Jacobian matrix $f'(x^*)$ is non-singular at the solution $x^*$, and $x_0$ is "sufficiently close" to $x^*$. When it does converge to $x^*$ and $f'(x^*)$ is non-singular with $f$ sufficiently smooth, convergence is quadratic; that is, $\|x_{k+1} - x^*\| = O(\|x_k - x^*\|^2)$.
This rapid convergence is very desirable: since the error is roughly squared at each step, the number of iterations needed for an accuracy of $\epsilon$ (so that $\|x_k - x^*\| \le \epsilon$) is $O(\log\log(1/\epsilon))$, provided everything works nicely.
However, Newton's method is not robust, and there has been a great deal of work on practical modifications of the method to prevent bad behavior. These modifications typically include guarding the step: setting $d_k = -f'(x_k)^{-1} f(x_k)$ and setting $x_{k+1} = x_k + s_k\, d_k$, where $0 < s_k \le 1$ is chosen to ensure that (at least) $\|f(x_{k+1})\| < \|f(x_k)\|$.
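One simple way to realize such a guard is backtracking damping of the Newton step, as in the following sketch; the halving rule, the minimum step size, and the iteration limits are illustrative choices, and practical codes use more sophisticated line searches or trust regions.

```python
import numpy as np

def damped_newton(f, jac, x0, tol=1e-12, max_iter=100, s_min=1e-8):
    """Newton-Raphson with step damping: accept x + s*d only if ||f|| decreases."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if np.linalg.norm(fx) < tol:
            return x
        d = np.linalg.solve(jac(x), -fx)   # full Newton direction
        s = 1.0
        while s >= s_min:
            x_new = x + s * d
            f_new = f(x_new)
            if np.linalg.norm(f_new) < np.linalg.norm(fx):
                x, fx = x_new, f_new       # accept the damped step
                break
            s *= 0.5                       # otherwise halve the step
        else:
            raise RuntimeError("damping failed: ||f|| could not be reduced")
    return x
```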
This indicates that the Newton-Raphson method is essentially a local method; some sort of "globalization" strategy is needed to handle practical situations.
Methods that are much more reliable for highly nonlinear problems are homotopy (or continuation) methods. These are based on a homotopy $h(x, t)$ between the problem we want to solve, $f(x) = 0$, and an easy problem $g(x) = 0$: we need, for example, $h(x, 0) = g(x)$ and $h(x, 1) = f(x)$. We then follow the curve $h(x, t) = 0$ in $(x, t)$ from the solution(s) of $g(x) = 0$ at $t = 0$ to one or more solutions of $f(x) = 0$ at $t = 1$.
Typically people use $h(x, t) = t\, f(x) + (1 - t)\,(x - a)$ for some (random) $a$. Provided no solutions "escape to infinity" on this curve, and the curve is smooth, we can use local techniques (such as a modified version of Newton-Raphson based on linear least squares) to follow the curve. Using random $a$ is useful because the generalized Morse-Sard theorem guarantees that for almost all $a$ the curve is smooth. Conditions recognizable from topology or nonlinear analysis (such as $x^{T} f(x) > 0$ for all $x$ with $\|x\|$ sufficiently large) are typically used to ensure that curves do not escape to infinity.
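The following sketch follows the convex homotopy $h(x, t) = t\, f(x) + (1 - t)(x - a)$ with a plain Newton corrector at a fixed grid of $t$ values. This is a deliberately naive version (uniform steps in $t$, no predictor, no arc-length parametrization) meant only to show the idea; the function name, step counts, and tolerances are assumptions made for the example.

```python
import numpy as np

def continuation(f, jac, a, n_steps=50, tol=1e-10, max_newton=20):
    """Follow h(x, t) = t*f(x) + (1 - t)*(x - a) = 0 from t = 0 to t = 1."""
    a = np.asarray(a, dtype=float)
    x = a.copy()                      # at t = 0, h(x, 0) = x - a, so x = a
    n = len(a)
    for t in np.linspace(0.0, 1.0, n_steps + 1)[1:]:
        # Newton corrector on h(., t) = 0, warm-started from the previous x
        for _ in range(max_newton):
            h = t * f(x) + (1.0 - t) * (x - a)
            if np.linalg.norm(h) < tol:
                break
            J = t * jac(x) + (1.0 - t) * np.eye(n)
            x = x + np.linalg.solve(J, -h)
    return x   # approximate solution of f(x) = 0 reached at t = 1
```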
There are also piecewise affine versions of this idea, and in fact this is related to simplex-tableau type algorithms for finding solutions to nonlinear equations and (for example) linear complementarity problems.