Usually we think of numerical solution of initial value problems:
$$ \frac{dy}{dt} = f(t, y), \qquad y(t_0) = y_0, $$
although sometimes we need to solve two-point boundary value problems:
$$ \frac{dy}{dt} = f(t, y), \qquad g(y(a), y(b)) = 0. $$
The theory and practice of solving initial value problems are generally simpler, and that is the main focus here.
Methods for solving initial value problems can be used to solve two-point boundary value problems via the shooting method (and variants such as multiple shooting) when combined with a general method for solving nonlinear equations, such as Newton's method or homotopy methods.
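As a sketch of the shooting idea (the test problem and all names here are illustrative assumptions, not from the text, and bisection stands in for the Newton-type nonlinear solver mentioned above): we guess the missing initial slope, integrate the resulting initial value problem, and adjust the guess until the far boundary condition is satisfied.

```python
# Shooting method sketch for the two-point BVP
#   y'' = -y,  y(0) = 0,  y(pi/2) = 1,
# whose exact solution is y = sin(t), so the missing slope y'(0) is 1.
import math

def integrate(s, n=1000):
    """Euler integration of u' = v, v' = -u from t=0 to t=pi/2.

    Starts from u(0)=0, v(0)=s and returns the computed u(pi/2).
    """
    h = (math.pi / 2) / n
    u, v = 0.0, s
    for _ in range(n):
        u, v = u + h * v, v - h * u
    return u

def shoot(target=1.0, lo=0.0, hi=2.0):
    """Bisect on the slope s until u(pi/2; s) hits the target boundary value."""
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if integrate(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

s = shoot()
print(s)  # close to 1.0, the exact initial slope
```

A production code would replace the fixed-step Euler integrator with an adaptive one and bisection with Newton's method on the boundary residual, but the structure is the same.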
Stiff differential equations are common in practice and cause difficulty with numerical simulation. These are equations $y' = f(t, y)$ where the ratio of the largest to smallest eigenvalue (in magnitude) of the Jacobian matrix $\partial f/\partial y$ is very large. The name comes from simulation of mechanical systems, but the stiffest systems seem to come from simulation of chemical reactions, where this ratio can be extremely large.
Stiff differential equations generally require implicit methods, where computing $y_{n+1}$ from $y_n$ requires the solution of (nonlinear) equations. These are generally more complex to implement, but give far superior performance and have less difficulty in "tuning" the step size to fit the problem at hand.
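A minimal sketch of why implicit methods matter for stiff problems (the test equation and names are illustrative, not from the text): on $y' = -100y$ with step size $h = 0.05$, explicit Euler multiplies the iterate by $1 - 100h = -4$ each step and blows up, while backward Euler, solved here by Newton's method at each step, decays as the true solution does.

```python
def f(t, y):
    return -100.0 * y   # stiff scalar test equation

def dfdy(t, y):
    return -100.0       # its derivative with respect to y, used by Newton's method

def backward_euler_step(t, y, h):
    """Solve y_new = y + h*f(t+h, y_new) by a few Newton iterations."""
    y_new = y  # initial guess
    for _ in range(10):
        g = y_new - y - h * f(t + h, y_new)   # residual of the implicit equation
        gp = 1.0 - h * dfdy(t + h, y_new)     # derivative of the residual
        y_new -= g / gp
    return y_new

h, y_exp, y_imp = 0.05, 1.0, 1.0
for k in range(20):
    t = k * h
    y_exp = y_exp + h * f(t, y_exp)           # explicit Euler: |1 - 100h| = 4 > 1, diverges
    y_imp = backward_euler_step(t, y_imp, h)  # backward Euler: stable, decays to 0

print(abs(y_exp), abs(y_imp))  # explicit iterate is enormous; implicit stays tiny
```

The extra cost per step (the nonlinear solve) buys the ability to take step sizes set by accuracy rather than by stability.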
Many codes for the numerical solution of differential equations are "automatic" in the sense that they adjust the step size to keep the estimated error within user-prescribed bounds. However, these methods are not completely foolproof, and care should be taken in their implementation to avoid falling into certain traps. Also, for optimal control problems where solution trajectories are used for optimization, the step-size adjustments are often made in discontinuous ways, making optimization of the resulting output extremely difficult.
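One common way such automatic codes estimate the local error is step doubling: compare one step of size $h$ with two steps of size $h/2$, and grow or shrink $h$ to keep the difference near a tolerance. A minimal sketch (illustrative, not any particular production code, and using Euler steps for simplicity):

```python
import math

def euler_step(f, t, y, h):
    return y + h * f(t, y)

def adaptive_euler(f, t, y, t_end, h=0.1, tol=1e-4):
    """Integrate y' = f(t, y) from t to t_end with step-doubling error control."""
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        y_big = euler_step(f, t, y, h)                  # one step of size h
        y_half = euler_step(f, t, y, h / 2)             # two steps of size h/2
        y_small = euler_step(f, t + h / 2, y_half, h / 2)
        err = abs(y_big - y_small)                      # local error estimate
        if err <= tol:
            t, y = t + h, y_small                       # accept the step
            if err < tol / 4:
                h *= 2                                  # error well under tolerance: grow h
        else:
            h /= 2                                      # reject and retry with a smaller step
    return y

approx = adaptive_euler(lambda t, y: -y, 0.0, 1.0, 1.0)
print(abs(approx - math.exp(-1)))  # close to the exact value exp(-1)
```

The accept/reject branch is exactly the kind of discontinuous decision mentioned above: a tiny change in the problem data can flip a step from accepted to rejected, changing the whole subsequent step sequence.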
Calculus, ordinary differential equations.
Euler's method is very basic: $y_{n+1} = y_n + h\,f(t_n, y_n)$ for $n = 0, 1, 2, \ldots$. For a differential equation $y' = f(t, y)$ with $f$ Lipschitz (in both $t$ and $y$) with constant $L$, we can obtain a bound on the error:
$$ |y(t_n) - y_n| \le \frac{hM}{2L}\left(e^{L(t_n - t_0)} - 1\right), $$
where $M$ is a bound on $|y''(t)|$.
We can bound $|y''(t)|$ using $y''(t) = \frac{\partial f}{\partial t}(t, y(t)) + \frac{\partial f}{\partial y}(t, y(t))\,f(t, y(t))$ and noting that $|\partial f/\partial t| \le L$ and $|\partial f/\partial y| \le L$ for (almost all) $t$ and $y$.
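A minimal sketch of Euler's method (names are illustrative, not from the text), run on $y' = -y$, $y(0) = 1$ over $[0, 1]$; the error against the exact value $e^{-1}$ roughly halves each time $h$ does, consistent with the first-order bound above.

```python
import math

def euler(f, t0, y0, h, n):
    """Apply y_{k+1} = y_k + h*f(t_k, y_k) for n steps; return the final y."""
    t, y = t0, y0
    for _ in range(n):
        y = y + h * f(t, y)
        t = t + h
    return y

# Errors at h = 1/10, 1/20, 1/40: each roughly half the previous one.
for n in (10, 20, 40):
    approx = euler(lambda t, y: -y, 0.0, 1.0, 1.0 / n, n)
    print(n, abs(approx - math.exp(-1)))
```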
Runge-Kutta methods are so-called one-step methods which involve some intermediate computation in order to compute $y_{n+1}$ from $y_n$, as opposed to multistep methods, where computation of $y_{n+1}$ requires $y_{n-j}$ for $j = 0, 1, \ldots, m-1$. The number of intermediate function values needed is the number of stages of the method.
Runge-Kutta methods can be represented by a Butcher tableau (see the Wikipedia article on Butcher tableaus). Runge-Kutta methods can give optimal order if the coefficients of the method are generated from the points and weights of Gaussian quadrature methods in the appropriate way (order $2s$ for $s$ stages). The computational cost and difficulty of implementing implicit Runge-Kutta methods like this have been a barrier to their popular acceptance, although these are often considered to be the most powerful methods, especially for high-accuracy solution of stiff differential equations, or if there are special features to be preserved by the numerical method (e.g., symplectic methods).
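A minimal sketch of the best-known explicit Runge-Kutta method, the classical four-stage, fourth-order method (the implicit Gauss methods discussed above have the same one-step structure, but each step requires solving equations for the stage values):

```python
import math

def rk4_step(f, t, y, h):
    """One step of the classical fourth-order Runge-Kutta method."""
    k1 = f(t, y)                       # four stage evaluations...
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)  # ...combined by the tableau weights

# Integrate y' = -y from t = 0 to t = 1 with h = 0.1.
y, h = 1.0, 0.1
for k in range(10):
    y = rk4_step(lambda t, u: -u, k * h, y, h)
print(abs(y - math.exp(-1)))  # far smaller than Euler's error at the same step size
```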
In multistep methods, $y_{n+1}$ is computed in terms of $y_{n-j}$ and $f(t_{n-j}, y_{n-j})$ for $j = 0, 1, \ldots, m-1$. The best-known multistep methods are the Adams-Bashforth, Adams-Moulton, and Backward Differentiation Formula (BDF) methods. All these methods are based on interpolation: of the derivative values $f(t_j, y_j)$ for the Adams methods, or of $y$ itself (using $y'(t_{n+1}) = f(t_{n+1}, y_{n+1})$ to obtain the equation for $y_{n+1}$) for the BDF methods.
These methods can give very high order (although this is often not the most important thing). However, they are difficult to get started, as $y_1, y_2, \ldots, y_{m-1}$ must be computed somehow. The most successful methods of this type are variable-order, variable-step-size methods, such as W. Gear's old DIFSUB code. (There are much more modern codes available than this, though.) These start with a first-order method (which reduces to Euler or implicit Euler and requires no previous values) and an extremely small step size, and increase both the order and the step size as rapidly as possible. The order must be increased in order to allow the step size to increase, so the two go together.
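A minimal sketch of the starting problem (illustrative, not DIFSUB): the two-step Adams-Bashforth formula $y_{n+1} = y_n + h\,(\tfrac{3}{2}f_n - \tfrac{1}{2}f_{n-1})$ needs $y_1$ before it can run, so the first step here is taken with Euler's method, mirroring the "start at order one" strategy described above.

```python
import math

def ab2(f, t0, y0, h, n):
    """Two-step Adams-Bashforth with an Euler bootstrap step; returns y at t0 + n*h."""
    f_prev = f(t0, y0)
    y_prev, y = y0, y0 + h * f_prev      # bootstrap y_1 with one Euler step
    for k in range(1, n):
        t = t0 + k * h
        f_cur = f(t, y)
        y_prev, y = y, y + h * (1.5 * f_cur - 0.5 * f_prev)
        f_prev = f_cur
    return y

# Integrate y' = -y from t = 0 to t = 1 with h = 0.01.
approx = ab2(lambda t, y: -y, 0.0, 1.0, 0.01, 100)
print(abs(approx - math.exp(-1)))  # noticeably more accurate than Euler at the same h
```

The single low-order starting step is cheap because the method's overall error is dominated by the many second-order steps that follow, which is why the variable-order codes can afford to begin this way.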
Care must be taken not to change the step size too often, as this can itself cause numerical instabilities.