# Descent, method of


A method for solving the minimization problem

$$\min_x f(x),$$

where $f(x)$ is some function of the variables $x = (x^1, \dots, x^n)$. The iterative sequence $\{x_k\}$ of the method of descent is computed by the formula

$$x_{k+1} = x_k + \alpha_k \xi_k,$$

where $\xi_k$ is a vector indicating some direction of decrease of $f(x)$ at $x_k$, and $\alpha_k$ is an iterative parameter whose value indicates the step-length in the direction $\xi_k$. If $f(x)$ is a differentiable function and $x_k$ is not an extremal point of it, then the vector $\xi_k$ must satisfy the inequality

$$(f'(x_k), \xi_k) < 0, \tag{*}$$

where $f'(x_k)$ is the gradient of $f(x)$ at $x_k$.
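The basic iteration and the descent inequality (*) can be sketched as follows; the function names and the test function $f(x) = \|x\|^2$ are illustrative assumptions, not part of the article.

```python
import numpy as np

def descent_step(f_grad, x, xi, alpha):
    """One step x_{k+1} = x_k + alpha * xi of a generic descent method.

    Checks inequality (*): xi must be a descent direction, i.e.
    the inner product (f'(x), xi) must be negative.
    """
    if np.dot(f_grad(x), xi) >= 0:
        raise ValueError("xi does not satisfy (*) at x")
    return x + alpha * xi

# Illustration on f(x) = ||x||^2, whose gradient is f'(x) = 2x.
grad = lambda x: 2 * x
x0 = np.array([3.0, -4.0])
x1 = descent_step(grad, x0, xi=-grad(x0), alpha=0.25)
# x1 = x0 - 0.5 * x0 = [1.5, -2.0]
```

Any vector making an obtuse angle with the gradient qualifies as $\xi_k$; the choice $\xi_k = -f'(x_k)$ used above is just the most familiar one.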

If $f(x)$ is a sufficiently smooth function (e.g. twice continuously differentiable) and if the sequence $\{\xi_k\}$ satisfies inequality (*), then there exists a sequence $\{\alpha_k\}$ such that

$$f(x_{k+1}) < f(x_k), \qquad k = 0, 1, \dots$$

Under certain restrictions on the function $f(x)$ and on the method of choosing the parameters $\alpha_k$ and the vectors $\xi_k$, the sequence $\{x_k\}$ converges to a solution of the initial problem.
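The article only asserts that suitable step-lengths $\alpha_k$ exist. One standard way to find such a step in practice (backtracking with the Armijo sufficient-decrease test, not specific to this article) can be sketched as:

```python
import numpy as np

def backtracking_alpha(f, f_grad, x, xi, alpha0=1.0, beta=0.5, c=1e-4):
    """Shrink alpha until f(x + alpha*xi) decreases sufficiently.

    Requires that xi satisfy the descent inequality (*), so that
    slope = (f'(x), xi) < 0 and the loop terminates for smooth f.
    """
    slope = np.dot(f_grad(x), xi)       # negative by (*)
    alpha = alpha0
    while f(x + alpha * xi) > f(x) + c * alpha * slope:
        alpha *= beta                   # halve the step and retry
    return alpha

f = lambda x: float(np.dot(x, x))       # f(x) = ||x||^2
grad = lambda x: 2 * x
x = np.array([1.0, 2.0])
xi = -grad(x)
alpha = backtracking_alpha(f, grad, x, xi)
x_next = x + alpha * xi                 # satisfies f(x_next) < f(x)
```

The parameter names (`beta`, `c`) and the quadratic test function are assumptions made for the sketch.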

The gradient method, in which the vectors $\xi_k$ are in some way expressed in terms of the gradients $f'(x_k)$, is a method of descent. One of the most common cases is

$$\xi_k = -B_k f'(x_k),$$

where $B_k$ is a symmetric matrix satisfying

$$m(y, y) \le (B_k y, y) \le M(y, y)$$

for any vector $y$, with certain constants $0 < m \le M$. Under additional restrictions on $f(x)$ and by a special selection of $\alpha_k$, the gradient method ensures the convergence of $\{x_k\}$ to a solution of the initial problem at the rate of a geometric progression with ratio $q < 1$. A special case of the gradient method is the method of steepest descent (cf. Steepest descent, method of), in which the matrix $B_k$ is the unit matrix.
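The geometric rate of convergence is easy to observe on a quadratic. A minimal sketch, assuming a fixed matrix $B_k = B$, a fixed step $\alpha$, and the illustrative test function $f(x) = \tfrac12 (Ax, x)$ (none of which come from the article):

```python
import numpy as np

def gradient_method(f_grad, B, x0, alpha, n_steps):
    """Iterate x_{k+1} = x_k - alpha * B @ f'(x_k) with a fixed
    symmetric matrix B; B = I gives the method of steepest descent."""
    x = np.asarray(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(n_steps):
        x = x - alpha * B @ f_grad(x)
        traj.append(x.copy())
    return traj

# f(x) = 1/2 (Ax, x) with A symmetric positive definite; f'(x) = A x.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
grad = lambda x: A @ x

# Steepest descent (B = identity matrix), fixed step alpha = 0.4:
traj = gradient_method(grad, B=np.eye(2), x0=[1.0, 1.0], alpha=0.4, n_steps=20)

# Here x_{k+1} = (I - 0.4*A) x_k, so each coordinate is scaled by
# 1 - 0.4*lambda_i (factors -0.2 and 0.6): ||x_k|| decays geometrically
# with ratio q = 0.6 < 1, matching the rate claimed in the article.
```

Choosing $B_k$ close to the inverse Hessian instead of the identity is what distinguishes quasi-Newton variants from steepest descent, at the cost of maintaining the matrix.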

How to Cite This Entry:
Descent, method of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Descent,_method_of&oldid=11500
This article was adapted from an original article by Yu.A. Kuznetsov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article