
Descent, method of

From Encyclopedia of Mathematics

A method for solving the minimization problem

f(x)\to\min,

where f is some function of the variables x=(x_1,\dotsc,x_n). The iterative sequence \{x^k\} of the method of descent is computed by the formula

x^{k+1}=x^k+\alpha_kg^k,

where g^k is a vector indicating some direction of decrease of f at x^k, and \alpha_k is an iterative parameter whose value specifies the step-length in the direction g^k. If f is a differentiable function and x^k is not an extremal point of it, then the vector g^k must satisfy the inequality

(f'(x^k),g^k)<0,\label{*}\tag{*}

where f'(x^k) is the gradient of f at x^k.
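The iteration and the descent condition \eqref{*} can be sketched as follows. The quadratic objective below is a hypothetical example chosen only for illustration; it is not part of the original article.

```python
import numpy as np

def f(x):
    # Hypothetical demo objective (an assumption, not from the article):
    # a convex quadratic f(x) = x_1^2 + 10*x_2^2 with minimum at the origin.
    return x[0]**2 + 10.0 * x[1]**2

def grad_f(x):
    # Gradient f'(x) of the demo objective.
    return np.array([2.0 * x[0], 20.0 * x[1]])

def descent_step(x, g, alpha):
    # One iteration of the method: x^{k+1} = x^k + alpha_k * g^k.
    return x + alpha * g

x = np.array([1.0, 1.0])
g = -grad_f(x)                    # the negative gradient is a descent direction
assert np.dot(grad_f(x), g) < 0   # condition (*): (f'(x^k), g^k) < 0
x_next = descent_step(x, g, 0.05)
```

With this step-length the new iterate has a strictly smaller function value, f(x_next) < f(x), as the next paragraph asserts for suitably chosen \alpha_k.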

If f is a sufficiently smooth function (e.g. twice continuously differentiable) and if the sequence \{g^k\} satisfies inequality \eqref{*}, then there exists a sequence \{\alpha_k\} such that

f(x^0)>\dotsb>f(x^k)>\dotsb.

Under certain restrictions (see [3]) on the function f and on the method of choosing the parameters \{\alpha_k\} and the vectors g^k, the sequence \{x^k\} converges to a solution x^* of the initial problem.
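One common rule for producing such a monotone sequence f(x^0)>f(x^1)>\dotsb is Armijo backtracking, sketched below. This particular rule and the quadratic objective are illustrative assumptions, not the specific prescriptions of [3].

```python
import numpy as np

def backtracking_alpha(f, grad, x, g, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking (one illustrative choice of step-length rule):
    shrink alpha until f(x + alpha*g) <= f(x) + c*alpha*(f'(x), g).
    Since (f'(x), g) < 0 for a descent direction, the accepted step
    strictly decreases f."""
    alpha = alpha0
    slope = np.dot(grad(x), g)          # negative for a descent direction
    while f(x + alpha * g) > f(x) + c * alpha * slope:
        alpha *= rho                    # halve the step until f decreases
    return alpha

# Hypothetical quadratic objective used only for this demo.
f = lambda x: x[0]**2 + 10.0 * x[1]**2
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])

x = np.array([1.0, 1.0])
vals = [f(x)]
for _ in range(20):
    g = -grad(x)                        # descent direction
    a = backtracking_alpha(f, grad, x, g)
    x = x + a * g                       # x^{k+1} = x^k + alpha_k * g^k
    vals.append(f(x))
# vals is strictly decreasing: f(x^0) > f(x^1) > ...
```

Each accepted step satisfies the sufficient-decrease inequality, so the sequence of function values is strictly decreasing, matching the chain f(x^0)>\dotsb>f(x^k)>\dotsb above.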

The gradient method, in which the vectors \{g^k\} are in some way expressed in terms of the vectors \{f'(x^k)\}, is a method of descent. One of the most common cases is when

g^k=-B(x^k)f'(x^k),

where B(x) is a symmetric matrix satisfying

m(x,x)\leq(B(y)x,x)\leq M(x,x)

for any two vectors x and y, with certain constants M\geq m>0. Under additional restrictions (see [3]) on f and by a special selection of \{\alpha_k\}, the gradient method ensures the convergence of \{x^k\} to a solution x^* of the initial problem at the rate of a geometric progression with ratio q<1. A special case of the gradient method is the method of steepest descent (cf. Steepest descent, method of), in which the matrix B(x) is the unit matrix.
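A minimal sketch of the gradient method with g^k=-B(x^k)f'(x^k). To keep the example short, B is taken constant and diagonal (a simplifying assumption; the article allows B to depend on x subject to the spectral bounds above), and the quadratic objective is again hypothetical.

```python
import numpy as np

# Fixed symmetric positive-definite "preconditioner" B with eigenvalues
# 0.04 and 0.4, so the bounds m(x,x) <= (B(y)x,x) <= M(x,x) hold with
# m = 0.04, M = 0.4. Taking B = I instead would give steepest descent.
B = np.array([[0.4, 0.0],
              [0.0, 0.04]])

f = lambda x: x[0]**2 + 10.0 * x[1]**2            # hypothetical demo objective
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])

x = np.array([1.0, 1.0])
for _ in range(50):
    g = -B @ grad(x)     # descent direction: (f'(x), g) = -(B f'(x), f'(x)) < 0
    x = x + 1.0 * g      # fixed alpha_k = 1 suffices for this B and f
```

For this choice of B each coordinate of x is multiplied by 0.2 per iteration, so the iterates converge to the solution x^* = 0 geometrically with ratio q = 0.2 < 1, illustrating the convergence rate stated above.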

References

[1] L.V. Kantorovich, G.P. Akilov, "Funktionalanalysis in normierten Räumen" [Functional analysis in normed spaces], Akademie Verlag (1964) (Translated from Russian)
[2] G. Zoutendijk, "Methods of feasible directions" , Elsevier (1970)
[3] B.N. Pshenichnyi, Yu.M. Danilin, "Numerical methods in extremal problems" , MIR (1978) (Translated from Russian)
[4] B.T. Polyak, "Gradient methods for the minimization of functionals", USSR Comput. Math. Math. Phys., 3 : 4 (1963) pp. 864–878 (Zh. Vychisl. Mat. i Mat. Fiz., 3 : 4 (1963) pp. 643–654)


Comments

See also Coordinate-wise descent method.

References

[a1] J.M. Ortega, W.C. Rheinboldt, "Iterative solution of nonlinear equations in several variables", Acad. Press (1970)
How to Cite This Entry:
Descent, method of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Descent,_method_of&oldid=44669
This article was adapted from an original article by Yu.A. Kuznetsov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article