Jacobi condition
A necessary condition for optimality in problems of the calculus of variations. The Jacobi condition is a necessary condition for the second variation of the functional being minimized to be non-negative at a minimum point (the vanishing of the first variation is ensured by the first-order necessary conditions: the Euler equation, the transversality condition and the Weierstrass conditions (for a variational extremum)).
One poses the problem of minimizing, for example, the functional
$$ \tag{1 } J ( x) = \ \int\limits _ { t _ {1} } ^ { {t _ 2 } } F ( t, x, \dot{x} ) dt $$
under given conditions at the end points:
$$ \tag{2 } x ( t _ {1} ) = x _ {1} ,\ \ x ( t _ {2} ) = x _ {2} . $$
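As a concrete illustration (this particular integrand is an assumption added for exposition, not part of the original article), one may take
$$ F ( t , x , \dot{x} ) = \dot{x} ^ {2} - x ^ {2} $$
with the end conditions (2); this example is used below to make the second variation, the Jacobi equation and the conjugate points explicit.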
If $ x ( t) $, $ t _ {1} \leq t \leq t _ {2} $, is a solution to the problem (1), (2), then the first variation $ \delta J $ of the functional must vanish (this yields the first-order necessary conditions), while the second variation
$$ \tag{3 } \delta ^ {2} J ( \eta ) = \ \int\limits _ { t _ {1} } ^ { {t _ 2 } } ( F _ {\dot{x} \dot{x} } {\dot \eta } {} ^ {2} + 2F _ {x \dot{x} } \eta \dot \eta + F _ {xx} \eta ^ {2} ) dt $$
must be greater than or equal to 0 for any piecewise-smooth function $ \eta ( t) $ satisfying the zero boundary conditions:
$$ \tag{4 } \eta ( t _ {1} ) = 0,\ \ \eta ( t _ {2} ) = 0. $$
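For the illustrative integrand above, $ F _ {\dot{x} \dot{x} } = 2 $, $ F _ {x \dot{x} } = 0 $, $ F _ {xx} = - 2 $, so (3) takes the form
$$ \delta ^ {2} J ( \eta ) = \ \int\limits _ { t _ {1} } ^ { {t _ 2 } } 2 ( {\dot \eta } {} ^ {2} - \eta ^ {2} ) dt . $$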
The Euler equation for $ \delta ^ {2} J ( \eta ) $ is:
$$ \tag{5 } F _ {xx} \eta + F _ {x \dot{x} } \dot \eta - { \frac{d}{dt} } ( F _ {x \dot{x} } \eta + F _ {\dot{x} \dot{x} } \dot \eta ) = 0 $$
and is called the Jacobi equation. It is a second-order linear differential equation in the unknown function $ \eta ( t) $. All the coefficients of $ \eta $ and $ \dot \eta $ in (5) are evaluated at the values of $ t, x, \dot{x} $ corresponding to a known optimal solution $ x ( t) $, and so they are all known functions of $ t $.
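For a concrete integrand the coefficients of the Jacobi equation can be generated symbolically. The following sketch (an illustration, not part of the article) uses SymPy to form $ F _ {xx} $, $ F _ {x \dot{x} } $, $ F _ {\dot{x} \dot{x} } $ for the assumed integrand $ F = \dot{x} ^ {2} - x ^ {2} $ and to assemble equation (5); the variable names are hypothetical.

import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')(t)        # the known extremal x(t)
eta = sp.Function('eta')(t)    # the variation eta(t)
xdot = sp.Derivative(x, t)

# Assumed illustrative integrand; replace with the problem's own F(t, x, xdot).
xs, vs = sp.symbols('xs vs')   # placeholders for x and xdot
F = vs**2 - xs**2

# Second derivatives of F, evaluated along the extremal.
Fxx = sp.diff(F, xs, 2).subs({xs: x, vs: xdot})
Fxv = sp.diff(F, xs, vs).subs({xs: x, vs: xdot})
Fvv = sp.diff(F, vs, 2).subs({xs: x, vs: xdot})

# Jacobi equation (5): F_xx*eta + F_xv*eta' - d/dt( F_xv*eta + F_vv*eta' ) = 0
etadot = sp.Derivative(eta, t)
jacobi_eq = sp.Eq(Fxx*eta + Fxv*etadot
                  - sp.Derivative(Fxv*eta + Fvv*etadot, t), 0)
print(sp.simplify(jacobi_eq.doit()))

For this integrand the printed equation reduces to $ \ddot \eta + \eta = 0 $.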
The function $ \eta ( t) \equiv 0 $, $ t _ {1} \leq t \leq t _ {2} $, satisfies the Jacobi equation under the boundary conditions (4), that is, it is an extremal of $ \delta ^ {2} J ( \eta ) $. On the other hand, for $ \eta ( t) \equiv 0 $ the second variation $ \delta ^ {2} J ( \eta ) = 0 $, and since for an optimal solution $ x ( t) $ the second variation is non-negative for any $ \eta ( t) $, the function $ \eta ( t) \equiv 0 $, $ t _ {1} \leq t \leq t _ {2} $, minimizes $ \delta ^ {2} J ( \eta ) $. If Legendre's condition $ F _ {\dot{x} \dot{x} } \neq 0 $, $ t _ {1} \leq t \leq t _ {2} $, holds (cf. also Legendre condition), that is, $ x ( t) $ is a non-singular extremal, then under the initial conditions $ \eta ( t _ {1} ) = \dot \eta ( t _ {1} ) = 0 $, the solution to the Jacobi equation is identically zero.
A point $ t = c $ is called conjugate to a point $ t = a $ if there is a solution of the Jacobi equation that vanishes at $ t = a $ and at $ t = c $ and is not identically zero between $ a $ and $ c $. The Jacobi necessary condition states: if a non-singular extremal $ x ( t) $, $ t _ {1} \leq t \leq t _ {2} $, gives a minimum of the functional (1), then the interval $ ( t _ {1} , t _ {2} ) $ contains no points conjugate to $ t _ {1} $.
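For the illustrative integrand, equation (5) reduces to $ \ddot \eta + \eta = 0 $; its solutions vanishing at $ t _ {1} $ are
$$ \eta ( t) = C \sin ( t - t _ {1} ) , $$
so the points conjugate to $ t _ {1} $ are $ t _ {1} + k \pi $, $ k = 1, 2 ,\dots $. The Jacobi condition therefore holds precisely when $ t _ {2} - t _ {1} \leq \pi $.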
The practical meaning of the Jacobi condition can be explained as follows. Suppose that it does not hold, that is, there is a point $ a $, $ t _ {1} < a < t _ {2} $, conjugate to $ t _ {1} $, and let $ \eta ( t) $ be a solution of the Jacobi equation that vanishes at $ t _ {1} $ and at $ a $ and is not identically zero between them. Then one can construct the continuous function
$$ \eta _ {1} ( t) = \ \left \{ \begin{array}{ll} \eta ( t), & t _ {1} \leq t \leq a, \\ 0, & a \leq t \leq t _ {2} , \\ \end{array} \right .$$
which satisfies (5) on each of the intervals $ [ t _ {1} , a] $ and $ [ a, t _ {2} ] $ and for which $ \delta ^ {2} J ( \eta _ {1} ) = 0 $. Thus, $ \eta _ {1} ( t) $, $ t _ {1} \leq t \leq t _ {2} $, is a polygonal extremal of $ \delta ^ {2} J ( \eta ) $ with a corner at $ t = a $. But by the Weierstrass–Erdmann necessary corner condition (see Euler equation), which requires the continuity of $ F - \dot{x} F _ {\dot{x} } $ and $ F _ {\dot{x} } $ at a corner (here applied to the integrand of $ \delta ^ {2} J $), one must have $ \dot \eta _ {1} ( a) = 0 $. Together with $ \eta _ {1} ( a) = 0 $ and the Legendre condition, this gives $ \eta _ {1} ( t) \equiv 0 $, contradicting the assumption that $ \eta ( t) $ is not identically zero on $ [ t _ {1} , a] $.
To verify the Jacobi condition directly one has to consider the solution to (5) that satisfies the initial conditions
$$ \eta ( t _ {1} ) = 0,\ \ \dot \eta ( t _ {1} ) = 1. $$
Denote it by $ \Delta ( t _ {1} , t) $. For a point $ t = a $, $ t _ {1} < a < t _ {2} $, to be conjugate to $ t _ {1} $ it is necessary and sufficient that $ \Delta ( t _ {1} , t) $ vanishes at $ t = a $. Hence, the fulfilment of the Jacobi condition is equivalent to the non-vanishing of $ \Delta ( t _ {1} , t) $ on $ ( t _ {1} , t _ {2} ) $.
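This test is easy to carry out numerically once the coefficients of (5) are known along the extremal. The following sketch (an illustration, not part of the article) integrates the Jacobi equation with SciPy for the assumed integrand $ F = \dot{x} ^ {2} - x ^ {2} $, starting from $ \eta ( t _ {1} ) = 0 $, $ \dot \eta ( t _ {1} ) = 1 $, and reports the first sign change of $ \Delta ( t _ {1} , t) $ on $ ( t _ {1} , t _ {2} ) $; the interval $ [ 0, 4] $ is a hypothetical choice.

import numpy as np
from scipy.integrate import solve_ivp

# Jacobi equation along the extremal, written as a first-order system.
# For the assumed integrand F = xdot**2 - x**2 the coefficients are constant
# (F_vv = 2, F_xv = 0, F_xx = -2) and (5) reads eta'' + eta = 0.
def jacobi_rhs(t, y):
    eta, etadot = y
    return [etadot, -eta]

t1, t2 = 0.0, 4.0                 # hypothetical interval; note t2 - t1 > pi
sol = solve_ivp(jacobi_rhs, (t1, t2), [0.0, 1.0],
                dense_output=True, max_step=0.01)

ts = np.linspace(t1, t2, 2000)[1:]     # exclude t1 itself
delta = sol.sol(ts)[0]                 # Delta(t1, t)

crossings = np.where(np.diff(np.sign(delta)) != 0)[0]
if crossings.size:
    print(f"conjugate point near t = {ts[crossings[0]]:.3f}; "
          "Jacobi condition fails")    # here: near t = pi
else:
    print("no conjugate point in (t1, t2); Jacobi condition holds")

Here the conjugate point is found near $ t = \pi $, in agreement with the explicit solution $ \Delta ( t _ {1} , t) = \sin ( t - t _ {1} ) $.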
In the more general case, when a variational problem in Lagrange, Mayer or Bolza form is considered, the statement of the Jacobi condition has certain special features. The problem of minimizing the second variation of the functional is posed as a Bolza problem; it is called the associated problem, and its extremals are called associated extremals. The differential constraints and the boundary conditions of the associated problem are obtained by varying the corresponding conditions of the original variational problem. The definition of a conjugate point keeps the same form. For the second variation of the functional to be non-negative on the class of associated extremals satisfying the associated end-point conditions, the Jacobi condition must hold: $ ( t _ {1} , t _ {2} ) $ must not contain points conjugate to $ t _ {1} $.
The Jacobi condition was established by C.G.J. Jacobi (1837).
References
[1] G.A. Bliss, "Lectures on the calculus of variations", Chicago Univ. Press (1947)
[2] M.A. Lavrent'ev, L.A. Lyusternik, "A course in variational calculus", Moscow-Leningrad (1950) (In Russian)
Comments
Both the Jacobi condition and the Legendre condition are related to sufficiency conditions in the calculus of variations (see [a1]). The Legendre–Clebsch condition generalizes the latter to optimal control problems (see [a2]). Generalizations of the Legendre–Clebsch condition to singular control problems have been obtained by H.J. Kelley (see [a3]).
References
[a1] L.E. El'sgol'ts, "Calculus of variations", Pergamon (1961) (Translated from Russian)
[a2] A.E. Bryson, Y.-C. Ho, "Applied optimal control", Ginn (1969)
[a3] H.J. Kelley, R.E. Kopp, H.G. Moyer, "Singular extremals", in G. Leitmann (ed.), Topics in Optimization, Acad. Press (1967), Chapt. 3, pp. 63–101
[a4] N.I. Akhiezer, "The calculus of variations", Blaisdell (1962) (Translated from Russian)
[a5] L. Cesari, "Optimization - Theory and applications", Springer (1983)