Equilibrium position
of a system of ordinary differential equations
$$ \tag{*} \dot{x} = f(t, x), \quad x \in \mathbf{R}^{n} $$
A point $ \xi \in \mathbf{R}^{n} $ such that $ x = \xi $ is a solution of (*) (constant in time). The solution itself is also called an equilibrium position. A point $ \xi \in \mathbf{R}^{n} $ is an equilibrium position of (*) if and only if
$$ f(t, \xi) = 0 \quad \textrm{for all } t. $$
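For instance, the autonomous logistic equation (an illustrative example, not part of the system (*) above)

$$ \dot{x} = x(1 - x), \quad x \in \mathbf{R}, $$

has exactly two equilibrium positions, $ \xi = 0 $ and $ \xi = 1 $, since these are precisely the points at which the right-hand side vanishes for all $ t $.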
Let $ x = \phi ( t) $ be an arbitrary solution of (*). The change of variables $ x = \phi ( t) + y $ transforms this solution into the equilibrium position $ y = 0 $ of the system
$$ \dot{y} = F(t, y), \quad F(t, y) \equiv f(t, \phi(t) + y) - f(t, \phi(t)). $$
Therefore, in stability theory, for example, one may assume without loss of generality that the problem is always that of investigating the stability of an equilibrium position at the origin of $ \mathbf{R}^{n} $.
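As a simple illustration of this reduction (an example chosen here, not taken from the article), take the linear equation $ \dot{x} = -x $ and the particular solution $ \phi ( t) = e^{-t} $. The substitution $ x = e^{-t} + y $ gives

$$ \dot{y} = f(t, \phi(t) + y) - f(t, \phi(t)) = -(e^{-t} + y) + e^{-t} = -y, $$

so in the new variable the chosen solution becomes the equilibrium position $ y = 0 $ of the system $ \dot{y} = -y $, whose stability can then be studied at the origin.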
The equilibrium position $ x = 0 $ of a non-autonomous system (*) is often called the trivial solution, zero solution, singular point, stationary point, rest point, equilibrium state, or fixed point of this system.