# Metric entropy of a dynamical system

One of the most important invariants in ergodic theory. The basic notion is the entropy $h ( S)$ of an endomorphism $S$ (see Metric isomorphism) of a Lebesgue space $( X , \mu )$. For any finite measurable decomposition (measurable partition) $\xi$ the limit

$$h ( S , \xi ) = \lim\limits _ {n \rightarrow \infty } \ \frac{1}{n} H ( \xi _ {S} ^ {n} ) ,$$

$$\xi _ {S} ^ {n} = \xi \lor S ^ {-1} \xi \lor \dots \lor S ^ {- n+ 1} \xi$$

(the entropy of $\xi$ in unit time relative to $S$) exists, where $H ( \xi )$ is the entropy (cf. Entropy of a measurable decomposition) of $\xi$, and $\xi \lor \eta$ is the partition whose elements are the intersections of the elements of $\xi$ and $\eta$. (This definition carries over verbatim to $\xi$ with $H ( \xi ) < \infty$; by another method $h ( S , \xi )$ can be defined for any measurable $\xi$.) The entropy $h ( S)$ is defined as the least upper bound of the $h ( S , \xi )$ over all possible finite measurable $\xi$. (It may be $\infty$; the use of all $\xi$ with $H ( \xi ) < \infty$ or of all measurable $\xi$ yields the same entropy.)
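For a Bernoulli shift the limit defining $h ( S , \xi )$ can be computed exactly: taking $\xi$ to be the partition according to the zeroth coordinate, the cells of $\xi _ {S} ^ {n}$ are the length-$n$ cylinders, and $H ( \xi _ {S} ^ {n} ) / n$ is already constant in $n$. A minimal sketch in Python (the symbol distribution `p` and the function names are illustrative, not part of the article):

```python
import itertools
import math

def partition_entropy(weights):
    """H(xi) = -sum over cells C of mu(C) log mu(C) (natural logarithm)."""
    return -sum(w * math.log(w) for w in weights if w > 0)

def joined_weights(p, n):
    """Cell measures of xi v S^{-1}xi v ... v S^{-(n-1)}xi for the Bernoulli
    shift with symbol probabilities p: the cells are the length-n cylinders,
    whose measures are products of symbol probabilities."""
    return [math.prod(word) for word in itertools.product(p, repeat=n)]

p = [0.5, 0.25, 0.25]  # hypothetical symbol distribution
for n in (1, 2, 5):
    print(n, partition_entropy(joined_weights(p, n)) / n)  # constant in n
```

Here the ratio $H ( \xi _ {S} ^ {n} ) / n$ does not merely converge: it equals $H ( \xi )$ for every $n$, because the coordinates of a Bernoulli shift are independent.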

Originally the entropy was defined by A.N. Kolmogorov somewhat differently (see ); the version given above came later (see ). In the basic case of an aperiodic automorphism of a Lebesgue space the definitions are ultimately equivalent.

It turns out that $h ( S ^ {n} ) = n h ( S)$, and if $S$ is an automorphism, then $h ( S ^ {-1} ) = h ( S)$. Therefore, the entropy of a cascade $\{ S ^ {n} \}$ is naturally taken to be $h ( S)$. For a measurable flow $\{ S _ {t} \}$ it turns out that $h ( S _ {t} ) = | t | h ( S _ {1} )$, so the entropy of a flow is naturally taken to be $h ( S _ {1} )$. The definition of the entropy for other transformation groups with an invariant measure is somewhat different. (It does not reduce to the entropy of a single transformation in the group; see , .) There are modifications of the entropy for the case of an infinite invariant measure; another modification is the $A$-entropy (where $A = \{ k _ {n} \}$ is an ascending sequence of natural numbers), which is obtained when $\xi _ {S} ^ {n}$ is replaced by

$$S ^ {- k _ {1} } \xi \lor \dots \lor S ^ {- k _ {n} } \xi$$

and $\lim$ by $\overline{\lim}$ (see ).

The entropy is a metric isomorphism invariant of dynamical systems and is fundamentally different from the earlier-known invariants, which are mostly connected with the spectrum of a dynamical system. In particular, by means of the entropy of Bernoulli automorphisms (cf. Bernoulli automorphism; see ) it was first established that there exist non-isomorphic ergodic systems with the same continuous spectrum (in contrast with the situation for a discrete spectrum). More broadly, the entropy gave rise to a new direction in ergodic theory: the entropy theory of dynamical systems (see , , and Ergodic theory).
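The classical example behind this remark: the Bernoulli shifts $B ( 1/2 , 1/2 )$ and $B ( 1/3 , 1/3 , 1/3 )$ both have countable Lebesgue spectrum, so no spectral invariant distinguishes them, yet their entropies differ. A short check (function name illustrative):

```python
import math

def bernoulli_entropy(p):
    """h(S) of the Bernoulli shift B(p) equals -sum p_i log p_i."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Both shifts have countable Lebesgue spectrum; their entropies are
# log 2 and log 3 respectively, so they are not metrically isomorphic.
print(bernoulli_entropy([1/2, 1/2]))       # log 2
print(bernoulli_entropy([1/3, 1/3, 1/3]))  # log 3
```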

The entropy provides a tool for characterizing the rate of mixing of sets of small measure (more accurately, of collections of such sets forming a partition). Alongside this "global" role, the entropy also plays a "local" role, which is established by Breiman's ergodic theorem (the individual ergodic theorem of information theory): for ergodic $S$ and almost-all $x$,

$$\frac{1}{n} | \mathop{\rm log} \mu ( C _ {\xi _ {S} ^ {n} } ( x) ) | \rightarrow h ( S , \xi ) \ \textrm{ as } n \rightarrow \infty ,$$

where $C _ \eta ( x)$ is the element of the partition $\eta$ containing $x$ and the logarithm is taken to the same base as in the definition of $H$ (see , ). (Breiman's theorem is true for $\xi$ with $H ( \xi ) < \infty$ but, generally speaking, not for countable $\xi$ with $H ( \xi ) = \infty$; there are variants for non-ergodic $S$ (see , ) and for an infinite $\mu$. A weaker assertion, on convergence in the sense of $l _ {1}$, has been proved for a certain general class of transformation groups.)
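Breiman's theorem is easy to observe numerically for a Bernoulli shift: the cylinder $C _ {\xi _ {S} ^ {n} } ( x)$ has measure equal to the product of the symbol probabilities along the orbit, so $\frac{1}{n} | \mathop{\rm log} \mu |$ is a Birkhoff average of $- \mathop{\rm log} p$. A minimal simulation (the measure `p`, sample size, and seed are arbitrary choices for illustration):

```python
import math
import random

random.seed(0)  # deterministic illustration
p = [0.5, 0.25, 0.25]                 # illustrative Bernoulli measure
h = -sum(x * math.log(x) for x in p)  # h(S, xi) for the coordinate partition

n = 200_000
symbols = random.choices(range(len(p)), weights=p, k=n)
# mu(C_{xi_S^n}(x)) is the product of the symbol probabilities seen along
# the orbit, so (1/n)|log mu(...)| averages -log p(symbol), which converges
# to h(S, xi) for almost every x.
estimate = sum(-math.log(p[s]) for s in symbols) / n
print(estimate, h)  # close for large n
```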

For smooth dynamical systems with a smooth invariant measure a connection has been established between the entropy and the Lyapunov characteristic exponent of the equations in variations (see ).
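A classical one-dimensional illustration of this connection is the logistic map $f ( x) = 4 x ( 1 - x)$, whose smooth invariant measure has Lyapunov exponent $\mathop{\rm log} 2$, equal to its entropy. A Monte Carlo sketch, sampling the invariant measure through the standard conjugacy $x = \sin ^ {2} ( \pi u / 2 )$ (the sample size and seed are arbitrary):

```python
import math
import random

random.seed(2)  # deterministic illustration
# For f(x) = 4x(1-x) one has f'(x) = 4 - 8x.  The absolutely continuous
# invariant measure is the image of Lebesgue measure under u -> sin^2(pi*u/2),
# so the Lyapunov exponent is the average of log|f'| over such samples.
n = 500_000
acc = 0.0
for _ in range(n):
    x = math.sin(math.pi * random.random() / 2) ** 2
    acc += math.log(abs(4 - 8 * x))
print(acc / n)  # close to log 2 = 0.6931..., the entropy of this map
```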

The name "entropy" is explained by the analogy between the entropy of dynamical systems and the entropies of information theory and statistical physics, up to the fact that in certain examples these entropies coincide (see, for example, , ). The analogy with statistical physics was one of the stimuli for introducing into ergodic theory (even in a not purely metric context and for topological dynamical systems, cf. Topological dynamical system) such new concepts as "Gibbsian measures", the "topological pressure" (an analogue of the free energy) and the "variational principle" for the latter (see the references to $Y$-system; Topological entropy).

How to Cite This Entry:
Metric entropy. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Metric_entropy&oldid=50924
This article was adapted from an original article by D.V. Anosov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.