Entropy theory of a dynamical system
2020 Mathematics Subject Classification: Primary: 37A35 Secondary: 60G10
A branch of ergodic theory closely connected with probability theory and information theory. In broad outline, the nature of this connection is as follows.
Let $\{S_t\}$ be a dynamical system (usually a measurable flow or a cascade) with phase space $W$ and invariant measure $\mu$. Let $f$ be a measurable function on $W$ and let $\xi$ be the measurable decomposition (measurable partition) of $W$ into the inverse images $f^{-1}(a)$, $a \in \mathbf R$. (For what follows it is sufficient to consider functions $f$ taking a countable, and as a rule even finite, number of values, and the corresponding partitions $\xi$.) Then

$$X_t(w) = f(S_t w)$$
is a stationary stochastic process (in the narrow sense of the word) with $(W,\mu)$ as space of elementary events. Usually this can be regarded as a process whose space of elementary events is the space of sample functions (cf. Sample function) $\Omega$, endowed with a suitable measure $\nu$, with $X_t(\omega) = \omega(t)$. The mapping

$$\pi : W \to \Omega, \qquad (\pi w)(t) = f(S_t w),$$
is a homomorphism of measure spaces (see the definition in the article Metric isomorphism) that carries $S_t$ into the shift $T_t$, where $(T_t\omega)(s) = \omega(s+t)$.
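To make the construction concrete, here is a minimal hypothetical sketch in Python (the choice of the doubling map and the names S, f and sample_path are illustrative assumptions, not taken from the article): the cascade is the doubling map on the unit interval with Lebesgue measure, $f$ is a two-valued observable, and the resulting process $X_t(w) = f(S^t w)$ is the coin-tossing (Bernoulli) process.

```python
import random

# A minimal hypothetical illustration (not from the article): the cascade is
# the doubling map S(w) = 2w mod 1 on W = [0, 1) with Lebesgue measure mu,
# and f is the indicator of [1/2, 1).  The partition xi then has the two
# elements f^{-1}(0) = [0, 1/2) and f^{-1}(1) = [1/2, 1).

def S(w):
    """One step of the doubling map (a measure-preserving cascade)."""
    return (2.0 * w) % 1.0

def f(w):
    """Two-valued observable defining the partition xi."""
    return 1 if w >= 0.5 else 0

def sample_path(w, n):
    """First n values of the stationary process X_t(w) = f(S^t w)."""
    path = []
    for _ in range(n):
        path.append(f(w))
        w = S(w)
    return path

# For Lebesgue-typical w the path is a fair coin-tossing sequence: the image
# process is the Bernoulli (1/2, 1/2) shift, and xi is a one-sided generator
# for this cascade.
print(sample_path(random.random(), 20))
```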
The process $\{X_t\}$ contains some information about the original system $\{S_t\}$. This can even be complete information, namely when $\pi$ is an isomorphism. (One then says that $\xi$ is a generator for $\{S_t\}$; if $S$ is an automorphism, the partition $\xi$ is called a one-sided generator for $S$ if it is a generator for the cascade $\{S^n\}_{n \geq 0}$, and a two-sided generator for $S$ if it is a generator for $\{S^n\}_{n \in \mathbf Z}$.) However, $\{X_t\}$ also depends on the choice of $f$, that is, first of all, on $\xi$ (the specific values of $f$ on the elements of $\xi$ are less important here). Of interest in ergodic theory are those properties of an individual process $\{X_t\}$, or of a collection of such processes (obtained for various $f$), that are properties of the system $\{S_t\}$ itself. However, for a long time it was not easy to select such properties unless they reduced to known ones.
This difficulty was successfully overcome in the middle of the 1950s by A.N. Kolmogorov, who introduced a fundamentally new (non-spectral) invariant, the metric entropy of a dynamical system, and emphasized the role of increasing measurable partitions $\zeta$, that is, those for which $S_t\zeta$ is finer than $S_s\zeta$ ($S_t\zeta \geq S_s\zeta$) for $t > s$. (In this way such a partition describes the "past" of the process $\{X_t\}$; see also $K$-system; Exact endomorphism.) The elaboration of this range of problems (including that of the existence and properties of generating partitions) is the object of the entropy theory of dynamical systems in the form in which it took shape in the middle of the 1960s (see [R]). A substantial addition was the more complete, and somewhat more special, theory of D. Ornstein, in which auxiliary stochastic processes are used in a more direct way (see [O]). Since invariance under metric isomorphism has to be ensured, probability-theoretic and information-theoretic ideas pervade both the "Kolmogorov" and the "Ornstein" entropy theory of dynamical systems in an essentially transformed form.
Two conditions of "regularity" type for a stochastic process that occur in the entropy theory of dynamical systems may serve as examples. One of them leads to the definition of a $K$-system. The other, more restrictive one, the very weak Bernoulli property, turns out to be necessary and sufficient for the shift in the space of sample functions to be isomorphic to a Bernoulli automorphism. It can be verified in a number of examples whose original definitions have no relation to stochastic processes.
References
[R] V.A. Rokhlin, "Lectures on the entropy theory of measure-preserving transformations", Russian Math. Surveys 22 : 5 (1967) pp. 1–52; Uspekhi Mat. Nauk 22 : 5 (1967) pp. 3–56. Zbl 0174.45501
[O] D. Ornstein, "Ergodic theory, randomness, and dynamical systems", Yale Univ. Press (1974). MR0447525 Zbl 0296.28016
See also the references to $K$-system; Entropy; Ergodic theory.
Comments
Entropy in the theory of dynamical systems is defined as follows (cf. also Entropy). For every measurable partition $\xi = \{A_1, A_2, \ldots\}$ of a probability space $(X, \mu)$, the entropy of $\xi$ is defined as

$$H(\xi) = -\sum_i \mu(A_i)\log\mu(A_i)$$
(it is assumed that $0\log 0 = 0$). The base of the logarithm can be any positive number, but as a rule one takes logarithms to the base $2$ or $e$.
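A direct transcription of this definition, as a minimal Python sketch (the function name partition_entropy and the base-2 default are illustrative assumptions, not from the article):

```python
import math

def partition_entropy(masses, base=2):
    """H(xi) = -sum_i mu(A_i) log mu(A_i), with the convention 0 log 0 = 0.

    `masses` lists the measures mu(A_i) of the elements of the partition xi;
    they are assumed to sum to 1.
    """
    return -sum(m * math.log(m, base) for m in masses if m > 0)

# Example: a partition into sets of measures 1/2, 1/4, 1/8, 1/8 has entropy
# 1/2 + 2/4 + 3/8 + 3/8 = 1.75 bits.
print(partition_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```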
Then the entropy of a measure-preserving transformation $T$ of $(X,\mu)$ (i.e. of a cascade) with respect to a partition $\xi$ is defined by

$$h(T,\xi) = \lim_{n\to\infty}\frac{1}{n}\, H\Bigl(\bigvee_{i=0}^{n-1} T^{-i}\xi\Bigr),$$
where $\bigvee_{i=0}^{n-1} T^{-i}\xi$ denotes the common refinement of the partitions $\xi, T^{-1}\xi, \ldots, T^{-(n-1)}\xi$. Finally, the entropy of $T$ is defined as

$$h(T) = \sup_{\xi}\, h(T,\xi),$$
where the supremum is over all finite measurable partitions $\xi$ of $X$.
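As a classical worked example (standard material, not spelled out in the original comments): for a Bernoulli automorphism $T$ with state probabilities $p_1,\dots,p_k$, the independent state partition $\xi$ is a generator and attains the supremum:

```latex
% Entropy of a Bernoulli automorphism T with state probabilities p_1,...,p_k.
% The elements of the n-fold join are independent cylinders, so
\[
   H\Bigl(\bigvee_{i=0}^{n-1} T^{-i}\xi\Bigr) = n\,H(\xi),
   \qquad\text{hence}\qquad
   h(T,\xi) = H(\xi) = -\sum_{i=1}^{k} p_i \log p_i .
\]
% Since \xi is a (two-sided) generator, the Kolmogorov–Sinai theorem gives
\[
   h(T) = -\sum_{i=1}^{k} p_i \log p_i .
\]
```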
Since for a flow $\{T_t\}$ in $(X,\mu)$ one has $h(T_t) = |t|\, h(T_1)$ for all $t$, one usually defines the entropy of the flow $\{T_t\}$ by

$$h(\{T_t\}) = h(T_1).$$
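The definitions for a cascade can also be checked numerically. The sketch below is hypothetical: it reuses the doubling map from the earlier illustration (written $T(w) = 2w \bmod 1$ here, to match the notation of these comments) together with the binary partition $\xi = \{[0,1/2),\,[1/2,1)\}$, and the sample sizes and function names are arbitrary choices. It estimates $h(T,\xi)$ directly from the definition by Monte Carlo approximation of the measures of the elements of $\bigvee_{i=0}^{n-1} T^{-i}\xi$; the exact value is $h(T,\xi) = h(T) = \log 2$, i.e. one bit per step.

```python
import math
import random
from collections import Counter

def symbolic_word(w, n):
    """The element of xi v T^{-1}xi v ... v T^{-(n-1)}xi containing w,
    coded as the 0/1 word of the first n binary digits of w."""
    word = []
    for _ in range(n):
        word.append(1 if w >= 0.5 else 0)
        w = (2.0 * w) % 1.0
    return tuple(word)

def estimated_entropy_rate(n=10, samples=200_000, seed=0):
    """(1/n) H(n-fold join), estimated from Lebesgue-random sample points;
    for moderate n this approximates h(T, xi)."""
    rng = random.Random(seed)
    counts = Counter(symbolic_word(rng.random(), n) for _ in range(samples))
    H = -sum((c / samples) * math.log2(c / samples) for c in counts.values())
    return H / n

# Every dyadic cylinder of length n has Lebesgue measure 2^{-n}, so the exact
# value is H = n bits and h(T, xi) = 1 bit per step; since xi is a generator,
# h(T) = log 2 (one bit) as well.
print(estimated_entropy_rate())   # prints a value close to 1.0
```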
References
[CFS] I.P. Cornfel'd, S.V. Fomin, Ya.G. Sinai, "Ergodic theory", Springer (1982) (Translated from Russian). MR832433
[M] R. Mañé, "Ergodic theory and differentiable dynamics", Springer (1987). MR0889254 Zbl 0616.28007