
Metric entropy


of a dynamical system

One of the most important invariants in ergodic theory. Basic is the concept of the entropy $h(S)$ of an endomorphism $S$ (see Metric isomorphism) of a Lebesgue space $(X,\mu)$. For any finite measurable decomposition (measurable partition) $\xi$ the limit

$$h(S,\xi) = \lim_{n\to\infty} \frac{1}{n} H\left( \xi \vee S^{-1}\xi \vee \cdots \vee S^{-(n-1)}\xi \right)$$

(the entropy of $S$ in unit time relative to $\xi$) exists, where $H(\xi)$ is the entropy (cf. Entropy of a measurable decomposition) of $\xi$, and $\xi \vee \eta$ is the partition whose elements are the intersections of the elements of $\xi$ and $\eta$. (This definition carries over verbatim to countable $\xi$ with $H(\xi) < \infty$; by another method $h(S,\xi)$ can be defined for any measurable $\xi$.) The entropy $h(S)$ is defined as the least upper bound of the $h(S,\xi)$ over all possible finite measurable $\xi$. (It may be $\infty$; the use of all countable $\xi$ with $H(\xi) < \infty$ or of all measurable $\xi$ yields the same entropy.)
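To make the definition concrete, here is a minimal numerical sketch (an addition, not part of the original article) for the Bernoulli shift $S$ over the alphabet $\{0,1\}$ with coordinate distribution $(p, 1-p)$, taking for $\xi$ the time-zero coordinate partition; the cylinder measures are explicit, so the quantities $H(\xi \vee S^{-1}\xi \vee \cdots \vee S^{-(n-1)}\xi)/n$ can be evaluated exactly:

    # Sketch (not from the article): the ratios H(xi^n)/n, where
    # xi^n = xi v S^{-1}xi v ... v S^{-(n-1)}xi, for the Bernoulli(p) shift S
    # and the time-zero coordinate partition xi.  A word of length n with
    # j ones has measure p^j (1-p)^(n-j), and there are C(n, j) such words.
    import math

    def block_entropy(p, n):
        """H(xi^n) for the Bernoulli(p) shift (natural logarithm)."""
        total = 0.0
        for j in range(n + 1):
            cell = p**j * (1 - p)**(n - j)        # measure of one word with j ones
            mass = math.comb(n, j) * cell         # total measure of all such words
            total -= mass * math.log(cell)        # sum of -mu(w) log mu(w) over them
        return total

    p = 0.3                                       # hypothetical parameter choice
    h = -(p * math.log(p) + (1 - p) * math.log(1 - p))   # H(xi), which here equals h(S, xi)
    for n in (1, 5, 20):
        print(n, block_entropy(p, n) / n, h)      # the ratio equals h for every n

Since the coordinates are independent, $H(\xi^n) = n\,H(\xi)$ and the ratio is constant in $n$; for a general endomorphism the sequence $H(\xi^n)/n$ is non-increasing and converges to $h(S,\xi)$.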

Originally the entropy was defined by A.N. Kolmogorov somewhat differently (see [1a], [1b]); the version given above came later (see [2]). In the basic case of an aperiodic automorphism of a Lebesgue space the definitions are ultimately equivalent [3].

It turns out that $h(S^k) = k\,h(S)$ for $k > 0$, and if $S$ is an automorphism, then $h(S^{-1}) = h(S)$. Therefore, the entropy of a cascade $\{S^k\}$ is naturally taken to be $h(S)$. For a measurable flow $\{S_t\}$ it turns out that $h(S_t) = |t|\,h(S_1)$. Therefore the entropy of a flow is naturally taken to be $h(S_1)$. The definition of the entropy for other transformation groups with an invariant measure is somewhat different. (It does not reduce to the entropy of a single transformation in the group; see [5], [6].) There are modifications of the entropy for the case of an infinite invariant measure [7]; another modification is the $A$-entropy (where $A = \{a_1 < a_2 < \cdots\}$ is an ascending sequence of natural numbers), which is obtained when $H(\xi \vee S^{-1}\xi \vee \cdots \vee S^{-(n-1)}\xi)$ is replaced by

$$H\left( S^{-a_1}\xi \vee S^{-a_2}\xi \vee \cdots \vee S^{-a_n}\xi \right)$$

and $\lim$ by $\limsup$ (see [8]).
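As a worked illustration of the rule $h(S^k) = k\,h(S)$ (an example added here, not part of the original article): for a Bernoulli automorphism $S$ with state probabilities $p_1, \dots, p_m$ one has $h(S) = -\sum_i p_i \log p_i$, and $S^k$ is again a Bernoulli automorphism whose states are the $k$-letter words $w = (i_1, \dots, i_k)$ of measure $\mu(w) = p_{i_1} \cdots p_{i_k}$, so

$$h(S^k) = -\sum_w \mu(w) \log \mu(w) = -\sum_{i_1, \dots, i_k} p_{i_1} \cdots p_{i_k} \left( \log p_{i_1} + \cdots + \log p_{i_k} \right) = k\,h(S).$$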

The entropy is a metric isomorphism invariant of dynamical systems and is fundamentally different from the earlier-known invariants, which are basically connected with the spectrum of a dynamical system. In particular, by means of the entropy of Bernoulli automorphisms (cf. Bernoulli automorphism; see [1a], [1b]) it was first established that there exist non-isomorphic ergodic systems with the same continuous spectrum (which contrasts with the situation for a discrete spectrum). In a wider setting the role of the entropy is related to the fact that a new trend arose in ergodic theory: the entropy theory of dynamical systems (see [3], [4], and Ergodic theory).
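For example (a standard illustration added here, not in the original text): the Bernoulli automorphisms $B(1/2, 1/2)$ and $B(1/3, 1/3, 1/3)$ both have countable Lebesgue spectrum, and so are spectrally isomorphic, but

$$h\left( B(1/2, 1/2) \right) = \log 2 \neq \log 3 = h\left( B(1/3, 1/3, 1/3) \right),$$

so they cannot be metrically isomorphic.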

The entropy provides a tool for characterizing the rate of mixing of sets of small measure (more accurately, of the sets that form the partition $\xi$). Side-by-side with this "global" role, the entropy also plays a "local" role, which is established by Breiman's ergodic theorem (an individual ergodic theorem of information theory): For ergodic $S$ and almost-all $x$,

$$\lim_{n\to\infty} -\frac{1}{n} \log \mu\left( C_n(x) \right) = h(S,\xi),$$

where $C_n(x)$ is the element of the partition $\xi \vee S^{-1}\xi \vee \cdots \vee S^{-(n-1)}\xi$ containing $x$ and the logarithm is taken to the same base as in the definition of $H$ (see [9a], [9b], [4]). (Breiman's theorem is true for countable $\xi$ with $H(\xi) < \infty$ [10], but, generally speaking, not for countable $\xi$ with $H(\xi) = \infty$ [11]; there are variants for non-ergodic $S$ (see [4], [12]) and an infinite invariant measure [13]. A weaker assertion on the convergence in the sense of $L_1(X,\mu)$ has been proved for a certain general class of transformation groups [6].)
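Breiman's theorem is easy to observe numerically; the following sketch (an addition, not part of the original article) does so for the Bernoulli($p$) shift, where the measure of the element $C_n(x)$ is an explicit product:

    # Sketch (not from the article): Breiman's theorem for the Bernoulli(p)
    # shift, with xi the time-zero coordinate partition.  The element C_n(x)
    # containing x is the cylinder fixed by x_0, ..., x_{n-1}, so
    #     -(1/n) log mu(C_n(x)) = -(1/n) * sum_k log p(x_k),
    # which converges almost everywhere to h(S, xi).
    import math
    import random

    random.seed(0)
    p = 0.3                                              # P(x_k = 1); hypothetical choice
    h = -(p * math.log(p) + (1 - p) * math.log(1 - p))   # h(S, xi) for the Bernoulli(p) shift

    x = [1 if random.random() < p else 0 for _ in range(100000)]   # coding of one "typical" point
    for n in (100, 1000, 100000):
        log_mu = sum(math.log(p if xk == 1 else 1 - p) for xk in x[:n])
        print(n, -log_mu / n, h)                         # -(1/n) log mu(C_n(x)) approaches h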

For smooth dynamical systems with a smooth invariant measure a connection has been established between the entropy and the Lyapunov characteristic exponents (cf. Lyapunov characteristic exponent) of the equations in variations (see [14]–[16]).
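In the form established in [15], [16] (statement added here for convenience): for a $C^2$ diffeomorphism $f$ of a compact manifold preserving a measure $\mu$ with smooth positive density,

$$h_\mu(f) = \int_X \sum_{i:\, \lambda_i(x) > 0} \lambda_i(x) \, d\mu(x),$$

where $\lambda_1(x) \geq \lambda_2(x) \geq \cdots$ are the Lyapunov characteristic exponents at $x$, counted with multiplicity (Pesin's formula).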

The name "entropy" is explained by the analogy between the entropy of dynamical systems and that in information theory and statistical physics, right up to the fact that in certain examples these entropies are the same (see, for example, [4], [17]). The analogy with statistical physics was one of the stimuli for introducing in ergodic theory (even in a not-purely metric context and for topological dynamical systems, cf. Topological dynamical system) new concepts such as "Gibbsian measures", the "topological pressure" (an analogue of the free energy) and the "variational principle" for the latter (see the references to $Y$-system; Topological entropy).

References

[1a] A.N. Kolmogorov, "A new metric invariant of transitive dynamical systems, and Lebesgue space automorphisms" Dokl. Akad. Nauk SSSR , 119 : 5 (1958) pp. 861–864 (In Russian)
[1b] A.N. Kolmogorov, "On entropy per unit time as a metric invariant of automorphisms" Dokl. Akad. Nauk SSSR , 124 : 4 (1959) pp. 754–755 (In Russian)
[2] Ya.G. Sinai, "On the notion of entropy of dynamical systems" Dokl. Akad. Nauk SSSR , 124 : 4 (1959) pp. 768–771 (In Russian)
[3] V.A. Rokhlin, "Lectures on the entropy theory of transformations with invariant measure" Russian Math. Surveys , 22 : 5 (1967) pp. 1–52 Uspekhi Mat. Nauk , 22 : 5 (1967) pp. 3–56
[4] P. Billingsley, "Ergodic theory and information" , Wiley (1965)
[5] A.V. Safonov, "Informational pasts in groups" Math. USSR. Izv. , 22 (1984) pp. 393–398 Izv. Akad. Nauk SSSR Ser. Mat. , 47 : 2 (1983) pp. 421–426
[6] J.C. Kieffer, "A generalized Shannon–McMillan theorem for the action of an amenable group on a probability space" Ann. of Probab. , 3 : 6 (1975) pp. 1031–1037
[7] U. Krengel, "Entropy of conservative transformations" Z. Wahrscheinlichkeitstheor. Verw. Geb. , 7 : 3 (1967) pp. 161–181
[8] A.G. Kushnirenko, "Metric invariants of entropy type" Russian Math. Surveys , 22 : 5 (1967) pp. 53–61 Uspekhi Mat. Nauk , 22 : 5 (1967) pp. 57–65
[9a] L. Breiman, "The individual ergodic theorem of information theory" Ann. Math. Stat. , 28 : 3 (1957) pp. 809–811
[9b] L. Breiman, "Correction to "The individual ergodic theorem of information theory" " Ann. Math. Stat. , 31 : 3 (1960) pp. 809–810
[10] K.L. Chung, "A note on the ergodic theorem of information theory" Ann. Math. Stat. , 32 : 3 (1961) pp. 612–614
[11] B.S. Pitskel', "Nonuniform distribution of entropy for processes with a countable set of states" Probl. Peredachi Inform. , 12 : 2 (1976) pp. 98–103 (In Russian)
[12] A. Ionescu Tulcea, "Contributions to information theory for abstract alphabets" Arkiv för Mat. , 4 : 2–3 (1961) pp. 235–247
[13] E.M. Klimko, L. Sucheston, "On convergence of information in spaces with infinite invariant measure" Z. Wahrscheinlichkeitstheor. Verw. Geb. , 10 : 3 (1968) pp. 226–235
[14] V.M. Millionshchikov, "A formula for the entropy of smooth dynamical systems" Differential Eq. , 12 (1976) pp. 1527–1530 Differents. Uravnen. , 12 : 12 (1976) pp. 2188–2192
[15] Ya.B. Pesin, "Characteristic Lyapunov exponents, and smooth ergodic theory" Russian Math. Surveys , 32 : 4 (1977) pp. 55–114 Uspekhi Mat. Nauk , 32 : 4 (1977) pp. 55–112
[16] R. Mañé, "A proof of Pesin's formula" Ergod. Th. and Dynam. Syst. , 1 : 1 (1981) pp. 95–102
[17] D.W. Robinson, D. Ruelle, "Mean entropy of states in classical statistical mechanics" Comm. Math. Phys. , 5 : 4 (1967) pp. 288–300


Comments

Instead of $A$-entropy the term sequence entropy is used in the English literature. See e.g. [a1], § 4.11. For several useful recent references concerning the computation of entropy, see [a2].

References

[a1] P. Walters, "An introduction to ergodic theory" , Springer (1982)
[a2] M.P. Wojtkowski, "Measure theoretic entropy of the system of hard spheres" Ergod. Th. and Dynam. Syst. , 8 (1988) pp. 133–153
How to Cite This Entry:
Metric entropy. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Metric_entropy&oldid=16474
This article was adapted from an original article by D.V. Anosov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.