Metric entropy

of a dynamical system

One of the most important invariants in ergodic theory. The basic notion is the entropy $h(S)$ of an endomorphism $S$ (see Metric isomorphism) of a Lebesgue space $(X, \mu)$. For any finite measurable decomposition (measurable partition) $\xi$ the limit

$$ h(S, \xi) = \lim_{n \rightarrow \infty} \frac{1}{n} H(\xi_S^n), $$

$$ \xi_S^n = \xi \lor S^{-1}\xi \lor \dots \lor S^{-(n-1)}\xi $$

(the entropy of $\xi$ in unit time relative to $S$) exists, where $H(\xi)$ is the entropy (cf. Entropy of a measurable decomposition) of $\xi$, and $\xi \lor \eta$ is the partition whose elements are the intersections of the elements of $\xi$ and $\eta$. (This definition carries over verbatim to $\xi$ with $H(\xi) < \infty$; by another method $h(S, \xi)$ can be defined for any measurable $\xi$.) The entropy $h(S)$ is defined as the least upper bound of $h(S, \xi)$ over all possible finite measurable $\xi$. (It may be $\infty$; using all $\xi$ with $H(\xi) < \infty$, or all measurable $\xi$, yields the same entropy.)
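
As a concrete numerical sketch (an illustration added here, not part of the article; the map, partition and sample size are assumptions of the example), one can estimate $H(\xi_S^n)/n$ for the doubling map $S(x) = 2x \bmod 1$ on $[0, 1)$ with Lebesgue measure and the partition $\xi = \{[0, 1/2), [1/2, 1)\}$; here $\xi_S^n$ consists of the $2^n$ dyadic intervals, so $h(S, \xi) = \log 2$.

```python
import numpy as np

# Sketch: estimate H(xi_S^n)/n for the doubling map S(x) = 2x mod 1
# with the partition xi = {[0,1/2), [1/2,1)}.  The refinement
# xi_S^n = xi v S^{-1}xi v ... v S^{-(n-1)}xi is the dyadic partition
# into 2^n intervals, and h(S, xi) = log 2.
rng = np.random.default_rng(0)
x = rng.random(200_000)  # samples from the invariant (Lebesgue) measure

for n in (1, 4, 8, 12):
    # the cell of xi_S^n containing x is coded by its first n binary digits
    codes = np.floor(x * 2**n).astype(np.int64)
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()          # empirical cell measures
    H = -(p * np.log(p)).sum()         # entropy H(xi_S^n)
    print(n, H / n)                    # approaches log 2 ~ 0.6931
```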

Originally the entropy was defined by A.N. Kolmogorov somewhat differently (see [1a], [1b]); the version given above came later (see [2]). In the basic case of an aperiodic automorphism of a Lebesgue space the definitions are ultimately equivalent [3].

It turns out that $h(S^n) = n h(S)$, and if $S$ is an automorphism, then $h(S^{-1}) = h(S)$. Therefore, the entropy of a cascade $\{S^n\}$ is naturally taken to be $h(S)$. For a measurable flow $\{S_t\}$ it turns out that $h(S_t) = |t|\, h(S_1)$; therefore the entropy of a flow is naturally taken to be $h(S_1)$. The definition of the entropy for other transformation groups with an invariant measure is somewhat different (it does not reduce to the entropy of a single transformation in the group; see [5], [6]). There are modifications of the entropy for the case of an infinite invariant measure [7]; another modification is the $A$-entropy (where $A = \{k_n\}$ is an increasing sequence of natural numbers), which is obtained when $\xi_S^n$ is replaced by

$$ S^{-k_1} \xi \lor \dots \lor S^{-k_n} \xi $$

and $\lim$ by $\overline{\lim}$ (see [8]).
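
As a worked check of the relation $h(S^n) = n\, h(S)$ (a standard computation added here for illustration; it is not part of the original article): for a Bernoulli automorphism with state probabilities $p_1, \dots, p_k$ one has $h(S) = -\sum_i p_i \log p_i$, and $S^n$ is itself a Bernoulli shift over the alphabet of $n$-blocks, with block probabilities $p_{i_1} \cdots p_{i_n}$, so

$$ h(S^n) = -\sum_{i_1, \dots, i_n} p_{i_1} \cdots p_{i_n} \log ( p_{i_1} \cdots p_{i_n} ) = n \Bigl( -\sum_i p_i \log p_i \Bigr) = n\, h(S). $$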

The entropy is a metric isomorphism invariant of dynamical systems and is fundamentally different from the previously known invariants, which are basically connected with the spectrum of a dynamical system. In particular, by means of the entropy of Bernoulli automorphisms (cf. Bernoulli automorphism; see [1a], [1b]) it was first established that there exist non-isomorphic ergodic systems with the same continuous spectrum (in contrast with the situation for a discrete spectrum). In a wider setting the role of the entropy is related to the fact that a new trend arose in ergodic theory: the entropy theory of dynamical systems (see [3], [4], and Ergodic theory).

The entropy provides a tool for characterizing the rate of mixing of sets of small measure (more precisely, of the sets that make up a partition). Alongside this "global" role, the entropy also plays a "local" role, which is established by Breiman's ergodic theorem (the individual ergodic theorem of information theory): for ergodic $S$ and almost all $x$,

$$ \frac{1}{n} \left| \log \mu ( C_{\xi_S^n} (x) ) \right| \rightarrow h(S, \xi) \quad \textrm{ for } n \rightarrow \infty, $$

where $C_\eta(x)$ is the element of the partition $\eta$ containing $x$, and the logarithm is taken to the same base as in the definition of $H$ (see [9a], [9b], [4]). (Breiman's theorem is true for $\xi$ with $H(\xi) < \infty$ [10], but, generally speaking, not for countable $\xi$ with $H(\xi) = \infty$ [11]; there are variants for non-ergodic $S$ (see [4], [12]) and for an infinite $\mu$ [13]. A weaker assertion, on convergence in the sense of $L_1$, has been proved for a certain general class of transformation groups [6].)
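
A minimal numerical sketch of Breiman's theorem (an illustration added here, not from the article; the distribution and orbit length are arbitrary choices): for a Bernoulli shift, the measure of the cell $C_{\xi_S^n}(x)$ is the product of the probabilities of the first $n$ symbols of $x$, so the convergence can be watched along one simulated orbit.

```python
import numpy as np

# Breiman's theorem for the Bernoulli(0.2, 0.8) shift:
# -(1/n) log mu(C_{xi_S^n}(x)) -> h(S, xi) for almost every x.
p = np.array([0.2, 0.8])
h = -(p * np.log(p)).sum()                   # h(S, xi) = H(p) ~ 0.5004

rng = np.random.default_rng(1)
sym = rng.choice(len(p), size=100_000, p=p)  # symbolic coding of one orbit
log_mu = np.cumsum(np.log(p[sym]))           # log mu of the n-cylinder of x
for n in (10, 1_000, 100_000):
    print(n, -log_mu[n - 1] / n, "target:", h)
```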

For smooth dynamical systems with a smooth invariant measure a connection has been established between the entropy and the Lyapunov characteristic exponent of the equations in variations (see [14]–[16]).
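
Concretely, for a $C^2$ diffeomorphism $S$ preserving a smooth measure $\mu$ this connection takes the form of Pesin's entropy formula (see [15], [16]; the smoothness hypotheses are as stated there):

$$ h(S) = \int_X \sum_{i :\, \lambda_i(x) > 0} \lambda_i(x) \, d\mu(x), $$

where the $\lambda_i(x)$ are the Lyapunov characteristic exponents at $x$, counted with multiplicity.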

The name "entropy" is explained by the analogy between the entropy of dynamical systems and that in information theory and statistical physics, to the extent that in certain examples these entropies coincide (see, for example, [4], [17]). The analogy with statistical physics was one of the stimuli for introducing into ergodic theory (even in a not purely metric context, and for topological dynamical systems, cf. Topological dynamical system) new concepts such as "Gibbsian measures", the "topological pressure" (an analogue of the free energy) and the "variational principle" for the latter (see the references to $Y$-system; Topological entropy).

References

[1a] A.N. Kolmogorov, "A new metric invariant of transitive dynamical systems, and Lebesgue space automorphisms" Dokl. Akad. Nauk SSSR , 119 : 5 (1958) pp. 861–864 (In Russian)
[1b] A.N. Kolmogorov, "On entropy per unit time as a metric invariant of automorphisms" Dokl. Akad. Nauk SSSR , 124 : 4 (1959) pp. 754–755 (In Russian)
[2] Ya.G. Sinai, "On the notion of entropy of dynamical systems" Dokl. Akad. Nauk SSSR , 124 : 4 (1959) pp. 768–771 (In Russian)
[3] V.A. Rokhlin, "Lectures on the entropy theory of transformations with invariant measure" Russian Math. Surveys , 22 : 5 (1967) pp. 1–52 Uspekhi Mat. Nauk , 22 : 5 (1967) pp. 3–56
[4] P. Billingsley, "Ergodic theory and information" , Wiley (1965)
[5] A.V. Safonov, "Information parts in groups" Math. USSR. Izv. , 22 (1984) pp. 393–398 Izv. Akad. Nauk SSSR Ser. Mat. , 47 : 2 (1983) pp. 421–426
[6] J.C. Kieffer, "A generalized Shannon–McMillan theorem for the action of an amenable group on a probability space" Ann. of Probab. , 3 : 6 (1975) pp. 1031–1037
[7] U. Krengel, "Entropy of conservative transformations" Z. Wahrscheinlichkeitstheor. Verw. Geb. , 7 : 3 (1967) pp. 161–181
[8] A.G. Kushnirenko, "Metric invariants of entropy type" Russian Math. Surveys , 22 : 5 (1967) pp. 53–61 Uspekhi Mat. Nauk , 22 : 5 (1967) pp. 37–65
[9a] L. Breiman, "The individual ergodic theorem of information theory" Ann. Math. Stat. , 28 : 3 (1957) pp. 809–811
[9b] L. Breiman, "Correction to "The individual ergodic theorem of information theory" " Ann. Math. Stat. , 31 : 3 (1960) pp. 809–810
[10] K.L. Chung, "A note on the ergodic theorem of information theory" Ann. Math. Stat. , 32 : 3 (1961) pp. 612–614
[11] B.S. Pitskel', "Nonuniform distribution of entropy for processes with a countable set of states" Probl. Peredachi Inform. , 12 : 2 (1976) pp. 98–103 (In Russian)
[12] A. Ionescu Tulcea, "Contributions to information theory for abstract alphabets" Arkiv för Mat. , 4 : 2–3 (1961) pp. 235–247
[13] E.M. Klimko, L. Sucheston, "On convergence of information in spaces with infinite invariant measure" Z. Wahrscheinlichkeitstheor. Verw. Geb. , 10 : 3 (1968) pp. 226–235
[14] V.M. Millionshchikov, "A formula for the entropy of smooth dynamical systems" Differential Eq. , 12 (1976) pp. 1527–1530 Differents. Uravnen. , 12 : 12 (1976) pp. 2188–2192
[15] Ya.B. Pesin, "Characteristic Lyapunov exponents, and smooth ergodic theory" Russian Math. Surveys , 32 : 4 (1977) pp. 55–114 Uspekhi Mat. Nauk , 32 : 4 (1977) pp. 55–112
[16] R. Mañé, "A proof of Pesin's formula" Ergod. Th. and Dynam. Syst. , 1 : 1 (1981) pp. 95–102
[17] D.W. Robinson, D. Ruelle, "Mean entropy of states in classical statistical mechanics" Comm. Math. Phys. , 5 : 4 (1967) pp. 288–300

Comments

Instead of "$A$-entropy" the term "sequence entropy" is used in the English literature; see e.g. [a1], §4.11. For several useful recent references concerning the computation of entropy, see [a2].

References

[a1] P. Walters, "An introduction to ergodic theory" , Springer (1982)
[a2] M.P. Wojtkowski, "Measure theoretic entropy of the system of hard spheres" Ergod. Th. and Dynam. Syst. , 8 (1988) pp. 133–153
This article was adapted from an original article by D.V. Anosov (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.