# Ornstein isomorphism theorem


## Latest revision as of 13:02, 12 December 2013

Ergodic theory, the study of measure-preserving transformations or flows, arose from the study of the long-term statistical behaviour of dynamical systems (cf. also Measure-preserving transformation; Flow (continuous-time dynamical system); Dynamical system). Consider, for example, a billiard ball moving at constant speed on a rectangular table with a convex obstacle. The state of the system (the position and velocity of the ball), at one instant of time, can be described by three numbers or a point in Euclidean 3-dimensional space, and its time evolution by a flow on its state space, a subset of 3-dimensional space. The Lebesgue measure of a set does not change as it evolves and can be identified with its probability.

One can abstract the statistical properties (*e.g.*, ignoring sets of probability 0) and regard the state space as an abstract measure space. Formally, one says that two flows are isomorphic if there is a one-to-one measure-preserving (probability-preserving) correspondence between their state spaces so that corresponding sets evolve in the same way (*i.e.*, the correspondence is maintained for all time).

It is sometimes convenient to discretize time (*e.g.*, look at the flow once every minute), and this is also referred to as a transformation.

Measure-preserving transformations (or flows) also arise from the study of stationary processes (*cf.* also Stationary stochastic process). The simplest examples are independent processes such as coin tossing. The outcome of each coin-tossing experiment (the experiment goes on for all time) can be described as a doubly-infinite sequence of heads \(H\) and tails \(T\). The state space is the collection of these sequences. Each subset is assigned a probability. For example, the set of all sequences that are \(H\) at time 3 and \(T\) at time 5 gets probability 1/4. The passage of time shifts each sequence to the left (what used to be time 1 is now time 0). (This kind of construction works for all stochastic processes; independence and discrete time are not needed.)

The above transformation is called the Bernoulli shift \(B(1/2,1/2)\). If, instead of flipping a coin, one spins a roulette wheel with three slots of probability \(p_1, p_2, p_3\), one would get the Bernoulli shift \(B(p_1, p_2, p_3)\).
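The shift construction above is easy to simulate. The following Python sketch (an illustration of my own; the function names are not from the article) represents a finite window of a coin-tossing path, applies the left shift, and checks by Monte Carlo that the cylinder set "\(H\) at time 3 and \(T\) at time 5" has probability about 1/4 for the fair coin.

```python
import random

def sample_path(n, outcomes=("H", "T"), weights=(1, 1)):
    """Draw a finite window (times 0..n-1) of an i.i.d. coin-tossing path."""
    return [random.choices(outcomes, weights=weights)[0] for _ in range(n)]

def shift(path):
    """One step of the left shift: what used to be time 1 is now time 0."""
    return path[1:]

random.seed(0)
trials = 100_000
# Estimate the probability of the cylinder set {H at time 3, T at time 5}.
hits = sum(1 for _ in range(trials)
           if (p := sample_path(6))[3] == "H" and p[5] == "T")
print(hits / trials)  # close to 1/4 for the fair coin
```

Independence makes the cylinder probability a product, \(\tfrac12 \cdot \tfrac12 = \tfrac14\); the same construction with other weights yields \(B(p_1, \ldots, p_n)\).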

Bernoulli shifts play a central role in ergodic theory, but it was not known until 1958 whether or not all Bernoulli shifts are isomorphic. A.N. Kolmogorov and Ya.G. Sinai solved this problem by introducing a new invariant for measure-preserving transformations: the entropy, which they took from Shannon's theory of information (*cf.* also Entropy of a measurable decomposition; Shannon sampling theorem). They showed that the entropy of \(B(p_1,\ldots,p_n)\) is

\[-\sum_{i=1}^n p_i \log p_i\]

thus proving that not all Bernoulli shifts are isomorphic.
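Numerically, the entropy invariant separates Bernoulli shifts at once. This small Python helper (my own illustration, not part of the article) computes \(-\sum_i p_i \log p_i\) for a probability vector:

```python
import math

def bernoulli_entropy(probs):
    """Kolmogorov-Sinai entropy of the Bernoulli shift B(p_1, ..., p_n):
    -sum_i p_i log p_i (natural logarithm; the base is a convention)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# B(1/2, 1/2) has entropy log 2, while B(1/3, 1/3, 1/3) has entropy log 3,
# so the two shifts cannot be isomorphic.
print(bernoulli_entropy([1/2, 1/2]))       # log 2, about 0.693
print(bernoulli_entropy([1/3, 1/3, 1/3]))  # log 3, about 1.099
```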

The simplest case of the Ornstein isomorphism theorem (1970), [a3], states that two Bernoulli shifts of the same entropy are isomorphic.

A deeper version says that all the Bernoulli shifts are strung together in a unique flow: There is a flow \(B_t\) such that \(B_1\) is isomorphic to the Bernoulli shift \(B(1/2, 1/2)\), and for any \(t_0\), \(B_{t_0}\) is also a Bernoulli shift. (Here, \(B_{t_0}\) means that one samples the flow every \(t_0\) units of time.) In fact, one obtains all Bernoulli shifts (more precisely, all those of finite entropy) by varying \(t_0\). (There is also a unique Bernoulli flow of infinite entropy.) \(B_t\) is unique up to a constant scaling of the time parameter (*i.e.*, if \(\widetilde{B}_t\) is another flow such that for some \(t_0\), \(\widetilde{B}_{t_0}\) is a Bernoulli shift, then there is a constant \(c\) such that \(B_{ct}\) is isomorphic to \(\widetilde{B}_t\)).
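Why varying \(t_0\) realizes every finite entropy can be seen from Abramov's scaling formula, a standard fact of ergodic theory that the article leaves implicit (sketch only, for \(t_0 > 0\)):

```latex
% Abramov's formula: the entropy of the time-t_0 map of a measurable
% flow scales linearly in t_0,
\[
  h\bigl(B_{t_0}\bigr) = t_0 \, h\bigl(B_1\bigr), \qquad t_0 > 0 .
\]
% Since h(B_1) > 0, the map t_0 -> t_0 h(B_1) takes every value in
% (0, \infty), so sampling the Bernoulli flow at different t_0 produces
% Bernoulli shifts of every finite positive entropy.
```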

The thrust of this result is that at the level of abstraction of isomorphism there is a unique flow that is the most random possible.

The above claim is clarified by the following part of the isomorphism theorem: Any flow \(f_t\) that is not completely predictable has \(B_{ct}\) as a factor for some \(c>0\) (the numbers \(c\) involved are those for which the entropy of \(B_{ct}\) is not greater than the entropy of \(f_t\)). The only factors of \(B_t\) are \(B_{ct}\) with \(0<c\le 1\).

Here, completely predictable means that all observations on the system are predictable in the sense that if one makes the observation at regular intervals of time (*e.g.*, every hour on the hour), then the past determines the future. (An observation is simply a measurable function \(P\) on the state space; one can think of repeated observations as a stationary process.) It is not hard to prove that "completely predictable" is the same as zero entropy.

Also, "\(B_t\) is a factor of \(f_t\)" means that there is a many-to-one mapping \(\phi\) from the state space of \(f_t\) to that of \(B_t\) so that a set and its inverse image evolve in the same way (\(\phi^{-1}(B_t(E))=f_t(\phi^{-1}(E))\); this is the same as saying that one gets \(B_t\) by restricting \(f_t\) to an invariant sub-sigma-algebra or by lumping points).

Thus, \(B_t\) is, in some sense, responsible for all randomness in flows.

The most important part of the isomorphism theorem is a criterion that allows one to show that specific systems are isomorphic to \(B_t\).

Using results of Sinai, one can show that billiards with a convex obstacle (described earlier) are isomorphic to \(B_t\).

If one perturbs the obstacle, one gets an isomorphic system; if the perturbation is small, then the isomorphism mapping of the state space (a subset of 3-dimensional space) can be shown to be close to the identity. This is an example of "statistical stability", another consequence of the isomorphism theorem, which provides a statistical version of structural stability. Note that the billiard system is very sensitive to initial conditions and the perturbation completely changes individual orbits. This result shows, however, that the collection of all orbits is hardly changed.

The geodesic flow on a manifold of negative curvature is another example: results of D. Anosov allow one to check the criterion, so this flow too is isomorphic to \(B_t\). Here too one obtains stability under small perturbations of the manifold's Riemannian structure.

Results of Ya.B. Pesin allow one to check the criterion for any ergodic measure-preserving flow of positive entropy (*i.e.*, not completely predictable) on a 3-dimensional manifold. Thus, any such flow is isomorphic to \(B_t\) or to the product of \(B_t\) and a rotation. Stability is not known.

#### References

[a1] D.S. Ornstein, "Ergodic theory, randomness, and dynamical systems", Yale Univ. Press (1974)

[a2] Ya.G. Sinai (ed.), "Dynamical systems II", Springer (1989)

[a3] V.I. Arnold, A. Avez, "Ergodic problems of classical mechanics", Benjamin (1968)

[a4] M. Smorodinsky, "Ergodic theory. Entropy", *Lecture Notes Math.*, **214**, Springer (1970)

[a5] P. Shields, "The theory of Bernoulli shifts", Univ. Chicago Press (1973)

[a6] D.S. Ornstein, B. Weiss, "Statistical properties of chaotic systems", *Bull. Amer. Math. Soc.*, **24**: 1 (1991)

[a7] R. Mañé, "Ergodic theory and differentiable dynamics", Springer (1987)

[a8] D. Rudolph, "Fundamentals of measurable dynamics — Ergodic theory on Lebesgue spaces", Oxford Univ. Press (to appear)

**How to Cite This Entry:**

Ornstein isomorphism theorem. *Encyclopedia of Mathematics.* URL: http://encyclopediaofmath.org/index.php?title=Ornstein_isomorphism_theorem&oldid=27176