Tail triviality

Let $( F , \mathcal{B} )$ be a measurable space. A sequence of random variables $X = ( X _ { n } ) _ { n \in \mathbf{Z} }$ taking values in $F$ is described by the triple $( F ^ { \mathbf{Z} } , \mathcal{B} ^ { \mathbf{Z} } , \mathsf{P} )$, where $\mathsf{P}$ is a probability measure on $( F ^ { \mathbf{Z} } , \mathcal{B} ^ { \mathbf{Z} } )$ called the distribution of $X$. The sequence $X$ is said to be independent if $\mathsf{P}$ is a product measure, i.e. $\mathsf{P} = \prod _ { n \in \mathbf{Z} } \mu _ { n }$ for probability measures $\mu _ { n }$ on $( F , \mathcal{B} )$.
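
For example, independent fair coin tossing corresponds to $F = \{ 0 , 1 \}$, $\mathcal{B}$ the collection of all subsets of $F$, and

\begin{equation*} \mathsf{P} = \prod _ { n \in \mathbf{Z} } \mu _ { n } , \quad \mu _ { n } = \frac { 1 } { 2 } ( \delta _ { 0 } + \delta _ { 1 } ) \text{ for all } n , \end{equation*}

where $\delta _ { x }$ denotes the unit point mass at $x$.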

The right and left tail-sigma-fields of $X$ are defined as

\begin{equation*} \mathcal{T} ^ { + } = \bigcap _ { N \geq 0 } \sigma ( X _ { n } : n \geq N ) \end{equation*}

\begin{equation*} \mathcal{T} ^ { - } = \bigcap _ { N \geq 0 } \sigma ( X _ { n } : n \leq - N ) \end{equation*}

and the two-sided tail-sigma-field is defined as

\begin{equation*} \mathcal{T} = \bigcap _ { N \geq 0 } \sigma ( X _ { n } : | n | \geq N ). \end{equation*}

(Here, $\sigma ( Y )$ denotes the smallest sigma-field (cf. Borel field of sets) with respect to which $Y$ is measurable.) The Kolmogorov zero-one law [a1] states that, in the independent case, $\mathcal{T} ^ { + }$, $\mathcal{T} ^ { - }$ and $\mathcal{T}$ are trivial, i.e. all their elements have probability $0$ or $1$ under $\mathsf{P}$. Without the independence property this need no longer be true: tail triviality only holds when $X$ has sufficiently weak dependencies. In fact, when the index set $\mathbf{Z}$ is viewed as time, tail triviality means that the present is asymptotically independent of the far future and the far past. There exist examples where $\mathcal{T} ^ { + }$ and $\mathcal{T} ^ { - }$ are trivial but $\mathcal{T}$ is not [a3]. Intuitively, in such examples there are "dependencies across infinity".
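
For example, when the $X _ { n }$ are real-valued, the events

\begin{equation*} \Big\{ \sum _ { n \geq 1 } X _ { n } \text{ converges} \Big\} \quad \text{and} \quad \Big\{ \limsup _ { n \rightarrow \infty } \frac { 1 } { n } \sum _ { k = 1 } ^ { n } X _ { k } \leq c \Big\} , \quad c \in \mathbf{R} , \end{equation*}

are unaffected by changing finitely many coordinates and hence belong to $\mathcal{T} ^ { + }$; by the Kolmogorov zero-one law each of them has probability $0$ or $1$ whenever $X$ is independent.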

Instead of indexing the random variables by $\mathbf{Z}$ one may also consider a random field $( X _ { n } ) _ { n \in \mathbf{Z} ^ { d } }$, indexed by the $d$-dimensional integers ($d \geq 1$). The definition of $\mathcal{T}$ is the same as before, but now with $| n | = \operatorname { max } _ { 1 \leq i \leq d } | n _ { i } |$, and $\mathcal{T}$ is called the sigma-field at infinity. For independent random fields, $\mathcal{T}$ is again trivial. Without the independence property, however, the question is considerably more subtle and is related to the phenomenon of phase transition (i.e. non-uniqueness of probability measures having prescribed conditional probabilities on finite sets). Tail triviality holds, for instance, when $\mathsf{P}$ is an extremal Gibbs measure [a2].
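
A standard example of the role of extremality is the Ising model in dimension $d = 2$ at sufficiently low temperature: the plus phase $\mu ^ { + }$ and the minus phase $\mu ^ { - }$ are distinct extremal Gibbs measures, each with trivial $\mathcal{T}$, whereas the mixture $\frac { 1 } { 2 } ( \mu ^ { + } + \mu ^ { - } )$ is a Gibbs measure with the same conditional probabilities for which the tail event

\begin{equation*} \Big\{ \operatorname { lim } _ { N \rightarrow \infty } \frac { 1 } { ( 2 N + 1 ) ^ { 2 } } \sum _ { | n | \leq N } X _ { n } = m ^ { * } \Big\} , \end{equation*}

with $m ^ { * } > 0$ the spontaneous magnetization, has probability $1 / 2$; hence $\mathcal{T}$ is not trivial under this non-extremal Gibbs measure.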

References

[a1] P. Billingsley, "Probability and measure", 2nd ed., Wiley (1986)
[a2] H.-O. Georgii, "Gibbs measures and phase transitions", de Gruyter Studies in Mathematics, 9, W. de Gruyter (1988)
[a3] D.S. Ornstein, B. Weiss, "Every transformation is bilaterally deterministic", Israel J. Math., 24 (1975) pp. 154–158
How to Cite This Entry:
Tail triviality. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Tail_triviality&oldid=50128
This article was adapted from an original article by F. den Hollander (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article