Tail triviality
Latest revision as of 16:56, 1 July 2020
Let $( F , \mathcal{B} )$ be a measurable space. A sequence of random variables $X = ( X _ { n } ) _ { n \in \mathbf{Z} }$ taking values in $F$ is described by the triple $( F ^ { \mathbf{Z} } , \mathcal{B} ^ { \mathbf{Z} } , \mathsf{P} )$, where $\mathsf{P}$ is a probability measure on $( F ^ { \mathbf{Z} } , \mathcal{B} ^ { \mathbf{Z} } )$ called the distribution of $X$. The sequence $X$ is said to be independent if $\mathsf{P}$ is a product measure, i.e. $\mathsf{P} = \prod _ { n \in \mathbf{Z} } \mu _ { n }$ for probability measures $\mu _ { n }$ on $( F , \mathcal{B} )$.
The right and left tail-sigma-fields of $X$ are defined as
\begin{equation*} \mathcal{T} ^ { + } = \bigcap _ { N \geq 0 } \sigma ( X _ { n } : n \geq N ) \end{equation*}
\begin{equation*} \mathcal{T} ^ { - } = \bigcap _ { N \geq 0 } \sigma ( X _ { n } : n \leq - N ) \end{equation*}
and the two-sided tail-sigma-field is defined as
\begin{equation*} \mathcal{T} = \bigcap _ { N \geq 0 } \sigma ( X _ { n } : | n | \geq N ). \end{equation*}
(Here, $\sigma ( Y )$ denotes the smallest sigma-field (cf. Borel field of sets) with respect to which $Y$ is measurable.) The Kolmogorov zero-one law [a1] states that, in the independent case, $\mathcal{T} ^ { + }$, $\mathcal{T} ^ { - }$ and $\mathcal{T}$ are trivial, i.e. all their elements have probability $0$ or $1$ under $\mathsf{P}$. Without independence this need no longer be true: tail triviality holds only when $X$ has sufficiently weak dependencies. Indeed, when the index set $\mathbf{Z}$ is viewed as time, tail triviality means that the present is asymptotically independent of the far future and the far past. There exist examples in which $\mathcal{T} ^ { + }$ and $\mathcal{T} ^ { - }$ are trivial but $\mathcal{T}$ is not [a3]. Intuitively, such examples exhibit "dependencies across infinity".
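As a concrete illustration (not part of the original article): for real-valued $X _ { n }$, the event $A = \{ \operatorname { lim } _ { n \rightarrow \infty } \frac { 1 } { n } \sum _ { k = 1 } ^ { n } X _ { k } \text { exists } \}$ lies in $\mathcal{T} ^ { + }$, since altering finitely many coordinates does not change membership in $A$. The following numerical sketch (function name and parameters are ours, and the event is necessarily truncated to a finite surrogate) estimates the probability of such a tail-determined event for i.i.d. Bernoulli($1/2$) variables; by the strong law of large numbers the untruncated event has probability $1$, in accordance with the zero-one law.

```python
import numpy as np

rng = np.random.default_rng(0)

def tail_event_holds(n_total=20_000, n_skip=10_000, p=0.5, tol=0.02):
    """Finite surrogate of the tail event
    {the running mean of X_N, X_{N+1}, ... settles near p}:
    we ask whether the mean over the last n_total - n_skip terms
    lies within tol of p, ignoring the first n_skip coordinates."""
    x = rng.random(n_total) < p       # one independent Bernoulli(p) sequence
    return abs(x[n_skip:].mean() - p) < tol

# Empirical frequency of the event over many independent sequences.
# The strong law of large numbers makes the untruncated event a
# probability-1 tail event, so the frequency should be close to 1.
trials = 200
freq = sum(tail_event_holds() for _ in range(trials)) / trials
print(freq)
```

The truncation parameters are arbitrary; what matters is that the event depends only on coordinates beyond `n_skip`, mimicking membership in $\mathcal{T} ^ { + }$.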
Instead of indexing the random variables by $\mathbf{Z}$ one may also consider a random field $( X _ { n } ) _ { n \in \mathbf{Z} ^ { d } }$, indexed by the $d$-dimensional integer lattice ($d \geq 1$). The definition of $\mathcal{T}$ is the same as before, but now with $| n | = \min _ { 1 \leq i \leq d } | n _ { i } |$, and $\mathcal{T}$ is called the sigma-field at infinity. For independent random fields, $\mathcal{T}$ is again trivial. Without independence, however, the question is considerably more subtle and is related to the phenomenon of phase transition (i.e. non-uniqueness of probability measures having prescribed conditional probabilities on finite sets). Tail triviality holds, for instance, when $\mathsf{P}$ is an extremal Gibbs measure [a2].
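The link with extremality can be made precise (a paraphrase of the result in [a2], stated under the assumptions made there): for a fixed specification $\gamma$, a Gibbs measure $\mathsf{P}$ in the set $\mathcal{G} ( \gamma )$ of Gibbs measures for $\gamma$ is extremal if and only if the sigma-field at infinity is $\mathsf{P}$-trivial,

\begin{equation*} \mathsf{P} \text { extremal in } \mathcal{G} ( \gamma ) \Leftrightarrow \mathsf{P} ( A ) \in \{ 0,1 \} \text { for all } A \in \mathcal{T} . \end{equation*}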
References
[a1] P. Billingsley, "Probability and measure", 2nd ed., Wiley (1986)
[a2] H.-O. Georgii, "Gibbs measures and phase transitions", Studies Math., 9, W. de Gruyter (1988)
[a3] D.S. Ornstein, B. Weiss, "Every transformation is bilaterally deterministic", Israel J. Math., 24 (1975) pp. 154–158
Tail triviality. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Tail_triviality&oldid=50128