Bernoulli measure

A measure describing the independent repetition of an experiment with two possible outcomes, such as tossing a possibly biased coin. This is the simplest and most basic probabilistic scheme, for which the fundamental theorems of probability theory, such as the weak and strong laws of large numbers, the integral and local limit theorems, and the large deviation principle, were first and most easily proved.

The scheme can be generalized to experiments with more than two outcomes and can be formalized mathematically as follows. Given a countable set $D$, a finite set $S$ with $n$ elements, $n \geq 2$, which can be identified with the set of the first $n$ positive integers if there are no special requirements, and $n$ non-negative numbers $p _ {1} \dots p _ {n}$ with $p _ {1} + \dots + p _ {n} = 1$, one defines the corresponding Bernoulli measure on the space $\Omega = S ^ {D}$ of configurations $\omega = ( \omega _ {i} ) _ {i \in D }$ as the probability measure (finitely or countably additive, in accordance with the setting) ${\mathsf P}$ under which the $\omega _ {i}$' s are independent and identically distributed random variables with ${\mathsf P} ( \omega _ {i} = j ) = p _ {j}$ for $1 \leq j \leq n$. In the countably additive framework, a basic result is the Kolmogorov zero-one law [a9], stating that events in the algebra at infinity (i.e., events that are measurable with respect to the Borel fields generated by the $\omega _ {i}$' s in the complement of every finite subset of $D$) have probability either $0$ or $1$.
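By independence, the Bernoulli measure assigns to a cylinder set, i.e. an event prescribing the outcomes at finitely many sites of $D$, the product of the corresponding $p_j$'s. The following minimal sketch (the function name `cylinder_probability` is illustrative, not from the source) computes such probabilities and checks that the cylinder probabilities over a finite window sum to $1$.

```python
import itertools
import math

def cylinder_probability(p, values):
    """Probability of the cylinder set fixing finitely many coordinates.

    p      -- (p_1, ..., p_n): non-negative weights summing to 1
    values -- the prescribed outcomes (0-based indices into p) at some
              finite collection of sites of D
    """
    prob = 1.0
    for j in values:
        prob *= p[j]  # independence: probabilities multiply
    return prob

# Sanity check: over a window of 4 sites with |S| = 3 outcomes,
# the 3^4 cylinder probabilities sum to 1.
p = (0.2, 0.3, 0.5)
total = sum(cylinder_probability(p, w)
            for w in itertools.product(range(3), repeat=4))
assert math.isclose(total, 1.0)
```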

Percolation theory on Bernoulli measures is a theory that, although in a simpler setting, presents in many respects the same kinds of phenomena as statistical mechanics, with important difficult results and many open problems (see Statistical mechanics, mathematical problems in; [a8], [a3]). In the most common setting, one considers on each site $i$ of a lattice a random variable with values $0$ and $1$; such variables are assumed to be independent and identically distributed, so that one obtains the Bernoulli measures parametrized by the probability $p$ that an $\omega _ {i}$ equals $1$. Many events considered in percolation theory are increasing (or positive) in the sense that their indicator functions are non-decreasing functions of the $\omega _ {i}$' s. L. Russo [a7] has proved a finite version of Kolmogorov's zero-one law; it states that increasing events that, in a suitable sense, depend little on each $\omega _ {i}$ have probability close to $0$ or $1$ for all but a small interval of $p$' s. T.E. Harris [a4] has proved a basic inequality for increasing events on Bernoulli measures: if $A$ and $B$ are increasing events, then ${\mathsf P} ( A \cap B ) \geq {\mathsf P} ( A ) {\mathsf P} ( B )$. An inequality in the opposite sense was proved by J. van den Berg and H. Kesten [a1] for increasing events; it was generalized to arbitrary events by D. Reimer [a6]. It states that the probability that two events happen disjointly (in the sense that one can verify their occurrence by looking at disjoint subsets of the lattice) is less than or equal to the product of their probabilities.
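Harris' inequality can be verified exactly on a small Bernoulli field by enumerating all configurations. The sketch below (the choice of the $3 \times 3$ grid and of the two increasing events is purely illustrative) checks ${\mathsf P}(A \cap B) \geq {\mathsf P}(A) {\mathsf P}(B)$ for the events "top row entirely open" and "left column entirely open".

```python
import itertools

p = 0.6
SITES = 9  # 3x3 grid, sites indexed 0..8 row by row

def weight(config):
    """Bernoulli product weight of a 0/1 configuration."""
    w = 1.0
    for x in config:
        w *= p if x == 1 else 1 - p
    return w

# Increasing events: their indicators are non-decreasing in each coordinate.
def A(config):  # top row entirely open
    return all(config[i] == 1 for i in (0, 1, 2))

def B(config):  # left column entirely open
    return all(config[i] == 1 for i in (0, 3, 6))

pA = pB = pAB = 0.0
for config in itertools.product((0, 1), repeat=SITES):
    w = weight(config)
    if A(config):
        pA += w
    if B(config):
        pB += w
    if A(config) and B(config):
        pAB += w

# Harris' inequality: increasing events are positively correlated.
assert pAB >= pA * pB
```

Here $A$ and $B$ both require the corner site $0$ to be open, so ${\mathsf P}(A \cap B) = p^{5}$, which indeed exceeds ${\mathsf P}(A) {\mathsf P}(B) = p^{6}$.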

The De Finetti theorem on infinite sequences of exchangeable random events [a2] shows that Bernoulli measures are relevant for statistical inference in a wide range of situations. It states that if the distribution of an infinite family of random events is invariant under finite permutations, then it can be expressed as a mixture of Bernoulli measures. The assumption of exchangeability is very natural in many concrete situations.
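The representation in de Finetti's theorem can be illustrated in the simplest case: mixing Bernoulli measures over a prior on the parameter $p$ yields an exchangeable law. Under a uniform prior (an illustrative choice, not from the source), the probability of a fixed $0/1$ word of length $n$ with $k$ ones is $\int_0^1 p^k (1-p)^{n-k} \, dp = k! (n-k)! / (n+1)!$, which depends only on $k$; the sketch below checks this permutation invariance.

```python
import itertools
import math

def word_probability(word):
    """Probability of a fixed 0/1 word under the uniform mixture of
    Bernoulli(p) measures: the Beta integral k! (n-k)! / (n+1)!."""
    n, k = len(word), sum(word)
    return math.factorial(k) * math.factorial(n - k) / math.factorial(n + 1)

# Exchangeability: every permutation of a word has the same probability.
w = (1, 0, 1, 1, 0)
for perm in itertools.permutations(w):
    assert math.isclose(word_probability(perm), word_probability(w))
```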

Let $D = \mathbf Z$ be the set of integer numbers and let $T$ be the left shift operator on $\Omega = S ^ {\mathbf Z}$:

$$( T \omega ) _ {i} = \omega _ {i + 1 } .$$

The pair consisting of the Bernoulli measure and the shift operator is called the Bernoulli shift. One says that an invertible measurable mapping $\phi : \Omega \rightarrow {\Omega ^ \prime }$ between two Bernoulli shifts, with probability measures ${\mathsf P}$ and ${\mathsf P} ^ \prime$ and shift operators $T$ and $T ^ \prime$, is an isomorphism between Bernoulli shifts if ${\mathsf P} ^ \prime \circ \phi = {\mathsf P}$ and $T ^ \prime \circ \phi = \phi \circ T$. The famous Ornstein theorem [a5], which has been generalized in many ways, states that two Bernoulli shifts are isomorphic if and only if their Kolmogorov–Sinai entropies are equal. For a Bernoulli shift the Kolmogorov–Sinai entropy is given by $- \sum _ {i = 1 } ^ { n } p _ {i} { \mathop{\rm log} } ( p _ {i} )$, with the convention that $x { \mathop{\rm log} } ( x ) = 0$ for $x = 0$.
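The entropy criterion is easy to evaluate. The sketch below computes the Kolmogorov–Sinai entropy of a Bernoulli shift and checks it on a classical pair of weight vectors whose entropies coincide (both equal $2 \log 2$), so that by Ornstein's theorem the corresponding shifts are isomorphic.

```python
import math

def ks_entropy(p):
    """Kolmogorov-Sinai entropy of the Bernoulli shift with weights p,
    with the convention 0 log 0 = 0 (terms with p_i = 0 are skipped)."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Classical example: both shifts have entropy 2 log 2, hence are isomorphic.
h1 = ks_entropy((1/4, 1/4, 1/4, 1/4))
h2 = ks_entropy((1/2, 1/8, 1/8, 1/8, 1/8))
assert math.isclose(h1, h2)
assert math.isclose(h1, 2 * math.log(2))
```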

References

[a1] J. van den Berg, H. Kesten, "Inequalities with applications to percolation and reliability theory" J. Appl. Probab., 22 (1985) pp. 556
[a2] B. de Finetti, "La prévision: ses lois logiques, ses sources subjectives" Ann. Inst. Poincaré, VII (1937) pp. 1–68
[a3] G. Grimmett, "Percolation", Springer (1989)
[a4] T.E. Harris, "A lower bound for the critical probability in certain percolation processes" Proc. Cambridge Philos. Soc., 56 (1960) pp. 13
[a5] D. Ornstein, "Ergodic theory, randomness, and dynamical systems", Yale Univ. Press (1974)
[a6] D. Reimer, "Butterflies" Preprint (1994)
[a7] L. Russo, "An approximate zero-one law" Z. Wahrscheinlichkeitsth. verw. Gebiete, 61 (1982) pp. 129–139
[a8] H. Kesten, "Percolation theory for mathematicians", Progress in Probab. and Stat., 2, Birkhäuser (1983)
[a9] A.N. Kolmogorov, "Foundations of the theory of probability", Chelsea (1950) (Translated from the German)
How to Cite This Entry:
Bernoulli measure. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Bernoulli_measure&oldid=46018
This article was adapted from an original article by M. Campanino (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.