# Strong law of large numbers


2010 Mathematics Subject Classification: Primary: 60F15 [MSN][ZBL]

A form of the law of large numbers which states that, under certain conditions, the arithmetic means of a sequence of random variables tend to certain constant values with probability one. More exactly, let

$$\tag{1 } X _ {1} , X _ {2} \dots$$

be a sequence of random variables and let $S _ {n} = X _ {1} + \dots + X _ {n}$. One says that the sequence (1) satisfies the strong law of large numbers if there exists a sequence of constants $A _ {n}$ such that the probability of the relationship

$$\tag{2 } \frac{S _ {n} }{n} - A _ {n} \rightarrow 0,\ n \rightarrow \infty ,$$

is one. Another, equivalent, formulation is as follows: The sequence (1) satisfies the strong law of large numbers if, for any $\epsilon > 0$, the probability that all the inequalities

$$\tag{3 } \left | \frac{S _ {n} }{n} - A _ {n} \right | \leq \epsilon ,\ \left | \frac{S _ {n+1} }{n+1} - A _ {n+1} \right | \leq \epsilon ,\dots$$

hold simultaneously tends to one as $n \rightarrow \infty$. Thus, one considers the behaviour of the sequence of sums as a whole, whereas the ordinary law of large numbers concerns individual sums only. If the sequence (1) satisfies the strong law of large numbers, it also satisfies the ordinary law of large numbers for the same $A _ {n}$, i.e.

$$\tag{4 } {\mathsf P} \left \{ \left | \frac{S _ {n} }{n} - A _ {n} \right | \leq \epsilon \right \} \rightarrow 1$$

for any $\epsilon > 0$ if $n \rightarrow \infty$. The converse need not be true. For example, if the random variables (1) are independent and, for $n \geq 16$, take the two values $\pm \sqrt {n/ \mathop{\rm ln} \mathop{\rm ln} \mathop{\rm ln} n }$ with probability 1/2 each, then they satisfy the law of large numbers (4) with $A _ {n} = 0$, but the strong law of large numbers (2) is not satisfied for any choice of $A _ {n}$. The existence of such examples is not at all obvious at first sight, because, although convergence in probability is in general weaker than convergence with probability one, the two types of convergence are equivalent, for example, for series of independent random variables.
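The distinction between (2) and (4) can be made concrete by a simulation sketch (an illustration, not part of the article): for fair coin tosses with $A _ {n} = 1/2$, the ordinary law concerns $| S _ {n} /n - 1/2 |$ for a single $n$, while the strong law concerns the whole tail $\sup _ {m \geq n} | S _ {m} /m - 1/2 |$.

```python
# Illustration (not from the article): fair coin tosses, A_n = 1/2.
import random

random.seed(0)
N = 100_000
flips = [random.randint(0, 1) for _ in range(N)]

# Running averages S_m / m -- the quantity in the ordinary law (4).
avg = []
s = 0
for m, x in enumerate(flips, start=1):
    s += x
    avg.append(s / m)

# Tail suprema sup_{m >= n} |S_m/m - 1/2| -- the quantity behind (3),
# computed backwards over the simulated horizon.
tail_sup = [0.0] * N
worst = 0.0
for m in range(N - 1, -1, -1):
    worst = max(worst, abs(avg[m] - 0.5))
    tail_sup[m] = worst

print(abs(avg[-1] - 0.5), tail_sup[N // 2])
```

Both quantities become small for large $n$; the tail supremum is non-increasing by construction, so once it drops below $\epsilon$ all the inequalities in (3) hold simultaneously.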

The strong law of large numbers was first formulated and proved by E. Borel [B] for the Bernoulli scheme in its number-theoretic interpretation; cf. Borel strong law of large numbers. Special cases of the Bernoulli scheme arise in the expansion of a real number $\omega$, chosen at random (with uniform distribution) from the interval $( 0, 1)$, in an infinite fraction to an arbitrary base (see Bernoulli trials). Thus, in the binary expansion

$$\omega = \sum _ {n=1} ^ \infty \frac{X _ {n} ( \omega ) }{2 ^ {n} }$$

the successive digits $X _ {n} ( \omega )$ take the two values 0 and 1, with probability 1/2 each, and are independent random variables. The sum $S _ {n} ( \omega ) = \sum _ {k=1} ^ {n} X _ {k} ( \omega )$ is equal to the number of ones among the first $n$ digits of the binary expansion, and $S _ {n} ( \omega )/n$ is their proportion. At the same time $S _ {n}$ may be considered as the number of "successful" trials in the Bernoulli scheme with probability of "success" (appearance of "1") equal to 1/2. Borel showed that the proportion of "ones", $S _ {n} ( \omega )/n$, tends to 1/2 for almost-all $\omega$ in $( 0, 1)$. In a similar manner, in the expansion of $\omega$ to the base 10, "success" may be taken to be the appearance of any one of the digits $0 \dots 9$ (e.g. the digit 3). One then obtains a Bernoulli scheme with probability of success 1/10, and the frequency of appearance of the selected digit among the first $n$ digits of the decimal expansion tends to 1/10 for almost-all $\omega$ in $( 0, 1)$. It was also noted by Borel that the frequency of appearance of any given group of $r$ digits tends to $1/ {10 ^ {r} }$ for almost-all $\omega$ (cf. Normal number).
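Borel's example can be sketched numerically (an illustration, not part of the original argument; the digit count is limited here by floating-point precision):

```python
# Illustration of Borel's example: for a uniformly random omega in (0, 1),
# the proportion of ones among the first n binary digits tends to 1/2.
import random

random.seed(1)
omega = random.random()          # uniform on (0, 1)

def binary_digits(x, n):
    """First n digits X_1, ..., X_n of the binary expansion of x."""
    digits = []
    for _ in range(n):
        x *= 2
        d = int(x)
        digits.append(d)
        x -= d
    return digits

n = 50                            # a float carries ~53 significant bits
digits = binary_digits(omega, n)
proportion = sum(digits) / n
print(proportion)                 # close to 1/2 for large n
```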

F. Cantelli [C] stated sufficient conditions for the strong law of large numbers for independent random variables $X _ {n}$ in terms of the second and fourth central moments of the summands (these conditions are fulfilled for the Bernoulli scheme). Putting

$$B _ {n,k} = \ \sum _ {j=1} ^ {n} {\mathsf E} | X _ {j} - {\mathsf E} X _ {j} | ^ {k} ,$$

the Cantelli condition can be written in the form

$$\sum _ {n=1} ^ \infty \frac{B _ {n,4} + B _ {n,2} ^ {2} }{n ^ {4} } < \infty .$$

The proofs of Cantelli and Borel are based on the following reasoning. Suppose that, for some sequence of positive numbers $\epsilon _ {n} \rightarrow 0$ (as $n \rightarrow \infty$),

$$\tag{5 } \sum _ {n=1} ^ \infty {\mathsf P} \left \{ \left | \frac{S _ {n} }{n} - A _ {n} \right | > \epsilon _ {n} \right \} < \infty .$$

Then, according to the Borel–Cantelli lemma, only a finite number of events under the probability sign in (5) are realized with probability one. Accordingly, for all sufficiently large $n$,

$$\left | \frac{S _ {n} }{n} - A _ {n} \right | \leq \epsilon _ {n} ,$$

with probability one, i.e. (3) is valid. Borel estimated the terms of the series (5) by means of the de Moivre–Laplace theorem, while Cantelli used Chebyshev's inequality with fourth moments.
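For the Bernoulli scheme the estimate behind Cantelli's argument can be checked numerically. The following sketch (an illustration under the stated assumptions, not part of the original proofs) computes the exact fourth central moment of $S _ {n}$ for success probability 1/2 and verifies that the resulting Chebyshev-type tail bounds form a convergent series, as required for (5):

```python
# Illustration: fourth-moment tail bounds for the Bernoulli scheme, p = 1/2.

def fourth_central_moment(n):
    # For independent X_j with E X_j = 1/2:
    # E(S_n - n/2)^4 = n * E(X - 1/2)^4 + 3 n (n - 1) (Var X)^2,
    # with E(X - 1/2)^4 = 1/16 and Var X = 1/4.
    return n / 16 + 3 * n * (n - 1) / 16

def tail_bound(n, eps):
    # Chebyshev's inequality with fourth moments:
    # P(|S_n/n - 1/2| > eps) <= E(S_n - n/2)^4 / (eps^4 n^4).
    return fourth_central_moment(n) / (eps ** 4 * n ** 4)

eps = 0.1
partial = sum(tail_bound(n, eps) for n in range(1, 100_001))
# The bound behaves like 3/(16 eps^4 n^2), so the series converges;
# the partial sums stabilize at a finite value.
print(partial)
```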

A further extension of the conditions for the applicability of the strong law of large numbers was achieved by A.Ya. Khinchin and A.N. Kolmogorov. Khinchin [Kh], [Kh2] introduced the term "strong law of large numbers" and proved a sufficient condition for it (applicable also to dependent variables) with $A _ {n} = {\mathsf E} ( S _ {n} /n)$. Denoting by $r _ {i,k}$ the correlation coefficient between $X _ {i}$ and $X _ {k}$ and putting

$$c _ {n} = \sup _ {| i - k| = n } \ | r _ {i,k} |,\ \ C _ {n} = \sum _ {k=0} ^ {n} c _ {k} ,$$

one can write Khinchin's condition as follows: $B _ {n,2} C _ {n} = O ( n ^ {2 - \delta } )$ for some $\delta > 0$. In fact, the statement which follows from Khinchin's proof is much stronger.

In the case of independent summands the best-known conditions for the applicability of the strong law of large numbers are those established by Kolmogorov: a sufficient condition (1930) for variables with finite variances, and a necessary and sufficient condition (1933) for identically-distributed variables, namely the existence of the mathematical expectation of the variables $X _ {i}$. Kolmogorov's theorem for random variables (1) with finite variances states that the condition

$$\tag{6 } \sum _ {n=1} ^ \infty \frac{ {\mathsf D} X _ {n} }{n ^ {2} } < \infty$$

implies the applicability of the strong law of large numbers with $A _ {n} = {\mathsf E} ( S _ {n} / n)$ for the sequence (1). In terms of variances condition (6) is best in the sense that for any sequence of positive numbers $b _ {n}$ such that the series $\sum b _ {n} / n ^ {2}$ diverges it is possible to construct a sequence of independent random variables $X _ {n}$ with ${\mathsf D} X _ {n} = b _ {n}$ which does not satisfy the strong law of large numbers. The scope of application of condition (6) (and also of other conditions of the strong law of large numbers for independent variables) may be extended as follows. Let $m _ {n}$ be the median of $X _ {n}$. The convergence of the series

$$\sum {\mathsf P} \{ | X _ {n} - m _ {n} | > n \}$$

is necessary for the strong law of large numbers. Indeed, it follows from the Borel–Cantelli lemma that, with probability one, $| X _ {n} - m _ {n} | < n$ from some number $n$ onwards. Accordingly, in studying conditions for the applicability of the strong law of large numbers, one may from the outset restrict attention to random variables satisfying this last condition.
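Kolmogorov's condition (6) can be illustrated with a concrete sequence (the choice of independent $X _ {n} \sim N( 0, n ^ {1/2} )$ is an assumption made here for illustration): then ${\mathsf D} X _ {n} / n ^ {2} = n ^ {-3/2}$ is summable, so the strong law applies with $A _ {n} = 0$.

```python
# Illustration of Kolmogorov's sufficient condition (6) for independent
# X_n ~ N(0, n**0.5) -- a choice made here for concreteness.
import random

# Condition (6): the partial sums of D X_n / n^2 = n^{-3/2} stabilize.
partial = sum(n ** 0.5 / n ** 2 for n in range(1, 100_001))

# Simulation of one sample path of S_n / n.
random.seed(2)
N = 100_000
s = 0.0
for n in range(1, N + 1):
    s += random.gauss(0.0, n ** 0.25)   # standard deviation = sqrt(D X_n)
avg = s / N
print(partial, abs(avg))                # partial sums converge; average near 0
```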

In the proof given by Khinchin and Kolmogorov the convergence of the series (5) is replaced by the convergence of the series

$$\sum _ { k } {\mathsf P} \left \{ \max _ {n _ {k} < n \leq n _ {k+1} } \left | \frac{S _ {n} }{n} - A _ {n} \right | > \epsilon _ {k} \right \} ,$$

where $n _ {k} = 2 ^ {k}$. In this proof, Khinchin actually employed a number of ideas from the theory of orthogonal series of functions, while Kolmogorov employed his inequality for the maxima of sums of random variables.

It is possible to state a necessary and sufficient condition for the strong law of large numbers for independent random variables. Putting

$$Z _ {k} = 2 ^ {-k} \sum _ {(k)} X _ {n} ,$$

where the sum $\sum _ {(k)}$ extends over the values of $n$ for which $2 ^ {k} < n \leq 2 ^ {k+1}$, this condition may be written as follows: For any $\epsilon > 0$,

$$\tag{7 } \sum _ { k } {\mathsf P} \{ | Z _ {k} - m Z _ {k} | > \epsilon \} < \infty ,$$

where $mZ _ {k}$ is the median of $Z _ {k}$ [Pr]. If additional restrictions are imposed, (7) will yield conditions expressed in terms of the characteristics of the individual terms. For instance, if $X _ {n} = O ( {n / \mathop{\rm ln} \mathop{\rm ln} n } )$ or if all $X _ {n}$ are normally distributed, condition (7) is equivalent to the following one: For any $\epsilon > 0$,

$$\tag{8 } \sum _ { k } e ^ {- \epsilon / {\mathsf D} Z _ {k} } < \infty .$$

Here, since $X _ {1} , X _ {2} \dots$ are independent,

$${\mathsf D} Z _ {k} = 2 ^ {-2k} \sum _ {(k)} {\mathsf D} X _ {n} .$$
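As a worked check (an illustration; the choice of independent standard normal $X _ {n}$ is an assumption), if ${\mathsf D} X _ {n} = 1$ then ${\mathsf D} Z _ {k} = 2 ^ {-2k} \cdot 2 ^ {k} = 2 ^ {-k}$, and the series (8) becomes $\sum _ {k} e ^ {- \epsilon 2 ^ {k} }$, which converges for every $\epsilon > 0$, consistent with the strong law for this sequence:

```python
# Illustration: evaluating the series (8) for D X_n = 1 (standard normals).
import math

eps = 0.01
total = 0.0
for k in range(1, 60):
    dz = 2.0 ** (-k)              # D Z_k = 2^{-2k} * 2^k = 2^{-k}
    total += math.exp(-eps / dz)  # term of the series (8); underflows to 0
print(total)                      # finite: condition (8) holds
```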

Conditions of applicability of the strong law of large numbers to Markov chains and processes, and to stationary processes, are known [D]. Thus, Khinchin's method, which is applicable to a sequence $X _ {n}$ that is stationary in the wide sense, with correlation function $R( n)$, leads to the following theorem: If the series $\sum ( \mathop{\rm ln} ^ {2} n) | R( n) | / n$ is convergent, then $( X _ {0} + \dots + X _ {n} ) / ( n + 1) \rightarrow {\mathsf E} X _ {0}$ with probability one. In applications to stationary (in the narrow sense) random processes the name "strong law of large numbers" is sometimes given to the following statement: The limit

$$Y = \lim\limits _ {n \rightarrow \infty } \ \frac{X _ {0} + \dots + X _ {n} }{n+1}$$

exists with probability one. The random variable $Y$ is equal to the conditional mathematical expectation of $X _ {0}$ with respect to the $\sigma$- algebra of sets that are shift-invariant; the variable $Y$ is constant and equals ${\mathsf E} X _ {0}$ with probability one only for metrically-transitive processes. The strong law of large numbers in this form is identical with the Birkhoff ergodic theorem.
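The stationary case can be sketched by a simulation (the AR(1) model and its parameters are assumptions chosen for illustration): the process $X _ {n+1} = a X _ {n} + \xi _ {n}$ with $| a | < 1$ and independent normal innovations is stationary and metrically transitive, so its time averages converge with probability one to ${\mathsf E} X _ {0} = 0$.

```python
# Illustration: time averages of a metrically transitive AR(1) process.
import random

random.seed(3)
a = 0.5          # AR coefficient; |a| < 1 gives stationarity and mixing
x = 0.0          # started at 0; the effect of the initial condition vanishes
s = 0.0
N = 200_000
for _ in range(N):
    x = a * x + random.gauss(0.0, 1.0)   # X_{n+1} = a X_n + xi_n
    s += x
avg = s / N
print(abs(avg))  # near E X_0 = 0
```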

There exist variations of the strong law of large numbers for random vectors in normed linear spaces [G]. The chronologically earliest example of such a variation is the Glivenko–Cantelli theorem on the convergence of the empirical distribution function to the theoretical one.

The deviations of $S _ {n} / n$ from $A _ {n}$ are described by the law of the iterated logarithm.

#### References

[B] E. Borel, "Les probabilités dénombrables et leurs applications arithmétiques" Rend. Circ. Mat. Palermo (2), 27 (1909) pp. 247–271

[C] F.P. Cantelli, "Sulla probabilità come limite della frequenza" Atti Accad. Naz. Lincei, 26 : 1 (1917) pp. 39–45 Zbl 46.0779.02

[Kh] A.Ya. Khintchine, "Fundamental laws of probability theory", Moscow (1927) (In Russian)

[Kh2] A.Ya. Khintchine, "Sur la loi forte des grands nombres" C.R. Acad. Sci. Paris Sér. I Math., 186 (1928) pp. 285–287 Zbl 54.0544.03

[Ko] A.N. Kolmogorov, "Sur la loi forte des grands nombres" C.R. Acad. Sci. Paris Sér. I Math., 191 (1930) pp. 910–912

[Pr] Yu.V. Prokhorov, "On the strong law of large numbers" Izv. Akad. Nauk SSSR Ser. Mat., 14 (1950) pp. 523–536 (In Russian) MR0121858 Zbl 0089.13903

[D] J.L. Doob, "Stochastic processes", Chapman & Hall (1953) MR1570654 MR0058896 Zbl 0053.26802

[Pe] V.V. Petrov, "Sums of independent random variables", Springer (1975) (Translated from Russian) MR0388499 Zbl 0322.60043 Zbl 0322.60042

[G] U. Grenander, "Probabilities on algebraic structures", Wiley (1963) MR0206994 Zbl 0131.34804
How to Cite This Entry:
Strong law of large numbers. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Strong_law_of_large_numbers&oldid=48876
This article was adapted from an original article by Yu.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article