Cramér theorem


2020 Mathematics Subject Classification: Primary: 60F10 [MSN][ZBL]

An integral limit theorem for the probability of large deviations of sums of independent random variables. Let $ X_1, X_2, \dots $ be a sequence of independent random variables with the same non-degenerate distribution function $ F $, such that $ {\mathsf E} X_1 = 0 $ and such that the moment-generating function $ {\mathsf E} e^{t X_1} $ is finite in some interval $ | t | < H $ (this last condition is known as the Cramér condition).
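
The Cramér condition holds, for example, for a centred exponential variable: if $ X_1 = Y_1 - 1 $ with $ Y_1 $ exponentially distributed with mean $ 1 $, then

$$ {\mathsf E} e^{t X_1} = \frac{e^{-t}}{1 - t} , \qquad - \infty < t < 1 , $$

so any $ 0 < H \leq 1 $ will do. The condition fails for distributions whose tails decay only polynomially, since then $ {\mathsf E} e^{t X_1} = \infty $ for every $ t > 0 $.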

Let

$$ {\mathsf E} X_1^2 = \sigma^2 , \qquad F_n(x) = {\mathsf P} \left( \frac{1}{\sigma n^{1/2}} \sum_{j=1}^{n} X_j < x \right) . $$

If $ x > 1 $ and $ x = o( \sqrt{n} ) $ as $ n \rightarrow \infty $, then

$$ \frac{1 - F_n(x)}{1 - \Phi(x)} = \exp \left\{ \frac{x^3}{\sqrt{n}} \, \lambda \left( \frac{x}{\sqrt{n}} \right) \right\} \left[ 1 + O \left( \frac{x}{\sqrt{n}} \right) \right] , $$

$$ \frac{F_n(-x)}{\Phi(-x)} = \exp \left\{ - \frac{x^3}{\sqrt{n}} \, \lambda \left( - \frac{x}{\sqrt{n}} \right) \right\} \left[ 1 + O \left( \frac{x}{\sqrt{n}} \right) \right] . $$

Here $ \Phi(x) $ is the standard normal $ (0, 1) $ distribution function and $ \lambda(t) = \sum_{k=0}^{\infty} c_k t^k $ is the so-called Cramér series, whose coefficients depend only on the moments of the random variable $ X_1 $; this series converges for all sufficiently small $ t $. The original result, obtained by H. Cramér in 1938, was in fact somewhat weaker than the statement just given.
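
A minimal numerical sketch of the first relation can be based on the centred exponential example above: the sum $ Y_1 + \dots + Y_n $ of independent mean-one exponential variables has a gamma distribution, so $ 1 - F_n(x) $ can be evaluated exactly. The sketch assumes NumPy and SciPy, and it takes $ c_0 = \mu_3 / ( 6 \sigma^3 ) = 1/3 $ for the leading coefficient of the Cramér series; this value comes from the commonly quoted formula for $ c_0 $ and is an assumption of the sketch, not a statement of this article.

import numpy as np
from scipy.stats import gamma, norm

# Centred exponential example: X_j = Y_j - 1 with Y_j ~ Exp(1),
# so E X_1 = 0, sigma^2 = 1 and the third central moment mu3 = 2.
n = 400
mu3, sigma = 2.0, 1.0
c0 = mu3 / (6 * sigma**3)   # assumed leading coefficient of the Cramér series

for x in (1.0, 2.0, 3.0, 4.0):
    # 1 - F_n(x) = P(Y_1 + ... + Y_n >= n + x*sqrt(n)); the sum is Gamma(n, 1).
    tail_exact = gamma.sf(n + x * np.sqrt(n), a=n)
    tail_normal = norm.sf(x)                      # 1 - Phi(x)
    ratio = tail_exact / tail_normal
    leading = np.exp(c0 * x**3 / np.sqrt(n))      # exp{ c0 x^3 / sqrt(n) }
    print(f"x = {x:.1f}: ratio = {ratio:.4f}, leading factor = {leading:.4f}")

For moderate $ x $ the printed ratio should be close to the leading factor $ \exp \{ x^3 / ( 3 \sqrt{n} ) \} $, with the agreement deteriorating as $ x / \sqrt{n} $ grows, consistent with the error term $ O( x / \sqrt{n} ) $ and the higher terms of the series.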

References

[C] H. Cramér, "Sur un nouveau théorème-limite de la théorie des probabilités", Act. Sci. et Ind., 736, Hermann (1938) Zbl 64.0529.01
[IL] I.A. Ibragimov, Yu.V. Linnik, "Independent and stationary sequences of random variables", Wolters-Noordhoff (1971) (Translated from Russian) MR0322926 Zbl 0219.60027
[P] V.V. Petrov, "Sums of independent random variables", Springer (1975) (Translated from Russian) MR0388499 Zbl 0322.60043 Zbl 0322.60042

Comments

See also Limit theorems; Probability of large deviations.

References

[E] R.S. Ellis, "Entropy, large deviations, and statistical mechanics", Springer (1985) MR0793553 Zbl 0566.60097
How to Cite This Entry:
Cramér theorem. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Cram%C3%A9r_theorem&oldid=22309
This article was adapted from an original article by V.V. Petrov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article