Cramér theorem
2020 Mathematics Subject Classification: Primary: 60F10 [MSN][ZBL]
An integral limit theorem for the probability of large deviations of sums of independent random variables. Let $ X_1, X_2, \dots $ be a sequence of independent random variables with the same non-degenerate distribution function $ F $, such that $ {\mathsf E} X_1 = 0 $ and such that the moment generating function $ {\mathsf E} e^{t X_1} $ is finite in some interval $ |t| < H $ (this last condition is known as the Cramér condition). Let
$$ {\mathsf E} X_1^2 = \sigma^2, \qquad F_n(x) = {\mathsf P} \left( \frac{1}{\sigma n^{1/2}} \sum_{j=1}^{n} X_j < x \right). $$
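As an illustration of the Cramér condition (the particular distribution here is chosen only for concreteness): if $ X_1 = Y - 1 $, where $ Y $ has the exponential distribution with mean one, then

$$ {\mathsf E} e^{t X_1} = e^{-t} \, {\mathsf E} e^{t Y} = \frac{e^{-t}}{1 - t} < \infty \ \textrm{ for } |t| < 1, $$

so one may take $ H = 1 $; the condition fails for any distribution whose tails decay more slowly than exponentially, such as the Pareto or the lognormal distribution.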
If $ x > 1 $ and $ x = o( \sqrt{n} ) $ as $ n \rightarrow \infty $, then
$$ \frac{1 - F_n(x)}{1 - \Phi(x)} = \exp \left\{ \frac{x^3}{\sqrt{n}} \, \lambda \left( \frac{x}{\sqrt{n}} \right) \right\} \left[ 1 + O \left( \frac{x}{\sqrt{n}} \right) \right], $$

$$ \frac{F_n(-x)}{\Phi(-x)} = \exp \left\{ - \frac{x^3}{\sqrt{n}} \, \lambda \left( - \frac{x}{\sqrt{n}} \right) \right\} \left[ 1 + O \left( \frac{x}{\sqrt{n}} \right) \right]. $$
Here $ \Phi(x) $ is the normal $ (0, 1) $ distribution function and $ \lambda(t) = \sum_{k=0}^\infty c_k t^k $ is the so-called Cramér series, whose coefficients depend only on the moments of the random variable $ X_1 $; this series converges for all sufficiently small $ t $. The original result, obtained by H. Cramér in 1938 [C], was somewhat weaker than the statement just given.
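A frequently used consequence of the theorem (stated here for orientation; the explicit value of the leading coefficient is the one given in the standard literature, see e.g. [P]): since $ \lambda $ is bounded in a neighbourhood of the origin and $ O(x/\sqrt{n}) \rightarrow 0 $ for $ x = o(\sqrt{n}) $, the first relation gives

$$ \frac{1 - F_n(x)}{1 - \Phi(x)} \rightarrow 1 \ \textrm{ if } x = o(n^{1/6}), \qquad \frac{1 - F_n(x)}{1 - \Phi(x)} = e^{c_0 x^3/\sqrt{n}} (1 + o(1)) \ \textrm{ if } x = o(n^{1/4}), $$

where $ c_0 = {\mathsf E} X_1^3 / (6 \sigma^3) $ is the constant term of the Cramér series; thus the normal approximation to the tail remains valid, up to a factor tending to one, well beyond the range of the central limit theorem.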
References
[C] H. Cramér, "Sur un nouveau théorème-limite de la théorie des probabilités", Act. Sci. et Ind., 736, Hermann (1938) Zbl 64.0529.01
[IL] I.A. Ibragimov, Yu.V. Linnik, "Independent and stationary sequences of random variables", Wolters-Noordhoff (1971) (Translated from Russian) MR0322926 Zbl 0219.60027
[P] V.V. Petrov, "Sums of independent random variables", Springer (1975) (Translated from Russian) MR0388499 Zbl 0322.60043 Zbl 0322.60042
Comments
See also Limit theorems; Probability of large deviations.
References
[E] R.S. Ellis, "Entropy, large deviations, and statistical mechanics", Springer (1985) MR0793553 Zbl 0566.60097