Berry-Esseen inequality

Latest revision as of 11:49, 23 November 2023

2020 Mathematics Subject Classification: Primary: 60F05 [MSN][ZBL]

An inequality giving an estimate of the deviation of the distribution function of a sum of independent random variables from the normal distribution function. Let $X_1,\ldots,X_n$ be independent random variables with the same distribution such that

$$\mathbf{E}X_j=0,\quad \mathbf{E}X_j^2=\sigma^2>0,\quad\mathbf{E}\lvert X_j\rvert^3<\infty.$$

Let

$$\rho=\frac{\mathbf{E}\lvert X_j\rvert^3}{\sigma^3}$$

and

$$\Phi(x)=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^x e^{-t^2/2}\,\mathrm{d}t;$$

then, for any $n$,

$$\sup_x\left\lvert\mathbf{P}\left\{\frac{1}{\sigma\sqrt{n}}\sum_{j=1}^nX_j\leq x\right\}-\Phi(x)\right\rvert\leq A\frac{\rho}{\sqrt{n}},$$

where $A$ is an absolute positive constant. This result was obtained by A.C. Berry [Be] and, independently, by C.G. Esseen [Es]. Feller obtained an explicit value for the constant, $A \le 33/4$; see [Fe, p. 515]. It is now known that $A \le 0.7655$; see [Fi, p. 264].
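As a concrete numerical check of the inequality, one can take Rademacher variables ($X_j=\pm 1$, each with probability $1/2$), for which $\sigma=1$ and $\rho=1$, and compare the exact value of $\sup_x\lvert\mathbf{P}\{S_n/\sqrt{n}\leq x\}-\Phi(x)\rvert$ with the bound $A\rho/\sqrt{n}$. The following Python sketch (function names are illustrative, not from the article) uses the value $A=0.7655$ cited above:

```python
from math import comb, erf, sqrt

def phi(x):
    """Standard normal distribution function Phi(x)."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def sup_deviation_rademacher(n):
    """Exact sup_x |P(S_n/sqrt(n) <= x) - Phi(x)| where S_n is a sum of
    n Rademacher (+1/-1, probability 1/2 each) random variables."""
    # Atoms of S_n are 2k - n for k = 0..n, with mass C(n,k)/2^n.
    atoms = [((2 * k - n) / sqrt(n), comb(n, k) / 2 ** n) for k in range(n + 1)]
    cdf, worst = 0.0, 0.0
    for x, p in atoms:
        # The supremum of |F_n - Phi| over a step function is attained at a
        # jump point, either just below the jump or at the jump itself.
        worst = max(worst, abs(cdf - phi(x)))
        cdf += p
        worst = max(worst, abs(cdf - phi(x)))
    return worst

A = 0.7655            # best known constant, cited in the article
n = 100
rho = 1.0             # for Rademacher variables: sigma = 1, E|X_j|^3 = 1
bound = A * rho / sqrt(n)
dev = sup_deviation_rademacher(n)
print(f"n={n}: exact sup-deviation {dev:.4f}, Berry-Esseen bound {bound:.4f}")
```

For $n=100$ the exact deviation is about $0.04$ (it is dominated by the atom at $0$, of mass $\binom{100}{50}2^{-100}$), safely below the bound $0.7655/\sqrt{100}\approx 0.0766$.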

References

[Be] A.C. Berry, "The accuracy of the Gaussian approximation to the sum of independent variables", Trans. Amer. Math. Soc., 49 (1941) pp. 122–136
[Es] C.G. Esseen, "On the Liapunoff limit of error in the theory of probability", Ark. Mat. Astr. Fysik, 28A : 2 (1942) pp. 1–19
[Fe] W. Feller, "An introduction to probability theory and its applications", 2, Wiley (1966) pp. 210
[Fi] S.R. Finch, "Mathematical Constants", Cambridge University Press (2003) ISBN 0-521-81805-2, Sect. 4.6
[Pe] V.V. Petrov, "Sums of independent random variables", Springer (1975) (Translated from Russian) MR0388499 Zbl 0322.60043 Zbl 0322.60042
How to Cite This Entry:
Berry-Esseen inequality. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Berry-Esseen_inequality&oldid=16984
This article was adapted from an original article by V.V. Petrov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article