Berry-Esseen inequality
An inequality giving an estimate of the deviation of the distribution function of a sum of independent random variables from the normal distribution function. Let $X_1,\ldots,X_n$ be independent random variables with the same distribution such that
$$\mathbf{E}X_j=0,\quad \mathbf{E}X_j^2=\sigma^2>0,\quad\mathbf{E}\lvert X_j\rvert^3<\infty.$$
Let
$$\rho=\frac{\mathbf{E}\lvert X_j\rvert^3}{\sigma^3}$$
and
$$\Phi(x)=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^x e^{-t^2/2}\,\mathrm{d}t;$$
then, for any $n$,
$$\sup_x\left\lvert\mathbf{P}\left\{\frac{1}{\sigma\sqrt{n}}\sum_{j=1}^nX_j\leq x\right\}-\Phi(x)\right\rvert\leq A\frac{\rho}{\sqrt{n}},$$
where $A$ is an absolute positive constant. This result was obtained by A.C. Berry [Be] and, independently, by C.G. Esseen [Es]. The constant $A$ can be taken to be $33/4$; see [Fe, p. 515].
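As a simple numerical check, consider the symmetric Bernoulli case $X_j=\pm1$ with probability $1/2$, for which $\sigma=1$ and $\rho=\mathbf{E}\lvert X_j\rvert^3=1$, so the right-hand side of the inequality reduces to $A/\sqrt{n}$. The following minimal Python sketch (assuming NumPy and SciPy are available; the helper name `sup_deviation` is merely illustrative) computes the exact left-hand side for this case and compares it with the bound $33/(4\sqrt{n})$:

```python
import numpy as np
from scipy.stats import binom, norm

# Symmetric Bernoulli case X_j = +-1: S_n = 2K - n with K ~ Binomial(n, 1/2),
# so the distribution function of S_n / sqrt(n) is a step function whose jumps
# are at the atoms (2k - n) / sqrt(n). The supremum of |F_n(x) - Phi(x)| is
# attained at an atom or at a left limit of an atom, which makes it exactly
# computable from binomial probabilities.

def sup_deviation(n):
    """Exact sup_x |P{S_n/sqrt(n) <= x} - Phi(x)| for a sum of n signs +-1."""
    k = np.arange(n + 1)
    atoms = (2 * k - n) / np.sqrt(n)                     # values of S_n / sqrt(n)
    cdf_right = binom.cdf(k, n, 0.5)                     # F_n at each atom
    cdf_left = np.concatenate(([0.0], cdf_right[:-1]))   # left limits at the atoms
    phi = norm.cdf(atoms)
    return max(np.max(np.abs(cdf_right - phi)),
               np.max(np.abs(cdf_left - phi)))

for n in (4, 16, 64, 256):
    lhs = sup_deviation(n)
    bound = 33 / (4 * np.sqrt(n))
    print(f"n = {n:4d}: deviation = {lhs:.4f}, bound 33/(4*sqrt(n)) = {bound:.4f}")
```

Both the computed deviation and the bound decrease at the rate $1/\sqrt{n}$, as the inequality asserts.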
References
[Be] A.C. Berry, "The accuracy of the Gaussian approximation to the sum of independent variables", Trans. Amer. Math. Soc., 49 (1941) pp. 122–136
[Es] C.G. Esseen, "On the Liapunoff limit of error in the theory of probability", Ark. Mat. Astr. Fysik, 28A : 2 (1942) pp. 1–19
[Fe] W. Feller, "An introduction to probability theory and its applications", 2, Wiley (1966) pp. 210
[Pe] V.V. Petrov, "Sums of independent random variables", Springer (1975) (Translated from Russian)
Berry-Esseen inequality. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Berry-Esseen_inequality&oldid=25038