Latest revision as of 19:33, 7 July 2016
Laplace’s theorem on determinants. See Cofactor.
2020 Mathematics Subject Classification: Primary: 60F05 [MSN][ZBL]
Laplace’s theorem on the approximation of the binomial distribution by the normal distribution. This is the first version of the Central Limit Theorem of probability theory: if $ S_{n} $ denotes the number of “successes” in $ n $ Bernoulli trials with probability of success $ p $ ($ 0 < p < 1 $), then for any two real numbers $ a $ and $ b $ satisfying $ a < b $, one has
$$
\lim_{n \to \infty} \mathsf{P} \! \left( a < \frac{S_{n} - n p}{\sqrt{n p (1 - p)}} < b \right) = \Phi(b) - \Phi(a),
$$
where $ \Phi: \mathbb{R} \to (0,1) $, defined by
$$
\forall x \in \mathbb{R}: \qquad \Phi(x) \stackrel{\text{df}}{=} \frac{1}{\sqrt{2 \pi}} \int_{- \infty}^{x} e^{- y^{2} / 2} \, \mathrm{d}{y},
$$
is the cumulative distribution function of the standard normal law.
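The statement can be checked numerically with a short sketch in Python using only the standard library; the particular values of $ n $, $ p $, $ a $ and $ b $ below are illustrative choices, not part of the theorem.

```python
import math

def norm_cdf(x):
    """Standard normal CDF Phi(x), computed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def binom_prob_interval(n, p, a, b):
    """Exact P(a < (S_n - np)/sqrt(np(1-p)) < b) for S_n ~ Binomial(n, p)."""
    sigma = math.sqrt(n * p * (1 - p))
    total = 0.0
    for m in range(n + 1):
        x = (m - n * p) / sigma
        if a < x < b:
            total += math.comb(n, m) * p**m * (1 - p)**(n - m)
    return total

# Illustrative parameters (assumptions, not from the theorem itself).
n, p, a, b = 1000, 0.3, -1.0, 1.0
exact = binom_prob_interval(n, p, a, b)
approx = norm_cdf(b) - norm_cdf(a)
print(exact, approx)  # both values are close to 0.68
```

For moderate $ n $ the two numbers already agree to a couple of decimal places, in line with the $ \mathcal{O}(1/\sqrt{n}) $ error rate discussed below.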
The local Laplace Theorem has independent significance: for the probability
$$
\mathsf{P}(S_{n} = m) = \binom{n}{m} p^{m} (1 - p)^{n - m}, \qquad \text{where} ~ m \in \mathbb{Z} \cap [0,n],
$$
one has
$$
\mathsf{P}(S_{n} = m) = \frac{1}{\sqrt{n p (1 - p)}} (1 + \epsilon_{n}) \phi(x),
$$
where $ \phi: \mathbb{R} \to \mathbb{R}_{> 0} $, defined by
$$
\forall x \in \mathbb{R}: \qquad \phi(x) \stackrel{\text{df}}{=} \frac{1}{\sqrt{2 \pi}} e^{- x^{2} / 2},
$$
is the density of the standard normal distribution, and $ \epsilon_{n} \to 0 $ as $ n \to \infty $ uniformly for all $ m $ for which $ x = \dfrac{m - n p}{\sqrt{n p (1 - p)}} $ lies in some finite interval.
In its general form, the theorem was proved by P. S. Laplace [L]. The special case $ p = 0.5 $ of the Laplace Theorem was studied by A. de Moivre [M]. Therefore, the Laplace Theorem is sometimes called the “de Moivre–Laplace Theorem”.
For practical applications, the Laplace Theorem is important for estimating the errors that arise in the use of approximation formulas. In the more precise (by comparison with [L]) asymptotic formula
$$
\forall y \in \mathbb{R}: \qquad \mathsf{P}(S_{n} < y) = \Phi \! \left( \frac{y - n p + 0.5}{\sqrt{n p (1 - p)}} \right) + {R_{n}}(y),
$$
the remainder term $ {R_{n}}(y) $ is of order $ \mathcal{O} \! \left( \dfrac{1}{\sqrt{n}} \right) $ uniformly for all real numbers $ y $. For the uniform approximation of the binomial distribution by the normal distribution, the following formula of Ya. Uspenskii (1937) is more useful: if $ \sigma = \sqrt{n p (1 - p)} $, then for any two real numbers $ a $ and $ b $ satisfying $ a < b $, one has
$$
\mathsf{P}(a < S_{n} < b) =
\Phi \! \left( \frac{b - n p + 0.5}{\sigma} \right) -
\Phi \! \left( \frac{a - n p - 0.5}{\sigma} \right) +
\psi \! \left( \frac{b - n p + 0.5}{\sigma} \right) -
\psi \! \left( \frac{a - n p - 0.5}{\sigma} \right) +
\Delta,
$$
where $ \psi: \mathbb{R} \to \mathbb{R} $ is defined by
$$
\forall x \in \mathbb{R}: \qquad \psi(x) \stackrel{\text{df}}{=} \frac{1 - 2 p}{6 \sigma} (1 - x^{2}) \phi(x),
$$
and for $ \sigma \geq 5 $,
$$
|\Delta| < \frac{0.13 + 0.18 |1 - 2 p|}{\sigma^{2}} + \frac{1}{e^{3 \sigma / 2}}.
$$
To improve the relative accuracy of the approximation, S. N. Bernstein (1943) and W. Feller (1945) suggested other formulas.
References
[L] P. S. Laplace, “Théorie analytique des probabilités”, Paris (1812). MR2274728, MR1400403, MR1400402, Zbl 1047.01534, Zbl 1047.01533
[M] A. de Moivre, “Miscellanea analytica de seriebus et quadraturis”, London (1730).
[PR] Yu. V. Prohorov, Yu. A. Rozanov, “Probability theory, basic concepts. Limit theorems, random processes”, Springer (1969). (Translated from Russian) MR0251754
[F] W. Feller, “On the normal approximation to the binomial distribution”, Ann. Math. Statist., 16 (1945), pp. 319–329. MR0015706, Zbl 0060.28703
[F2] W. Feller, “An introduction to probability theory and its applications”, 1, Wiley (1968).
Comments
For a more detailed and more general discussion of approximation by the normal distribution, see [P].
References
[P] V. V. Petrov, “Sums of independent random variables”, Springer (1975). (Translated from Russian) MR0388499, Zbl 0322.60043, Zbl 0322.60042
Laplace theorem. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Laplace_theorem&oldid=23615