Moments, method of (in probability theory)

From Encyclopedia of Mathematics
Revision as of 08:01, 6 June 2020


A method for determining a probability distribution by its moments (cf. Moment). Theoretically, the method of moments rests on the uniqueness of the solution of the moment problem: given constants $ \alpha _ {0} = 1 , \alpha _ {1} , \alpha _ {2} , \dots $, under what conditions does there exist a unique distribution $ {\mathsf P} $ such that

$$ \alpha _ {n} = \int\limits x ^ {n} {\mathsf P} ( dx ) $$

are the moments of $ {\mathsf P} $ for all $ n $? There are various types of sufficient conditions for a distribution to be uniquely determined by its moments, for example, the Carleman condition

$$ \sum _ {n=1} ^ \infty \frac{1}{\alpha _ {2n} ^ {1 / 2n } } = \infty . $$
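As a concrete illustration (not part of the original article), the even moments of the standard normal distribution are $ \alpha _ {2n} = ( 2n - 1 ) !! = ( 2n ) ! / ( 2 ^ {n} n ! ) $, and its Carleman series can be checked numerically; a minimal sketch, computing moments in log space to avoid overflow:

```python
import math

def log_normal_even_moment(n):
    """log of the 2n-th moment of the standard normal: (2n-1)!! = (2n)!/(2^n n!)."""
    return math.lgamma(2 * n + 1) - n * math.log(2) - math.lgamma(n + 1)

def carleman_partial_sum(N):
    """Partial sum of the Carleman series: sum over n of alpha_{2n}^(-1/(2n))."""
    return sum(math.exp(-log_normal_even_moment(n) / (2 * n))
               for n in range(1, N + 1))

# The terms behave like sqrt(e/(2n)), so the partial sums grow without bound;
# divergence of the series means the normal law is determined by its moments.
for N in (10, 100, 1000):
    print(N, round(carleman_partial_sum(N), 3))
```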

The use of the method of moments in the proof of limit theorems in probability theory and mathematical statistics is based on the correspondence between moments and the convergence of distributions: If $ F _ {n} $ is a sequence of distribution functions with finite moments $ \alpha _ {k} ( n) $ of any order $ k \geq 1 $, and if $ \alpha _ {k} ( n) \rightarrow \beta _ {k} $, as $ n \rightarrow \infty $, for each $ k $, then the $ \beta _ {k} $ are the moments of a distribution function $ F $; if $ F $ is uniquely determined by its moments, then as $ n \rightarrow \infty $, the $ F _ {n} $ converge weakly to $ F $. The method of moments in the case of convergence to a normal distribution was first treated by P.L. Chebyshev (1887), and a proof of the central limit theorem by the method of moments was accomplished by A.A. Markov (1898).
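The moment-convergence statement can be illustrated numerically with the standardized binomial distribution, whose moments converge to those of the standard normal ($ 0, 1, 0, 3, \dots $); a sketch using exact summation over the binomial probabilities (an illustration chosen here, not an example from the article):

```python
import math

def standardized_binomial_moment(k, n, p=0.5):
    """Exact k-th moment of (X - np)/sqrt(np(1-p)) for X ~ Binomial(n, p)."""
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    return sum(math.comb(n, x) * p ** x * (1 - p) ** (n - x)
               * ((x - mu) / sigma) ** k
               for x in range(n + 1))

# As n grows, the first four moments approach 0, 1, 0, 3 (standard normal),
# in line with the method-of-moments proof of the central limit theorem.
for n in (10, 100, 1000):
    print(n, [round(standardized_binomial_moment(k, n), 4) for k in (1, 2, 3, 4)])
```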

The method of moments in mathematical statistics is one of the general methods for finding statistical estimators of unknown parameters of a probability distribution from results of observations. The method of moments was first used to this end by K. Pearson (1894) to solve the problem of the approximation of an empirical distribution by a system of Pearson distributions (cf. Pearson distribution). The procedure in the method of moments is this: The moments of the empirical distribution are determined (the sample moments), equal in number to the number of parameters to be estimated; they are then equated to the corresponding moments of the probability distribution, which are functions of the unknown parameters; the system of equations thus obtained is solved for the parameters, and the solutions are the required estimates. In practice the method of moments often leads to very simple calculations. Under fairly general conditions the method of moments yields estimators that are asymptotically normal, whose mathematical expectation differs from the true value of the parameter only by a quantity of order $ 1 / n $, and whose standard deviation is of order $ 1/ \sqrt {n } $. However, the estimators found by the method of moments need not be best possible from the point of view of efficiency: their variance need not be minimal. For a normal distribution the method of moments leads to estimators that coincide with those of the maximum-likelihood method, that is, with asymptotically unbiased, asymptotically efficient estimators.
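The estimation recipe above can be sketched for a concrete two-parameter family, for instance the gamma distribution with shape $ k $ and scale $ \theta $, where the mean is $ k \theta $ and the variance is $ k \theta ^ {2} $; equating sample moments to these gives $ \hat k = \bar x ^ {2} / s ^ {2} $ and $ \hat \theta = s ^ {2} / \bar x $. A minimal sketch (function names are illustrative, not from the article):

```python
import random
import statistics

def gamma_moment_estimates(sample):
    """Method-of-moments estimates for Gamma(shape k, scale theta).

    Matching mean = k*theta and variance = k*theta^2 to the sample moments
    gives k = mean^2 / var and theta = var / mean.
    """
    m = statistics.fmean(sample)
    v = statistics.pvariance(sample)
    return m * m / v, v / m

# Simulate data with known parameters and recover them from two moments.
random.seed(1)
data = [random.gammavariate(3.0, 2.0) for _ in range(100_000)]
k_hat, theta_hat = gamma_moment_estimates(data)
print(k_hat, theta_hat)  # close to the true values k = 3, theta = 2
```

The estimates here are consistent and asymptotically normal, as stated above, but for the gamma family they are less efficient than the maximum-likelihood estimates.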

References

[1] Yu.V. Prokhorov, Yu.A. Rozanov, "Probability theory, basic concepts. Limit theorems, random processes" , Springer (1969) (Translated from Russian)
[2] H. Cramér, "Mathematical methods of statistics" , Princeton Univ. Press (1946)
[3] M.G. Kendall, A. Stuart, "The advanced theory of statistics" , 1 , Griffin (1987)
How to Cite This Entry:
Moments, method of (in probability theory). Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Moments,_method_of_(in_probability_theory)&oldid=47882
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article