Moment

A numerical characteristic of a probability distribution. The moment of order $ k $ ($ k > 0 $ an integer) of a random variable $ X $ is defined as the mathematical expectation $ {\mathsf E} X ^{k} $, if it exists. If $ F $ is the distribution function of the random variable $ X $, then

$$ \tag{*} {\mathsf E} X ^{k} = \int\limits_{-\infty}^{\infty} x ^{k} \, dF(x). $$
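For instance, if $ X $ has an exponential distribution with density $ \lambda e^{-\lambda x} $ on $ [0, \infty) $, $ \lambda > 0 $, then formula (*) gives

$$ {\mathsf E} X ^{k} = \int\limits_{0}^{\infty} x ^{k} \lambda e^{-\lambda x} \, dx = \frac{k!}{\lambda^{k}}, \quad k = 1, 2, \ldots $$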

For the definition of a moment in probability theory, a direct analogy is used with the corresponding idea which plays a major role in mechanics: formula (*) defines the moment of a mass distribution. The first-order moment (the static moment in mechanics) of a random variable $ X $ is the mathematical expectation $ {\mathsf E} X $. The value $ {\mathsf E} ( X - a ) ^{k} $ is called the moment of order $ k $ relative to $ a $, and $ {\mathsf E} ( X - {\mathsf E} X ) ^{k} $ is the central moment of order $ k $. The second-order central moment $ {\mathsf E} ( X - {\mathsf E} X ) ^{2} $ is called the dispersion (or variance) $ {\mathsf D} X $ (the moment of inertia in mechanics). The value $ {\mathsf E} | X | ^{k} $ is called the absolute moment of order $ k $ (absolute moments are also defined for non-integral $ k $). The moments of the joint distribution of random variables $ X_{1}, \dots, X_{n} $ (see Multi-dimensional distribution) are defined similarly: for any integers $ k_{i} \geq 0 $ with $ k_{1} + \dots + k_{n} = k $, the mathematical expectation $ {\mathsf E} ( X_{1}^{k_{1}} \dots X_{n}^{k_{n}} ) $ is called a mixed moment of order $ k $, and $ {\mathsf E} ( X_{1} - {\mathsf E} X_{1} ) ^{k_{1}} \dots ( X_{n} - {\mathsf E} X_{n} ) ^{k_{n}} $ is called a central mixed moment of order $ k $.

The mixed moment $ {\mathsf E} ( X_{1} - {\mathsf E} X_{1} ) ( X_{2} - {\mathsf E} X_{2} ) $ is called the covariance and is one of the basic characteristics of dependence between random variables (see Correlation (in statistics)). Many properties of moments (in particular, inequalities for moments) are consequences of the fact that for any random variable $ X $ the function $ g(k) = \log {\mathsf E} | X | ^{k} $ is convex with respect to $ k $ on each finite interval on which it is defined, and $ ( {\mathsf E} | X | ^{k} ) ^{1/k} $ is a non-decreasing function of $ k $. The moments $ {\mathsf E} X ^{k} $ and $ {\mathsf E} ( X - a ) ^{k} $ exist if and only if $ {\mathsf E} | X | ^{k} < \infty $. The existence of $ {\mathsf E} | X | ^{k_{0}} $ implies the existence of all moments of orders $ k \leq k_{0} $. If $ {\mathsf E} | X_{i} | ^{k} < \infty $ for all $ i = 1, \dots, n $, then the mixed moments $ {\mathsf E} X_{1}^{k_{1}} \dots X_{n}^{k_{n}} $ exist for all $ k_{i} \geq 0 $ with $ k_{1} + \dots + k_{n} \leq k $. In some cases the so-called moment generating function is useful for defining moments: the function $ M(t) $ whose power-series expansion has the moments of the distribution as coefficients; for integer-valued random variables this function is related to the generating function $ P(s) $ by the relation $ M(t) = P( e^{t} ) $.

If $ {\mathsf E} | X | ^{k} < \infty $, then the characteristic function $ f(t) $ of the random variable $ X $ has continuous derivatives up to order $ k $ inclusive, and the moment of order $ k $ is the coefficient of $ ( it ) ^{k} / k! $ in the expansion of $ f(t) $ in powers of $ t $,

$$ {\mathsf E} X ^{k} = \left. ( - i ) ^{k} \frac{d ^{k}}{d t ^{k}} f(t) \right|_{t=0}. $$

If the characteristic function has a derivative of order $ 2 k $ at zero, then $ {\mathsf E} | X | ^ {2k} < \infty $.
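For example, the standard normal distribution has characteristic function $ f(t) = e^{-t^{2}/2} $, which is infinitely differentiable; the odd moments vanish and the even moments can be read off from its power-series expansion:

$$ {\mathsf E} X ^{2k} = \frac{(2k)!}{2^{k} k!} = ( 2k - 1 )!!, \quad k = 1, 2, \ldots $$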

For the connection of moments with semi-invariants see Semi-invariant. If the moments of a distribution are known, then it is possible to make some assertions about the probabilities of deviation of a random variable from its mathematical expectation in terms of inequalities; the best known are the Chebyshev inequality in probability theory

$$ {\mathsf P} \{ | X - {\mathsf E} X | \geq \epsilon \} \leq \frac{{\mathsf D} X}{\epsilon ^{2}}, \quad \epsilon > 0, $$

and its generalizations.
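For instance, taking $ \epsilon = 3 \sigma $ with $ \sigma ^{2} = {\mathsf D} X $ gives $ {\mathsf P} \{ | X - {\mathsf E} X | \geq 3 \sigma \} \leq 1/9 $, whatever the distribution of $ X $, provided its variance is finite.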

Problems of determining a probability distribution from its sequence of moments are called moment problems (cf. Moment problem). Such problems were first discussed by P.L. Chebyshev (1874) in connection with research on limit theorems. In order that the probability distribution of a random variable $ X $ be uniquely defined by its moments $ \alpha _ {k} = {\mathsf E} X ^ {k} $ it is sufficient, for example, that Carleman's condition be satisfied:

$$ \sum_{k=1}^{\infty} \frac{1}{\alpha_{2k}^{1/2k}} = \infty. $$

A similar result also holds for moments of random vectors.
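For example, for the standard normal distribution $ \alpha_{2k} = ( 2k - 1 )!! \leq ( 2k ) ^{k} $, so $ \alpha_{2k}^{1/2k} \leq \sqrt{2k} $ and the series above diverges; hence the normal distribution is uniquely determined by its moments. The log-normal distribution is a classical example of a distribution that is not uniquely determined by its moments.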

The use of moments in the proof of limit theorems is based on the following fact. Let $ F_{n} $, $ n = 1, 2, \ldots, $ be a sequence of distribution functions all of whose moments $ \alpha_{k}(n) $ are finite, and for each integer $ k \geq 1 $ let

$$ \alpha_{k}(n) \rightarrow \alpha_{k} \quad \textrm{ as } \ n \rightarrow \infty, $$

where $ \alpha_{k} $ is finite. Then there is a subsequence $ F_{n_{i}} $ that converges weakly to a distribution function $ F $ having the $ \alpha_{k} $ as its moments. If the moments determine $ F $ uniquely, then the whole sequence $ F_{n} $ converges weakly to $ F $. The so-called method of moments (cf. Moments, method of (in probability theory)) is based on this fact; it is used, in particular, in mathematical statistics to study the deviation of empirical distributions from theoretical ones and for the statistical estimation of the parameters of a distribution (on sample moments and their use as estimators for the moments of a distribution, see Empirical distribution).
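For example, for a sample $ x_{1}, \dots, x_{n} $ from an exponential distribution with parameter $ \lambda $, equating the sample mean $ \bar{x} $ to the theoretical first moment $ {\mathsf E} X = 1/\lambda $ yields the method-of-moments estimator $ \hat{\lambda} = 1/\bar{x} $. Likewise, since the normal distribution is uniquely determined by its moments, convergence of all the moments $ \alpha_{k}(n) $ to the corresponding moments of the normal distribution already implies weak convergence of $ F_{n} $ to it; this is one way of proving central limit theorems.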

Comments

The moment generating function is defined by $ M(t) = {\mathsf E} e ^{tX} $.
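For instance, for the exponential distribution with parameter $ \lambda $ one finds $ M(t) = \lambda / ( \lambda - t ) $ for $ t < \lambda $, and its expansion in powers of $ t $,

$$ \frac{\lambda}{\lambda - t} = \sum_{k=0}^{\infty} \frac{t^{k}}{\lambda^{k}} = \sum_{k=0}^{\infty} \frac{k!}{\lambda^{k}} \cdot \frac{t^{k}}{k!}, $$

recovers the moments $ {\mathsf E} X ^{k} = k!/\lambda^{k} $ computed after formula (*) above.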

References

[a1] W. Feller, "An introduction to probability theory and its applications", 2, Wiley (1971) Chapt. 1
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.