Characterization theorems

in probability theory and mathematical statistics

Theorems that establish a connection between the type of the distribution of random variables or random vectors and certain general properties of functions of them.

Example 1.

Let $ X $ be a three-dimensional random vector such that:

1) its projections $ X _ {1} , X _ {2} , X _ {3} $ onto any three mutually-orthogonal axes are independent; and

2) the density $ p ( x) $, $ x = ( x _ {1} , x _ {2} , x _ {3} ) $, of the probability distribution of $ X $ depends only on $ x _ {1} ^ {2} + x _ {2} ^ {2} + x _ {3} ^ {2} $. Then the distribution of $ X $ is normal and

$$ p ( x) = \frac{1}{( 2 \pi ) ^ {3/2} \sigma ^ {3} } \mathop{\rm exp} \left \{ - \frac{1}{2 \sigma ^ {2} } ( x _ {1} ^ {2} + x _ {2} ^ {2} + x _ {3} ^ {2} ) \right \} , $$

where $ \sigma > 0 $ is a certain constant (the Maxwell law for the distribution of the velocities of molecules in a gas in a stationary state).
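A minimal Monte Carlo sketch of the forward direction (Python with NumPy; the scale $ \sigma $ and the sample size are illustrative choices): for a vector with independent normal projections, the projections onto any rotated orthogonal frame remain independent, and the speeds follow the Maxwell law, whose mean is $ 2 \sigma \sqrt {2 / \pi } $.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.5        # illustrative scale parameter
n = 200_000

# Velocities whose projections on three orthogonal axes are i.i.d. N(0, sigma^2).
v = rng.normal(0.0, sigma, size=(n, 3))

# Rotate into a random orthogonal frame; the rotated projections are jointly
# normal with covariance sigma^2 * I, hence still independent.  Empirically
# the correlation matrix is close to the identity.
q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
w = v @ q.T
print(np.round(np.corrcoef(w.T), 3))

# The speed |X| follows the Maxwell law; its mean is 2*sigma*sqrt(2/pi).
speeds = np.linalg.norm(v, axis=1)
print(speeds.mean(), 2 * sigma * np.sqrt(2 / np.pi))
```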

Example 2.

Let $ X \in \mathbf R ^ {n} $ be a random vector with independent and identically-distributed components $ X = ( X _ {1} , \dots, X _ {n} ) $. If the distribution is normal, then the "sample mean"

$$ \overline{X} = { \frac{1}{n} } \sum _ {j = 1 } ^ { n } X _ {j} $$

and the "sample variancesample variance"

$$ \overline{s ^ {2} } = { \frac{1}{n} } \sum _ {j = 1 } ^ { n } ( X _ {j} - \overline{X} ) ^ {2} $$

are independent random variables. Conversely, if they are independent, then the distribution of $ X $ is normal.
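Both directions can be probed numerically. A minimal sketch (Python with NumPy; the sample sizes are arbitrary choices) contrasts normal samples, where the empirical correlation of $ \overline{X} $ and $ \overline{s ^ {2} } $ vanishes, with exponential samples, where skewness makes the two clearly dependent. A vanishing correlation is of course only necessary, not sufficient, for independence.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 50_000, 10              # m independent samples, each of size n

def mean_var_corr(draw):
    x = draw((m, n))
    xbar = x.mean(axis=1)      # sample mean
    s2 = x.var(axis=1)         # sample variance with the 1/n normalization above
    return np.corrcoef(xbar, s2)[0, 1]

# Normal samples: Xbar and s^2 are independent, so the correlation is ~ 0.
print(mean_var_corr(lambda shape: rng.normal(0.0, 1.0, shape)))

# Exponential samples: the correlation is clearly positive, so Xbar and s^2
# cannot be independent, and the distribution cannot be normal.
print(mean_var_corr(lambda shape: rng.exponential(1.0, shape)))
```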

Example 3.

Let $ X \in \mathbf R ^ {n} $ be a vector with independent and identically-distributed components. There exist non-zero constants $ a _ {1} , \dots, a _ {n} $, $ b _ {1} , \dots, b _ {n} $ such that the random variables

$$ Y _ {1} = a _ {1} X _ {1} + \dots + a _ {n} X _ {n} $$

and

$$ Y _ {2} = b _ {1} X _ {1} + \dots + b _ {n} X _ {n} $$

are independent if and only if $ X $ has a normal distribution (the Darmois–Skitovich theorem). The last assertion remains true if the assumption that $ Y _ {1} $ and $ Y _ {2} $ are independent is replaced by the assumption that they are identically distributed, provided certain additional restrictions are imposed on the coefficients $ a _ {j} $ and $ b _ {j} $.
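A minimal numerical sketch of the independence characterization (Python with NumPy; the choice $ n = 2 $, $ a = ( 1, 1) $, $ b = ( 1, - 1) $ is illustrative). With these coefficients $ Y _ {1} $ and $ Y _ {2} $ are uncorrelated for any centered component distribution, so dependence is probed through the correlation of $ Y _ {1} ^ {2} $ and $ Y _ {2} ^ {2} $: it vanishes for normal components but not, for example, for uniform ones.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

def squared_corr(draw):
    # a = (1, 1), b = (1, -1): Y1 and Y2 are uncorrelated whatever the
    # (centered) component distribution, so test Y1^2 against Y2^2 instead.
    x1, x2 = draw(n), draw(n)
    y1, y2 = x1 + x2, x1 - x2
    return np.corrcoef(y1**2, y2**2)[0, 1]

# Normal components: Y1 and Y2 are genuinely independent, correlation ~ 0.
print(squared_corr(lambda k: rng.normal(0.0, 1.0, k)))

# Uniform components: Y1 and Y2 are uncorrelated yet dependent; the
# correlation of the squares is clearly nonzero (about -3/7 in theory).
print(squared_corr(lambda k: rng.uniform(-1.0, 1.0, k)))
```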

Characterizations of a similar kind, in which the distribution of a random vector $ X \in \mathbf R ^ {n} $ is determined by the identical distribution or the independence of two polynomials $ Q _ {1} ( X) $ and $ Q _ {2} ( X) $, are given by a number of characterization theorems that play an important role in mathematical statistics [1].

References

[1] A.M. Kagan, Yu.V. Linnik, C.R. Rao, "Characterization problems in mathematical statistics", Wiley (1973) (Translated from Russian)