Marginal distribution
The distribution of a random variable, or set of random variables, obtained by considering a component, or subset of components, of a larger random vector (see Multi-dimensional distribution) with a given distribution. Thus the marginal distribution is the projection of the distribution of the random vector $ X = ( X _ {1} \dots X _ {n} ) $ onto an axis $ x _ {1} $ or onto a subspace defined by the variables $ x _ {i _ {1} } \dots x _ {i _ {k} } $, and it is completely determined by the distribution of the original vector. For example, if $ F ( x _ {1} , x _ {2} ) $ is the distribution function of $ X = ( X _ {1} , X _ {2} ) $ in $ \mathbf R ^ {2} $, then the distribution function of $ X _ {1} $ is equal to $ F _ {1} ( x _ {1} ) = F ( x _ {1} , + \infty ) $; if the two-dimensional distribution is absolutely continuous and $ p ( x _ {1} , x _ {2} ) $ is its density, then the density of the marginal distribution of $ X _ {1} $ is
$$ p _ {1} ( x _ {1} ) = \int\limits _ {- \infty } ^ {+ \infty } p ( x _ {1} , x _ {2} ) \, d x _ {2} . $$
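For instance (an illustrative density chosen here for concreteness), if $ X = ( X _ {1} , X _ {2} ) $ has the density $ p ( x _ {1} , x _ {2} ) = x _ {1} + x _ {2} $ on the unit square $ [ 0 , 1 ] ^ {2} $ and zero outside it, then

$$ p _ {1} ( x _ {1} ) = \int\limits _ { 0 } ^ { 1 } ( x _ {1} + x _ {2} ) \, d x _ {2} = x _ {1} + \frac{1}{2} ,\ \ 0 \leq x _ {1} \leq 1 . $$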
The marginal distribution is calculated similarly for any component or set of components of the vector $ X = ( X _ {1} \dots X _ {n} ) $ for any $ n $. If the distribution of $ X $ is normal, then all marginal distributions are also normal. When $ X _ {1} \dots X _ {n} $ are mutually independent, the distribution of $ X $ is uniquely determined by the marginal distributions of its components $ X _ {1} \dots X _ {n} $:
$$ F ( x _ {1} \dots x _ {n} ) = \prod_{i=1}^ { n } F _ {i} ( x _ {i} ) $$
and
$$ p ( x _ {1} \dots x _ {n} ) = \prod_{i=1}^ { n } p _ {i} ( x _ {i} ) . $$
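For instance (an illustrative special case), if $ X _ {1} $ and $ X _ {2} $ are independent and each is exponentially distributed with parameter $ 1 $, then for $ x _ {1} , x _ {2} \geq 0 $

$$ F ( x _ {1} , x _ {2} ) = ( 1 - e ^ {- x _ {1} } ) ( 1 - e ^ {- x _ {2} } ) = F _ {1} ( x _ {1} ) F _ {2} ( x _ {2} ) ,\ \ p ( x _ {1} , x _ {2} ) = e ^ {- x _ {1} - x _ {2} } = p _ {1} ( x _ {1} ) p _ {2} ( x _ {2} ) . $$

Without such an independence assumption the marginal distributions do not, in general, determine the distribution of $ X $.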
The marginal distribution with respect to a probability distribution given on a product of spaces more general than real lines is defined similarly.
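In measure-theoretic notation (the symbols below are chosen here for illustration): if $ P $ is a probability measure on the product $ ( \Omega _ {1} \times \Omega _ {2} , {\mathcal A} _ {1} \otimes {\mathcal A} _ {2} ) $ of two measurable spaces $ ( \Omega _ {1} , {\mathcal A} _ {1} ) $ and $ ( \Omega _ {2} , {\mathcal A} _ {2} ) $, then the marginal of $ P $ on $ \Omega _ {1} $ is

$$ P _ {1} ( A ) = P ( A \times \Omega _ {2} ) ,\ \ A \in {\mathcal A} _ {1} . $$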