Marginal distribution
The distribution of a random variable, or set of random variables, obtained by considering a component, or subset of components, of a larger random vector (see Multi-dimensional distribution) with a given distribution. Thus the marginal distribution is the projection of the distribution of the random vector $ X = ( X _ {1}, \dots, X _ {n} ) $ onto the axis $ x _ {1} $, or onto the subspace defined by the variables $ x _ {i _ {1} }, \dots, x _ {i _ {k} } $, and it is completely determined by the distribution of the original vector. For example, if $ F ( x _ {1} , x _ {2} ) $ is the distribution function of $ X = ( X _ {1} , X _ {2} ) $ in $ \mathbf R ^ {2} $, then the distribution function of $ X _ {1} $ is equal to $ F _ {1} ( x _ {1} ) = F ( x _ {1} , + \infty ) $; if the two-dimensional distribution is absolutely continuous and $ p ( x _ {1} , x _ {2} ) $ is its density, then the density of the marginal distribution of $ X _ {1} $ is

$$ p _ {1} ( x _ {1} ) = \int\limits _ {- \infty } ^ {+ \infty } p ( x _ {1} , x _ {2} ) \, d x _ {2} . $$
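For instance, take (purely as an illustration) the joint density $ p ( x _ {1} , x _ {2} ) = 2 $ on the triangle $ 0 < x _ {2} < x _ {1} < 1 $ and $ p ( x _ {1} , x _ {2} ) = 0 $ elsewhere. Integrating out $ x _ {2} $ gives

$$ p _ {1} ( x _ {1} ) = \int\limits _ {0} ^ {x _ {1} } 2 \, d x _ {2} = 2 x _ {1} , \quad 0 < x _ {1} < 1 , $$

so the marginal distribution of $ X _ {1} $ is not uniform, even though the joint distribution is uniform on the triangle.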
The marginal distribution is calculated similarly for any component or set of components of the vector $ X = ( X _ {1}, \dots, X _ {n} ) $ for any $ n $. If the distribution of $ X $ is normal, then all marginal distributions are also normal. If $ X _ {1}, \dots, X _ {n} $ are mutually independent, then the distribution of $ X $ is uniquely determined by the marginal distributions of its components $ X _ {1}, \dots, X _ {n} $:

$$ F ( x _ {1}, \dots, x _ {n} ) = \prod _ {i = 1} ^ {n} F _ {i} ( x _ {i} ) $$

and

$$ p ( x _ {1}, \dots, x _ {n} ) = \prod _ {i = 1} ^ {n} p _ {i} ( x _ {i} ) . $$
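For instance (an illustrative special case), if $ X _ {1} $ and $ X _ {2} $ are independent and each has the standard normal density $ p _ {i} ( x _ {i} ) = ( 2 \pi ) ^ {-1/2} e ^ {- x _ {i} ^ {2} / 2 } $, then the joint density factorizes as

$$ p ( x _ {1} , x _ {2} ) = \frac{1}{2 \pi } e ^ {- ( x _ {1} ^ {2} + x _ {2} ^ {2} ) / 2 } , $$

and, conversely, each marginal distribution of this two-dimensional normal distribution is again the standard normal distribution.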
The marginal distribution with respect to a probability distribution given on a product of spaces more general than real lines is defined similarly.
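Concretely, if $ P $ is a probability measure on a product $ E _ {1} \times E _ {2} $ of measurable spaces, then the marginal distribution on $ E _ {1} $ is the image of $ P $ under the coordinate projection:

$$ P _ {1} ( A ) = P ( A \times E _ {2} ) $$

for every measurable set $ A \subseteq E _ {1} $.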