Multi-dimensional distribution

From Encyclopedia of Mathematics
Jump to: navigation, search
(Importing text file)
 
m (tex encoded by computer)
 
Line 1: Line 1:
 +
<!--
 +
m0651201.png
 +
$#A+1 = 51 n = 0
 +
$#C+1 = 51 : ~/encyclopedia/old_files/data/M065/M.0605120 Multi\AAhdimensional distribution,
 +
Automatically converted into TeX, above some diagnostics.
 +
Please remove this comment and the {{TEX|auto}} line below,
 +
if TeX found to be correct.
 +
-->
 +
 +
{{TEX|auto}}
 +
{{TEX|done}}
 +
 
multivariate distribution

A probability distribution on the $\sigma$-algebra of Borel sets of an $s$-dimensional Euclidean space $\mathbf R^{s}$. One usually speaks of a multivariate distribution as the distribution of a multi-dimensional random variable, or random vector, $X = (X_1, \dots, X_s)$, meaning by this the joint distribution of the real random variables $X_1(\omega), \dots, X_s(\omega)$ defined on the same space of elementary events $\Omega$ ($X_1, \dots, X_s$ may be regarded as coordinate variables in the space $\Omega = \mathbf R^{s}$). A multivariate distribution is uniquely determined by its distribution function, the function

$$ F(x_1, \dots, x_s) = \mathsf{P}\{X_1 < x_1, \dots, X_s < x_s\} $$

of the real variables $x_1, \dots, x_s$.
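
As a numerical illustration, the distribution function of a concrete multivariate distribution can be evaluated directly; a minimal sketch, assuming Python with NumPy and SciPy (the bivariate normal parameters are an invented example, and for continuous distributions the strict inequalities in the definition coincide with the non-strict ones used by CDF routines):

```python
# Sketch: evaluating F(x_1, x_2) = P{X_1 < x_1, X_2 < x_2} for a
# bivariate (s = 2) normal distribution.
import numpy as np
from scipy.stats import multivariate_normal

mean = np.array([0.0, 0.0])
cov = np.array([[1.0, 0.5],
                [0.5, 2.0]])
X = multivariate_normal(mean=mean, cov=cov)

# F(1, 1) = P{X_1 < 1, X_2 < 1}
print(X.cdf(np.array([1.0, 1.0])))
```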

As in the one-dimensional case, the most widespread multivariate distributions are the discrete and the absolutely-continuous distributions. In the discrete case a multivariate distribution is concentrated on a finite or countable set of points $(x_{i_1}, \dots, x_{i_s})$ of $\mathbf R^{s}$ such that

$$ \mathsf{P}\{X_1 = x_{i_1}, \dots, X_s = x_{i_s}\} = p_{i_1 \dots i_s} \geq 0, $$

$$ \sum_{i_1, \dots, i_s} p_{i_1 \dots i_s} = 1 $$

(see, for example, Multinomial distribution).
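
A finite discrete multivariate distribution is simply a table of probabilities; a minimal sketch, with an invented two-dimensional table:

```python
# Sketch: a discrete two-dimensional distribution stored as a mapping
# from points (x_{i_1}, x_{i_2}) to probabilities p_{i_1 i_2}.
p = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# The defining requirements: non-negativity and total mass 1.
assert all(v >= 0 for v in p.values())
assert abs(sum(p.values()) - 1.0) < 1e-12

# P{X_1 = 1, X_2 = 0}
print(p[(1, 0)])
```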

In the absolutely-continuous case, almost-everywhere (with respect to Lebesgue measure) on $\mathbf R^{s}$,

$$ \frac{\partial^{s} F(x_1, \dots, x_s)}{\partial x_1 \dots \partial x_s} = p(x_1, \dots, x_s), $$

where $p(x_1, \dots, x_s) \geq 0$ is the density of the multivariate distribution:

$$ \mathsf{P}\{X \in A\} = \int\limits_{A} p(x_1, \dots, x_s)\, dx_1 \dots dx_s, $$

for any $A$ from the $\sigma$-algebra of Borel subsets of $\mathbf R^{s}$, and

$$ \int\limits_{\mathbf R^{s}} p(x_1, \dots, x_s)\, dx_1 \dots dx_s = 1. $$
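
These relations are easy to verify numerically for a concrete density; a minimal sketch, assuming Python with NumPy and SciPy, for the invented example $p(x_1, x_2) = e^{-x_1 - x_2}$ on the positive quadrant, where $F(x_1, x_2) = (1 - e^{-x_1})(1 - e^{-x_2})$:

```python
# Sketch: the density as the mixed partial derivative of F, the
# normalization, and P{X in A} for a concrete two-dimensional example.
import numpy as np
from scipy.integrate import dblquad

F = lambda x1, x2: (1 - np.exp(-x1)) * (1 - np.exp(-x2))
p = lambda x1, x2: np.exp(-x1 - x2)

# d^2 F / (dx1 dx2) at (1, 2), via a central finite difference.
h = 1e-4
mixed = (F(1 + h, 2 + h) - F(1 + h, 2 - h)
         - F(1 - h, 2 + h) + F(1 - h, 2 - h)) / (4 * h * h)
print(mixed, p(1.0, 2.0))      # both approximately e^{-3} ~ 0.0498

# Normalization: the density integrates to 1 over the whole space.
total, _ = dblquad(lambda x2, x1: p(x1, x2),
                   0, np.inf, lambda x1: 0, lambda x1: np.inf)
print(total)                   # approximately 1.0

# P{X in A} for the Borel set A = [0, 1] x [0, 2].
prob, _ = dblquad(lambda x2, x1: p(x1, x2),
                  0, 1, lambda x1: 0, lambda x1: 2)
print(prob)                    # (1 - e^{-1})(1 - e^{-2}) ~ 0.5466
```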

The distribution of any random variable $X_i$ (and also, for any $m < s$, the distribution of the variables $X_{i_1}, \dots, X_{i_m}$) relative to a multivariate distribution is called a marginal distribution. The marginal distributions are completely determined by the given multivariate distribution. If $X_1, \dots, X_s$ are independent, then

$$ F(x_1, \dots, x_s) = F_1(x_1) \dots F_s(x_s) $$

and

$$ p(x_1, \dots, x_s) = p_1(x_1) \dots p_s(x_s), $$

where $F_i(x)$ and $p_i(x)$ are, respectively, the marginal distribution functions and densities of the $X_i$.
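
In the discrete case the marginals are obtained by summing out the other coordinates, and the independence criterion is the analogous factorization of the probability table; a minimal sketch, reusing the invented table from above:

```python
# Sketch: marginal distributions of a discrete two-dimensional
# distribution, and a check of the factorization p(x1, x2) = p1(x1) p2(x2).
from collections import defaultdict

p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

p1, p2 = defaultdict(float), defaultdict(float)
for (x1, x2), mass in p.items():
    p1[x1] += mass        # marginal of X_1: sum over the other coordinate
    p2[x2] += mass        # marginal of X_2

independent = all(abs(mass - p1[x1] * p2[x2]) < 1e-12
                  for (x1, x2), mass in p.items())
# This particular table is not independent: e.g. p1(0) * p2(0) = 0.12 != 0.1.
print(dict(p1), dict(p2), independent)
```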

The mathematical expectation of any function $f(X_1, \dots, X_s)$ of $X_1, \dots, X_s$ is defined by the integral of this function with respect to the multivariate distribution; in particular, in the absolutely-continuous case it is defined by the integral

$$ \mathsf{E} f(X_1, \dots, X_s) = \int\limits_{\mathbf R^{s}} f(x_1, \dots, x_s)\, p(x_1, \dots, x_s)\, dx_1 \dots dx_s. $$
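
A minimal sketch of this expectation, assuming Python with NumPy and SciPy; the bivariate normal and the function $f(x_1, x_2) = x_1 x_2$ are illustrative choices, computed both by quadrature against the density and by Monte Carlo averaging:

```python
# Sketch: E f(X1, X2) for f(x1, x2) = x1 * x2 under a correlated
# bivariate normal, via quadrature and via Monte Carlo.
import numpy as np
from scipy.integrate import dblquad
from scipy.stats import multivariate_normal

mean, cov = [0.0, 0.0], [[1.0, 0.5], [0.5, 2.0]]
X = multivariate_normal(mean=mean, cov=cov)
f = lambda x1, x2: x1 * x2

# Quadrature: integrate f * p over a large box approximating R^2.
val, _ = dblquad(lambda y, x: f(x, y) * X.pdf([x, y]),
                 -10, 10, lambda x: -10, lambda x: 10)
print(val)                                     # approximately 0.5 (the covariance)

# Monte Carlo: average f over samples from the distribution.
rng = np.random.default_rng(0)
samples = X.rvs(size=200_000, random_state=rng)
print(f(samples[:, 0], samples[:, 1]).mean())  # also approximately 0.5
```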

The characteristic function of a multivariate distribution is the function of $t = (t_1, \dots, t_s)$ given by

$$ \phi(t) = \mathsf{E}\, e^{i t X^{\prime}}, $$

where $t X^{\prime} = t_1 X_1 + \dots + t_s X_s$. The fundamental characteristics of a multivariate distribution are the moments (cf. Moment): the mixed moments $\mathsf{E} X_1^{k_1} \dots X_s^{k_s}$ and the central mixed moments $\mathsf{E}(X_1 - \mathsf{E} X_1)^{k_1} \dots (X_s - \mathsf{E} X_s)^{k_s}$, where $k_1 + \dots + k_s$ is the order of the corresponding moment. The roles of the expectation and the variance for a multivariate distribution are played by the vector $\mathsf{E} X = (\mathsf{E} X_1, \dots, \mathsf{E} X_s)$ and the set of second-order central mixed moments, which form the covariance matrix. If $\mathsf{E}(X_i - \mathsf{E} X_i)(X_j - \mathsf{E} X_j) = 0$ for all $i \neq j$, then $X_1, \dots, X_s$ are called pairwise uncorrelated or orthogonal (the covariance matrix is diagonal). If the rank $r$ of the covariance matrix is less than $s$, then the multivariate distribution is called a degenerate distribution; in this case the distribution is concentrated on some linear manifold in $\mathbf R^{s}$ of dimension $r < s$.
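
A minimal sketch of the covariance matrix and the rank criterion for degeneracy, assuming Python with NumPy; the three-dimensional example with one linearly dependent coordinate is an invented illustration:

```python
# Sketch: an empirical covariance matrix and its rank.  Here X3 = X1 + X2,
# so the distribution of (X1, X2, X3) is degenerate: it is concentrated on
# a two-dimensional linear manifold in R^3, and the covariance matrix has
# rank 2 < s = 3.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.standard_normal(100_000)
x2 = rng.standard_normal(100_000)
x3 = x1 + x2                            # linear dependence forces degeneracy

C = np.cov(np.vstack([x1, x2, x3]))     # 3 x 3 covariance matrix
print(C.round(3))
print(np.linalg.matrix_rank(C, tol=1e-8))   # prints 2
```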

For methods of investigating dependencies between $X_1, \dots, X_s$ see Correlation; Regression.


References

[a1] N.L. Johnson, S. Kotz, "Discrete distributions", Houghton Mifflin (1969)
[a2] N.L. Johnson, S. Kotz, "Continuous multivariate distributions", Wiley (1972)
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.