Multi-dimensional distribution
multivariate distribution
A probability distribution on the $\sigma$-algebra of Borel sets of an $s$-dimensional Euclidean space $\mathbf{R}^{s}$. One usually speaks of a multivariate distribution as the distribution of a multi-dimensional random variable, or random vector, $X = (X_{1}, \dots, X_{s})$, meaning by this the joint distribution of the real random variables $X_{1}(\omega), \dots, X_{s}(\omega)$ given on the same space of elementary events $\Omega$ ($X_{1}, \dots, X_{s}$ may be regarded as coordinate variables in the space $\Omega = \mathbf{R}^{s}$). A multivariate distribution is uniquely determined by its distribution function, namely the function

$$F(x_{1}, \dots, x_{s}) = \mathsf{P}\{X_{1} < x_{1}, \dots, X_{s} < x_{s}\}$$

of the real variables $x_{1}, \dots, x_{s}$.
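For example, when $s = 2$ the distribution function determines the probability of any half-open rectangle by inclusion-exclusion: for $a_{1} \leq b_{1}$ and $a_{2} \leq b_{2}$,

$$\mathsf{P}\{a_{1} \leq X_{1} < b_{1},\ a_{2} \leq X_{2} < b_{2}\} = F(b_{1}, b_{2}) - F(a_{1}, b_{2}) - F(b_{1}, a_{2}) + F(a_{1}, a_{2}).$$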
As in the one-dimensional case, the most widespread multivariate distributions are the discrete and the absolutely-continuous distributions. In the discrete case a multivariate distribution is concentrated on a finite or countable set of points $(x_{i_{1}}, \dots, x_{i_{s}})$ of $\mathbf{R}^{s}$ such that

$$\mathsf{P}\{X_{1} = x_{i_{1}}, \dots, X_{s} = x_{i_{s}}\} = p_{i_{1} \dots i_{s}} \geq 0,$$

$$\sum_{i_{1}, \dots, i_{s}} p_{i_{1} \dots i_{s}} = 1$$
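For example, if $s = 2$ and $X_{1}, X_{2}$ record two independent tosses of a fair coin, each taking the values $0$ and $1$, then the joint distribution assigns probability $p_{i_{1} i_{2}} = 1/4$ to each of the four points $(0, 0)$, $(0, 1)$, $(1, 0)$, $(1, 1)$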
(see, for example, Multinomial distribution). In the absolutely-continuous case, almost-everywhere (with respect to Lebesgue measure) on $\mathbf{R}^{s}$,

$$\frac{\partial^{s} F(x_{1}, \dots, x_{s})}{\partial x_{1} \dots \partial x_{s}} = p(x_{1}, \dots, x_{s}),$$

where $p(x_{1}, \dots, x_{s}) \geq 0$ is the density of the multivariate distribution:

$$\mathsf{P}\{X \in A\} = \int\limits_{A} p(x_{1}, \dots, x_{s})\, dx_{1} \dots dx_{s}$$

for any $A$ from the $\sigma$-algebra of Borel subsets of $\mathbf{R}^{s}$, and

$$\int\limits_{\mathbf{R}^{s}} p(x_{1}, \dots, x_{s})\, dx_{1} \dots dx_{s} = 1.$$
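For example, the standard normal distribution on $\mathbf{R}^{s}$ is absolutely continuous, with the everywhere-positive density

$$p(x_{1}, \dots, x_{s}) = (2\pi)^{-s/2} \exp\left( -\, \frac{x_{1}^{2} + \dots + x_{s}^{2}}{2} \right),$$

which integrates to $1$ over $\mathbf{R}^{s}$.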
The distribution of any random variable $X_{i}$ (and also, for any $m < s$, the distribution of the variables $X_{i_{1}}, \dots, X_{i_{m}}$) relative to a multivariate distribution is called a marginal distribution. The marginal distributions are completely determined by the given multivariate distribution. When $X_{1}, \dots, X_{s}$ are independent, then

$$F(x_{1}, \dots, x_{s}) = F_{1}(x_{1}) \dots F_{s}(x_{s})$$

and

$$p(x_{1}, \dots, x_{s}) = p_{1}(x_{1}) \dots p_{s}(x_{s}),$$

where $F_{i}(x)$ and $p_{i}(x)$ are, respectively, the marginal distribution functions and densities of the $X_{i}$.
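In the absolutely-continuous case the marginal density of $X_{1}$, for instance, is obtained by integrating out the remaining variables:

$$p_{1}(x_{1}) = \int\limits_{\mathbf{R}^{s-1}} p(x_{1}, x_{2}, \dots, x_{s})\, dx_{2} \dots dx_{s},$$

and similarly the marginal distribution function is $F_{1}(x_{1}) = \lim_{x_{2}, \dots, x_{s} \to \infty} F(x_{1}, \dots, x_{s})$.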
The mathematical expectation of any function $f(X_{1}, \dots, X_{s})$ of $X_{1}, \dots, X_{s}$ is defined by the integral of this function with respect to the multivariate distribution; in particular, in the absolutely-continuous case it is defined by the integral

$$\mathsf{E} f(X_{1}, \dots, X_{s}) = \int\limits_{\mathbf{R}^{s}} f(x_{1}, \dots, x_{s})\, p(x_{1}, \dots, x_{s})\, dx_{1} \dots dx_{s}.$$
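In particular, taking $f(x_{1}, \dots, x_{s}) = x_{i}$ gives the expectation $\mathsf{E} X_{i}$, and taking $f(x_{1}, \dots, x_{s}) = x_{i} x_{j}$ gives the second-order mixed moment $\mathsf{E} X_{i} X_{j}$; these are instances of the characteristics described below.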
The characteristic function of a multivariate distribution is the function of $t = (t_{1}, \dots, t_{s})$ given by

$$\phi(t) = \mathsf{E} e^{i t X^{\prime}},$$

where $t X^{\prime} = t_{1} X_{1} + \dots + t_{s} X_{s}$. The fundamental characteristics of a multivariate distribution are the moments (cf. Moment): the mixed moments $\mathsf{E} X_{1}^{k_{1}} \dots X_{s}^{k_{s}}$ and the central mixed moments $\mathsf{E}(X_{1} - \mathsf{E} X_{1})^{k_{1}} \dots (X_{s} - \mathsf{E} X_{s})^{k_{s}}$, where $k_{1} + \dots + k_{s}$ is the order of the corresponding moment. The roles of the expectation and the variance for a multivariate distribution are played by the vector $\mathsf{E} X = (\mathsf{E} X_{1}, \dots, \mathsf{E} X_{s})$ and the set of second-order central mixed moments, which form the covariance matrix. If $\mathsf{E}(X_{i} - \mathsf{E} X_{i})(X_{j} - \mathsf{E} X_{j}) = 0$ for all $i \neq j$, then $X_{1}, \dots, X_{s}$ are called pairwise uncorrelated or orthogonal (the covariance matrix is diagonal). If the rank $r$ of the covariance matrix is less than $s$, then the multivariate distribution is called a degenerate distribution; in this case the distribution is concentrated on some linear manifold in $\mathbf{R}^{s}$ of dimension $r < s$.
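When the moment of the corresponding order exists, it can be recovered from the characteristic function by differentiation at $t = 0$:

$$\mathsf{E} X_{1}^{k_{1}} \dots X_{s}^{k_{s}} = i^{-(k_{1} + \dots + k_{s})} \left. \frac{\partial^{\, k_{1} + \dots + k_{s}} \phi(t)}{\partial t_{1}^{k_{1}} \dots \partial t_{s}^{k_{s}}} \right|_{t = 0}.$$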
For methods of investigating dependencies between $X_{1}, \dots, X_{s}$ see Correlation; Regression.