Covariance matrix
The matrix formed from the pairwise covariances of several random variables; more precisely, for the $ k $-dimensional vector $ X = ( X _ {1} \dots X _ {k} ) $ the covariance matrix is the square matrix $ \Sigma = {\mathsf E} [ ( X - {\mathsf E} X ) ( X - {\mathsf E} X ) ^ {T} ] $, where $ {\mathsf E} X = ( {\mathsf E} X _ {1} \dots {\mathsf E} X _ {k} ) $ is the vector of mean values. The components of the covariance matrix are:
$$ \sigma _ {ij} = {\mathsf E} [ ( X _ {i} - {\mathsf E} X _ {i} ) ( X _ {j} - {\mathsf E} X _ {j} ) ] = \operatorname{cov} ( X _ {i} , X _ {j} ) , \qquad i , j = 1 \dots k , $$
and for $ i = j $ they are the same as $ {\mathsf D} X _ {i} $ ($ = \operatorname{var} ( X _ {i} ) $) (that is, the variances of the $ X _ {i} $ lie on the principal diagonal). The covariance matrix is a symmetric positive semi-definite matrix. If the covariance matrix is positive definite, then the distribution of $ X $ is non-degenerate; otherwise it is degenerate. For the random vector $ X $ the covariance matrix plays the same role as the variance of a random variable. If the variances of the random variables $ X _ {1} \dots X _ {k} $ are all equal to 1, then the covariance matrix of $ X = ( X _ {1} \dots X _ {k} ) $ is the same as the correlation matrix.
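The defining expectation and the properties above can be checked numerically. The following is a minimal sketch (assuming NumPy; the particular $ 2 \times 2 $ matrix and the sample size are illustrative and not taken from the article): it approximates $ \Sigma = {\mathsf E} [ ( X - {\mathsf E} X ) ( X - {\mathsf E} X ) ^ {T} ] $ by averaging outer products of centred samples, then verifies symmetry, positive semi-definiteness, and the reduction to the correlation matrix when the variances are rescaled to 1.

```python
# Illustrative sketch only: numbers and seed are arbitrary, not from the article.
import numpy as np

rng = np.random.default_rng(0)

# Draw many samples of a 2-dimensional vector X with a known covariance structure.
true_sigma = np.array([[2.0, 0.6],
                       [0.6, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=true_sigma, size=100_000)

# Sigma = E[(X - EX)(X - EX)^T], approximated by averaging outer products of centred samples.
centered = X - X.mean(axis=0)
sigma = (centered[:, :, None] * centered[:, None, :]).mean(axis=0)

print(np.allclose(sigma, sigma.T))                  # symmetric
print(np.all(np.linalg.eigvalsh(sigma) >= -1e-12))  # positive semi-definite

# With all variances rescaled to 1, the covariance matrix becomes the correlation matrix.
d = np.sqrt(np.diag(sigma))
print(sigma / np.outer(d, d))  # approximately the correlation matrix of X
```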
The sample covariance matrix for the sample $ X ^ {(1)} \dots X ^ {(n)} $, where the $ X ^ {(m)} $, $ m = 1 \dots n $, are independent and identically distributed random $ k $-dimensional vectors, consists of the variance and covariance estimators:
$$ S = \frac{1}{n-1} \sum_{m=1}^{n} ( X ^ {(m)} - \overline{X} ) ( X ^ {(m)} - \overline{X} ) ^ {T} , $$
where the vector $ \overline{X} $ is the arithmetic mean of the $ X ^ {(1)} \dots X ^ {(n)} $. If the $ X ^ {(1)} \dots X ^ {(n)} $ are multivariate normally distributed with covariance matrix $ \Sigma $, then $ S ( n - 1 ) / n $ is the maximum-likelihood estimator of $ \Sigma $; in this case the joint distribution of the elements of the matrix $ ( n - 1 ) S $ is called the Wishart distribution; it is one of the fundamental distributions in multivariate statistical analysis, by means of which hypotheses concerning the covariance matrix $ \Sigma $ can be tested.
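A minimal sketch of the estimator $ S $ above (again assuming NumPy; the synthetic data are purely illustrative): it forms $ S $ with the $ 1/(n-1) $ normalisation, checks agreement with NumPy's np.cov, and rescales by $ ( n - 1 ) / n $ to obtain the maximum-likelihood estimator under multivariate normality.

```python
# Illustrative sketch only: the data are synthetic, not from the article.
import numpy as np

rng = np.random.default_rng(1)
n, k = 500, 3
# Rows are the sample vectors X^(1), ..., X^(n).
X = rng.multivariate_normal(mean=np.zeros(k), cov=np.eye(k), size=n)

x_bar = X.mean(axis=0)               # arithmetic mean vector Xbar
centered = X - x_bar
S = centered.T @ centered / (n - 1)  # S = (1/(n-1)) * sum_m (X^(m)-Xbar)(X^(m)-Xbar)^T

print(np.allclose(S, np.cov(X, rowvar=False)))  # matches NumPy's unbiased estimator

# Under multivariate normality, the maximum-likelihood estimator of Sigma is S*(n-1)/n.
sigma_mle = S * (n - 1) / n
print(sigma_mle)
```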
Covariance matrix. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Covariance_matrix&oldid=46540