
Covariance matrix

From Encyclopedia of Mathematics


The matrix formed from the pairwise covariances of several random variables; more precisely, for the $k$-dimensional vector $X = (X_1, \dots, X_k)$ the covariance matrix is the square matrix $\Sigma = {\mathsf E}[(X - {\mathsf E} X)(X - {\mathsf E} X)^T]$, where ${\mathsf E} X = ({\mathsf E} X_1, \dots, {\mathsf E} X_k)$ is the vector of mean values. The components of the covariance matrix are:

$$ \sigma_{ij} = {\mathsf E}[(X_i - {\mathsf E} X_i)(X_j - {\mathsf E} X_j)] = \mathop{\rm cov}(X_i, X_j), $$

$$ i, j = 1, \dots, k, $$

and for $i = j$ they coincide with ${\mathsf D} X_i$ ($= \mathop{\rm var}(X_i)$), that is, the variances of the $X_i$ lie on the principal diagonal. The covariance matrix is symmetric and positive semi-definite. If the covariance matrix is positive definite, then the distribution of $X$ is non-degenerate; otherwise it is degenerate. For the random vector $X$ the covariance matrix plays the same role as the variance of a single random variable. If the variances of the random variables $X_1, \dots, X_k$ are all equal to 1, then the covariance matrix of $X = (X_1, \dots, X_k)$ coincides with the correlation matrix.
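The definition and the two stated properties (symmetry and positive semi-definiteness) can be illustrated numerically. The sketch below is not part of the original article: it draws correlated Gaussian vectors through an arbitrary mixing matrix (chosen only for illustration) and estimates $\Sigma = {\mathsf E}[(X - {\mathsf E}X)(X - {\mathsf E}X)^T]$ from the draws.

```python
import numpy as np

# Arbitrary 3x3 mixing matrix, chosen only to induce correlations.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.5, 0.5, 0.5]])

rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 3)) @ A.T   # draws of a 3-dim random vector

mean = X.mean(axis=0)                        # vector of mean values E X
centered = X - mean
Sigma = centered.T @ centered / len(X)       # estimate of E[(X-EX)(X-EX)^T]

assert np.allclose(Sigma, Sigma.T)           # symmetric
assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-10)  # positive semi-definite
```

A positive definite $\Sigma$ here (all eigenvalues strictly positive) corresponds to a non-degenerate distribution of $X$; a singular mixing matrix would produce a degenerate one.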

The sample covariance matrix for the sample $X^{(1)}, \dots, X^{(n)}$, where the $X^{(m)}$, $m = 1, \dots, n$, are independent and identically-distributed random $k$-dimensional vectors, consists of the variance and covariance estimators:

$$ S = \frac{1}{n-1} \sum_{m=1}^{n} (X^{(m)} - \overline{X})(X^{(m)} - \overline{X})^T, $$

where the vector $\overline{X}$ is the arithmetic mean of $X^{(1)}, \dots, X^{(n)}$. If the $X^{(1)}, \dots, X^{(n)}$ are multivariate normally distributed with covariance matrix $\Sigma$, then $S(n-1)/n$ is the maximum-likelihood estimator of $\Sigma$. In this case the joint distribution of the elements of the matrix $(n-1)S$ is called the Wishart distribution; it is one of the fundamental distributions of multivariate statistical analysis, by means of which hypotheses concerning the covariance matrix $\Sigma$ can be tested.
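A minimal numerical sketch of the two normalizations (again not part of the original article), assuming NumPy is available: $S$ with the $1/(n-1)$ factor is the unbiased estimator, and $S(n-1)/n$ is the maximum-likelihood estimator under normality. `np.cov` implements both, via its `bias` parameter.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 500, 2
X = rng.standard_normal((n, k))              # sample of n iid k-dim vectors

Xbar = X.mean(axis=0)                        # arithmetic mean of the sample
S = (X - Xbar).T @ (X - Xbar) / (n - 1)      # unbiased sample covariance S

# np.cov with rowvar=False uses the same 1/(n-1) normalization ...
assert np.allclose(S, np.cov(X, rowvar=False))

# ... while bias=True gives the 1/n normalization, i.e. S*(n-1)/n,
# the maximum-likelihood estimator under multivariate normality.
Sigma_ml = S * (n - 1) / n
assert np.allclose(Sigma_ml, np.cov(X, rowvar=False, bias=True))
```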

How to Cite This Entry:
Covariance matrix. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Covariance_matrix&oldid=55243
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article