$ \DeclareMathOperator{\cov}{cov} $
$ \DeclareMathOperator{\var}{var} $
$ \DeclareMathOperator{\E}{\mathbf{E}} $
A numerical characteristic of the joint distribution of two random variables, equal to the mathematical expectation of the product of the deviations of these two random variables from their mathematical expectations. The covariance is defined for random variables $X_1$ and $X_2$ with finite variance and is usually denoted by $\cov(X_1, X_2)$. Thus,
\[
\cov(X_1, X_2) = \E[(X_1 - \E X_1)(X_2 - \E X_2)],
\]
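or, equivalently (expanding the product and using the linearity of the mathematical expectation),
\[
\cov(X_1, X_2) = \E(X_1 X_2) - \E X_1 \cdot \E X_2,
\]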
so that $\cov(X_1, X_2) = \cov(X_2, X_1)$; $\cov(X, X) = DX = \var(X) $. The covariance naturally occurs in the expression for the variance of the sum of two random variables:
\[
D(X_1 + X_2) = DX_1 + DX_2 + 2 \cov(X_1, X_2).
\]
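Indeed, writing $D(X_1 + X_2) = \E[(X_1 + X_2 - \E X_1 - \E X_2)^2]$ and expanding the square gives
\[
\E[(X_1 - \E X_1)^2] + \E[(X_2 - \E X_2)^2] + 2\,\E[(X_1 - \E X_1)(X_2 - \E X_2)],
\]
which is precisely $DX_1 + DX_2 + 2 \cov(X_1, X_2)$.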
If $X_1$ and $X_2$ are independent random variables, then $\cov(X_1, X_2)=0$. The covariance gives a characterization of the dependence of the random variables; the [[Correlation coefficient|correlation coefficient]] is defined by means of the covariance. In order to statistically estimate the covariance one uses the sample covariance, computed from the formula
\[
\frac{1}{n - 1}\sum_{i = 1}^n (X_1^{(i)} - \overline{X}_1)(X_2^{(i)} - \overline{X}_2),
\]
where the pairs $(X_1^{(i)}, X_2^{(i)})$, $i = 1, \dots, n$, are independent and distributed as $(X_1, X_2)$, and $\overline{X}_1$ and $\overline{X}_2$ are the arithmetic means of the $X_1^{(i)}$ and the $X_2^{(i)}$, respectively.
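As a minimal numerical sketch of this estimator (the data and variable names below are purely illustrative), the formula can be evaluated directly and checked against a library routine that uses the same $1/(n-1)$ normalization:

```python
# Illustrative sketch: sample covariance of two paired samples.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0])  # observations of X_1 (made-up data)
x2 = np.array([2.0, 1.0, 4.0, 3.0])  # paired observations of X_2 (made-up data)

n = len(x1)
# Subtract the arithmetic means and divide the sum of products of deviations by n - 1.
sample_cov = np.sum((x1 - x1.mean()) * (x2 - x2.mean())) / (n - 1)

# numpy's estimator uses the same 1/(n - 1) normalization, so the values agree.
assert np.isclose(sample_cov, np.cov(x1, x2)[0, 1])
print(sample_cov)  # 1.0 for these illustrative values
```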
====Comments====
In the Western literature one always uses $V(X)$ or $\var(X)$ for the variance, instead of $D(X)$.