Hotelling test

From Encyclopedia of Mathematics
Revision as of 22:11, 5 June 2020


$T^2$-test

A test intended for testing the hypothesis $H_0$ according to which the true value of the unknown vector $\mu = (\mu_1, \dots, \mu_p)$ of mathematical expectations of a non-degenerate $p$-dimensional normal law $N(\mu, B)$, whose covariance matrix $B$ is also unknown, is the vector $\mu_0 = (\mu_{10}, \dots, \mu_{p0})$. Hotelling's test is based on the following result. Let $X_1, \dots, X_n$ be independent $p$-dimensional random vectors, $n - 1 \geq p$, subject to the non-degenerate normal law $N(\mu, B)$, and let

$$ T^2 = n (\overline{X} - \mu_0)^{T} S^{-1} (\overline{X} - \mu_0), $$

where

$$ \overline{X} = \frac{1}{n} \sum_{i=1}^{n} X_i $$

and

$$ S = \frac{1}{n - 1} \sum_{i=1}^{n} (X_i - \overline{X})(X_i - \overline{X})^{T} $$

are estimators for the unknown parameters $\mu$ and $B$ ($\overline{X}$ is the maximum-likelihood estimator of $\mu$, while $S$ is the unbiased sample covariance matrix; the maximum-likelihood estimator of $B$ differs from $S$ only by the factor $(n-1)/n$). Then the statistic

$$ F = \frac{n - p}{p(n - 1)} T^2 $$

has the non-central Fisher $F$-distribution with $p$ and $n - p$ degrees of freedom and non-centrality parameter

$$ n (\mu - \mu_0)^{T} B^{-1} (\mu - \mu_0); $$

the statistic $T^2$ has the Hotelling $T^2$-distribution. Consequently, to test the hypothesis $H_0$: $\mu = \mu_0$ against the alternative $H_1$: $\mu \neq \mu_0$, one can compute the value of the statistic $F$ based on realizations of the independent random vectors $X_1, \dots, X_n$ from the non-degenerate $p$-dimensional normal law $N(\mu, B)$; under the hypothesis $H_0$ this statistic has the central $F$-distribution with $p$ and $n - p$ degrees of freedom. Using Hotelling's test with significance level $\alpha$, $H_0$ must be rejected if $F \geq F_\alpha(p, n - p)$, where $F_\alpha(p, n - p)$ is the upper $\alpha$-quantile of the $F$-distribution. The connection between Hotelling's test and the generalized likelihood-ratio test should be mentioned. Let
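The procedure above can be sketched numerically. The following is a minimal illustration using NumPy and SciPy; the data, sample size and significance level are illustrative assumptions, not part of the article.

```python
# A minimal sketch of Hotelling's T^2 test as described above, using
# NumPy and SciPy; the data and alpha = 0.05 are illustrative choices.
import numpy as np
from scipy.stats import f

def hotelling_test(X, mu0, alpha=0.05):
    """Return (T2, F, F_crit, reject) for testing H0: mu = mu0."""
    n, p = X.shape
    assert n - 1 >= p, "the result requires n - 1 >= p"
    X_bar = X.mean(axis=0)
    # Sample covariance matrix S with divisor n - 1, as in the article
    S = (X - X_bar).T @ (X - X_bar) / (n - 1)
    d = X_bar - mu0
    T2 = n * d @ np.linalg.solve(S, d)       # T^2 = n (X_bar - mu0)^T S^{-1} (X_bar - mu0)
    F_stat = (n - p) / (p * (n - 1)) * T2    # F = (n-p)/(p(n-1)) T^2
    F_crit = f.ppf(1 - alpha, p, n - p)      # upper alpha-quantile F_alpha(p, n-p)
    return T2, F_stat, F_crit, F_stat >= F_crit

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0.0, 0.0], np.eye(2), size=30)
T2, F_stat, F_crit, reject = hotelling_test(X, np.array([0.0, 0.0]))
```

For data actually drawn from $N(\mu_0, B)$ the test rejects with probability about $\alpha$; moving `mu0` far from the sample mean drives `F_stat` past the critical value.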

$$ L(\mu, B) = L(X_1, \dots, X_n; \mu, B) = \frac{| B^{-1} |^{n/2}}{(2\pi)^{np/2}} \exp \left\{ -\frac{1}{2} \sum_{i=1}^{n} (X_i - \mu)^{T} B^{-1} (X_i - \mu) \right\} $$

be the likelihood function computed from the sample $X_1, \dots, X_n$. The generalized likelihood-ratio test for testing the simple hypothesis $H_0$: $\mu = \mu_0$ against the compound alternative $H_1$: $\mu \neq \mu_0$ is constructed from the statistic

$$ \lambda = \lambda(X_1, \dots, X_n) = \frac{\sup_{B} L(\mu_0, B)}{\sup_{\mu, B} L(\mu, B)} . $$

The statistic $\lambda$ and the statistics $T^2$ and $F$ are related by:

$$ \lambda^{2/n} = \frac{n - 1}{T^2 + n - 1} = \frac{n - p}{pF + n - p} . $$
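This identity can be checked numerically. In the sketch below (on illustrative random data, an assumption of this example), $\lambda^{2/n}$ is computed directly as a ratio of determinants: maximizing $L$ over $B$ gives $\hat B_0 = \frac{1}{n} \sum (X_i - \mu_0)(X_i - \mu_0)^T$ under $H_0$ and $\hat B = \frac{1}{n} \sum (X_i - \overline{X})(X_i - \overline{X})^T$ unrestricted, so that $\lambda^{2/n} = \det(\hat B) / \det(\hat B_0)$.

```python
# Numerical check of lambda^{2/n} = (n-1)/(T^2 + n-1) = (n-p)/(pF + n-p)
# on illustrative random data.
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 3
X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)
mu0 = np.zeros(p)

X_bar = X.mean(axis=0)
d = X_bar - mu0
A = (X - X_bar).T @ (X - X_bar)    # sum (X_i - X_bar)(X_i - X_bar)^T = (n-1) S
A0 = A + n * np.outer(d, d)        # sum (X_i - mu0)(X_i - mu0)^T
T2 = n * d @ np.linalg.solve(A / (n - 1), d)
F_stat = (n - p) / (p * (n - 1)) * T2

# lambda^{2/n} = det(A/n) / det(A0/n) = det(A) / det(A0)
lam_2n = np.linalg.det(A) / np.linalg.det(A0)
assert np.isclose(lam_2n, (n - 1) / (T2 + n - 1))
assert np.isclose(lam_2n, (n - p) / (p * F_stat + n - p))
```

The agreement follows from the matrix determinant lemma: $\det(A_0) = \det(A)\,(1 + n\, d^T A^{-1} d)$, and $n\, d^T A^{-1} d = T^2/(n-1)$.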

For testing the hypothesis $H_0$: $\mu = \mu_0$, Hotelling's test is uniformly most powerful among all tests that are invariant under similarity transformations (see Most-powerful test; Invariant test).

References

[1] T.W. Anderson, "An introduction to multivariate statistical analysis", Wiley (1984)
[2] C.R. Rao, "Linear statistical inference and its applications", Wiley (1973)
How to Cite This Entry:
Hotelling test. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Hotelling_test&oldid=19014
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article