Linear hypothesis
A statistical hypothesis according to which the mean $ a $ of an $ n $-dimensional normal law $ N _ {n} ( a , \sigma ^ {2} I ) $ (where $ I $ is the unit matrix), lying in a linear subspace $ \Pi ^ {s} \subset \mathbf R ^ {n} $ of dimension $ s < n $, belongs to a linear subspace $ \Pi ^ {r} \subset \Pi ^ {s} $ of dimension $ r < s $.
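
For example, let $ n = 3 $, let $ \Pi ^ {2} $ be the plane of vectors of the form $ ( a _ {1} , a _ {2} , 0 ) $ and let $ \Pi ^ {1} \subset \Pi ^ {2} $ be the line of vectors $ ( a _ {1} , 0 , 0 ) $. The mean is then assumed to satisfy $ a _ {3} = 0 $, and the linear hypothesis asserts in addition that

$$ H _ {0} : \ a \in \Pi ^ {1} ,\ \ \textrm{ that is } ,\ a _ {2} = 0 . $$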

Many problems of mathematical statistics can be reduced to the problem of testing a linear hypothesis, which is often stated in the following so-called canonical form. Let $ X = ( X _ {1} , \dots, X _ {n} ) $ be a normally distributed vector with independent components and let $ {\mathsf E} X _ {i} = a _ {i} $ for $ i = 1 , \dots, s $, $ {\mathsf E} X _ {i} = 0 $ for $ i = s + 1 , \dots, n $ and $ {\mathsf D} X _ {i} = \sigma ^ {2} $ for $ i = 1 , \dots, n $, where the quantities $ a _ {1} , \dots, a _ {s} $ are unknown. Then the hypothesis $ H _ {0} $, according to which

$$ a _ {1} = \dots = a _ {r} = 0 ,\ \ r < s < n , $$

is the canonical linear hypothesis.
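
The canonical linear hypothesis is commonly tested by means of the F-test (cf. [1]): under $ H _ {0} $ the statistic

$$ F = \frac{( X _ {1} ^ {2} + \dots + X _ {r} ^ {2} ) / r }{( X _ {s+1} ^ {2} + \dots + X _ {n} ^ {2} ) / ( n - s ) } $$

has the F-distribution with $ ( r , n - s ) $ degrees of freedom. A minimal Python sketch of this computation (the helper name canonical_f_test and the simulated data are only illustrative):

```python
import numpy as np
from scipy import stats


def canonical_f_test(x, r, s):
    """F-test of the canonical linear hypothesis H_0: a_1 = ... = a_r = 0,
    where E X_i = a_i for i <= s, E X_i = 0 for i > s, and the X_i are
    independent with common unknown variance sigma^2."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if not (0 < r < s < n):
        raise ValueError("need 0 < r < s < n")
    # Under H_0 the first r coordinates have mean 0, so their mean square
    # estimates sigma^2; the last n - s coordinates estimate sigma^2 in any case.
    f_stat = (np.sum(x[:r] ** 2) / r) / (np.sum(x[s:] ** 2) / (n - s))
    p_value = stats.f.sf(f_stat, r, n - s)  # right tail of F(r, n - s)
    return f_stat, p_value


# Simulated data for which H_0 holds: a_1 = ... = a_r = 0.
rng = np.random.default_rng(0)
n, r, s, sigma = 20, 3, 5, 2.0
a = np.concatenate([np.zeros(r), rng.normal(size=s - r), np.zeros(n - s)])
print(canonical_f_test(rng.normal(loc=a, scale=sigma), r, s))
```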

Example. Let $ Y _ {1} , \dots, Y _ {n} $ and $ Z _ {1} , \dots, Z _ {m} $ be $ n + m $ independent random variables, subject to the normal distributions $ N _ {1} ( a , \sigma ^ {2} ) $ and $ N _ {1} ( b , \sigma ^ {2} ) $, respectively, where the parameters $ a $, $ b $, $ \sigma ^ {2} $ are unknown. Then the hypothesis $ H _ {0} $: $ a = b = 0 $ is a linear hypothesis (the mean vector $ ( a , \dots, a , b , \dots, b ) $ of the combined sample ranges over a two-dimensional linear subspace of $ \mathbf R ^ {n+m} $, and $ H _ {0} $ confines it to the zero subspace), while a hypothesis $ a = a _ {0} $, $ b = b _ {0} $ with $ a _ {0} \neq b _ {0} $ is not linear, since it singles out a point which is not a linear subspace.
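
In this example $ H _ {0} $: $ a = b = 0 $ can be tested by comparing $ ( n \overline{Y} {} ^ {2} + m \overline{Z} {} ^ {2} ) / 2 $, where $ \overline{Y} $ and $ \overline{Z} $ are the sample means, with the pooled estimate of $ \sigma ^ {2} $; under $ H _ {0} $ this ratio has the F-distribution with $ ( 2 , n + m - 2 ) $ degrees of freedom. A minimal Python sketch under these assumptions (the helper name test_zero_means and the simulated samples are only illustrative):

```python
import numpy as np
from scipy import stats


def test_zero_means(y, z):
    """F-test of H_0: a = b = 0 for two independent normal samples with
    means a and b and a common unknown variance sigma^2."""
    y, z = np.asarray(y, dtype=float), np.asarray(z, dtype=float)
    n, m = y.size, z.size
    # Sum of squares attributed to the hypothesis (2 degrees of freedom)
    # and the residual sum of squares (n + m - 2 degrees of freedom).
    hypothesis_ss = n * y.mean() ** 2 + m * z.mean() ** 2
    residual_ss = np.sum((y - y.mean()) ** 2) + np.sum((z - z.mean()) ** 2)
    f_stat = (hypothesis_ss / 2) / (residual_ss / (n + m - 2))
    return f_stat, stats.f.sf(f_stat, 2, n + m - 2)


rng = np.random.default_rng(1)
y = rng.normal(loc=0.0, scale=1.5, size=12)  # a = 0, so H_0 holds
z = rng.normal(loc=0.0, scale=1.5, size=15)  # b = 0
print(test_zero_means(y, z))
```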

====References====

[1] E.L. Lehmann, "Testing statistical hypotheses", Wiley (1986)

====Comments====

However, such a hypothesis $ a = a _ {0} $, $ b = b _ {0} $ with $ a _ {0} \neq b _ {0} $ does correspond to a linear hypothesis concerning the means of the transformed quantities $ Y _ {i} ^ \prime = Y _ {i} - a _ {0} $, $ Z _ {i} ^ \prime = Z _ {i} - b _ {0} $.
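
Indeed, $ {\mathsf E} Y _ {i} ^ \prime = a - a _ {0} $ and $ {\mathsf E} Z _ {i} ^ \prime = b - b _ {0} $, so that the hypothesis $ a = a _ {0} $, $ b = b _ {0} $ takes the form

$$ a ^ \prime = b ^ \prime = 0 ,\ \ a ^ \prime = a - a _ {0} ,\ b ^ \prime = b - b _ {0} , $$

which is the linear hypothesis of the example above, now for the transformed sample.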

This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.