Neyman structure
Revision as of 08:02, 6 June 2020
A structure determined by a statistic that is independent of a sufficient statistic. The concept was introduced by J. Neyman (see [1]) in connection with the problem of constructing similar tests (cf. Similar test) in the theory of statistical hypothesis testing; the term "Neyman structure" is also applied to a statistical test whose critical function has Neyman structure. Suppose that in the realization of a random variable $ X $ taking values in a sample space $ ( \mathfrak X , \mathfrak B , P _ \theta ) $, $ \theta \in \Theta $, it is required to test a composite hypothesis $ H _ {0} $: $ \theta \in \Theta _ {0} \subset \Theta $, and that for the family $ \{ {P _ \theta } : {\theta \in \Theta _ {0} } \} $ there exists a sufficient statistic $ T $ whose distribution belongs to the family $ \{ {P _ \theta ^ {T} } : {\theta \in \Theta _ {0} } \} $. A statistical test of level $ \alpha $ for testing $ H _ {0} $ has Neyman structure if its critical function $ \phi $ satisfies the condition
$$ \tag{1 } {\mathsf E} \{ \phi ( X) \mid T = t \} = \alpha $$
almost everywhere with respect to the measure $ P _ \theta ^ {T} $, $ \theta \in \Theta _ {0} $. Evidently, if a statistical test has Neyman structure, then it is similar (cf. Similar test) relative to the family $ \{ {P _ \theta } : {\theta \in \Theta _ {0} } \} $, since
$$ {\mathsf E} _ \theta \{ \phi ( X) \} = \ {\mathsf E} _ \theta \{ {\mathsf E} \{ \phi ( X) \mid T = t \} \} = \alpha $$
for all $ \theta \in \Theta _ {0} $.
The validity of (1) essentially reduces the problem of testing the composite hypothesis $ H _ {0} $ to that of testing $ H _ {0} $ as a simple hypothesis for every fixed value $ t $ of the sufficient statistic $ T $.
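To make condition (1) concrete, the following sketch (an illustration in Python, not part of the article; the helper name `conditional_phi` is hypothetical) constructs, for each fixed value $ t $ of the sufficient statistic, a randomized critical function for a Binomial$( t , 1/2 )$ conditional null whose conditional rejection probability is exactly $ \alpha $ — which is precisely what (1) requires in the Poisson example that follows.

```python
from math import comb

def conditional_phi(t, alpha):
    """Randomized critical function phi(k) for X1 given T = t, under a
    Binomial(t, 1/2) conditional null: reject the least likely outcomes
    outright and randomize at the boundary outcome, so that
    E[phi(X1) | T = t] = alpha exactly (condition (1))."""
    pmf = [comb(t, k) * 0.5 ** t for k in range(t + 1)]
    phi = [0.0] * (t + 1)
    acc = 0.0
    for k in sorted(range(t + 1), key=lambda k: pmf[k]):
        if acc + pmf[k] <= alpha:
            phi[k] = 1.0                      # reject outright
            acc += pmf[k]
        else:
            phi[k] = (alpha - acc) / pmf[k]   # randomize at the boundary
            break
    return phi

# the conditional level equals alpha for every fixed t
for t in range(13):
    pmf = [comb(t, k) * 0.5 ** t for k in range(t + 1)]
    level = sum(p * q for p, q in zip(pmf, conditional_phi(t, 0.05)))
    assert abs(level - 0.05) < 1e-12
```

Because the conditional level is exactly $ \alpha $ for every $ t $, iterated expectation gives unconditional level $ \alpha $ for every common parameter value, i.e. the test is similar.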
Example. Suppose that two independent random variables $ X _ {1} $ and $ X _ {2} $ have Poisson distributions with unknown parameters $ \lambda _ {1} $ and $ \lambda _ {2} $ (cf. Poisson distribution), and that the hypothesis $ H _ {0} $: $ \lambda _ {1} = \lambda _ {2} $ is to be tested against the alternative $ H _ {1} $: $ \lambda _ {1} \neq \lambda _ {2} $. By the independence of $ X _ {1} $ and $ X _ {2} $, the statistic $ T = X _ {1} + X _ {2} $ has a Poisson distribution with parameter $ \lambda _ {1} + \lambda _ {2} $, and the conditional distributions of $ X _ {1} $ and $ X _ {2} $ given $ T = t $ are binomial with parameters $ t $, $ \lambda _ {1} / ( \lambda _ {1} + \lambda _ {2} ) $ and $ t $, $ \lambda _ {2} / ( \lambda _ {1} + \lambda _ {2} ) $, respectively; that is,
$$ \tag{2 } {\mathsf P} \{ X _ {i} = k \mid T = t \} = \ \left ( \begin{array}{c} t \\ k \end{array} \right ) \left ( \frac{\lambda _ {i} }{\lambda _ {1} + \lambda _ {2} } \right ) ^ {k} \left ( 1 - \frac{\lambda _ {i} }{\lambda _ {1} + \lambda _ {2} } \right ) ^ {t - k} ,\ \ k = 0 , \dots, t . $$
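As an informal numerical check of (2) (not part of the article; the parameter values $ \lambda _ {1} = 3 $, $ \lambda _ {2} = 5 $, $ t = 8 $ and the use of NumPy are assumptions of this sketch), one can simulate many Poisson pairs and compare the empirical conditional distribution of $ X _ {1} $ given $ T = t $ with the binomial law:

```python
import numpy as np
from math import comb

# Illustrative parameter choices (assumptions of this sketch).
rng = np.random.default_rng(0)
lam1, lam2 = 3.0, 5.0
n = 200_000

x1 = rng.poisson(lam1, n)
x2 = rng.poisson(lam2, n)

t = 8
sel = x1[x1 + x2 == t]                      # draws of X1 conditional on T = t
emp = np.bincount(sel, minlength=t + 1) / sel.size

p = lam1 / (lam1 + lam2)                    # binomial success probability in (2)
theo = np.array([comb(t, k) * p**k * (1 - p) ** (t - k) for k in range(t + 1)])

# the empirical conditional pmf should match the binomial pmf closely
assert np.abs(emp - theo).max() < 0.015
```

Note that the check depends only on the ratio $ \lambda _ {1} / ( \lambda _ {1} + \lambda _ {2} ) $, as (2) predicts.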
When $ H _ {0} $ holds, $ T $ is sufficient for the unknown common value $ \lambda = \lambda _ {1} = \lambda _ {2} $, and it follows from (2) that the conditional distribution of $ X _ {1} $ for a fixed value of the sufficient statistic $ T = t $ is binomial with parameters $ t $ and $ 1 / 2 $; that is, under $ H _ {0} $,
$$ {\mathsf P} \{ X _ {1} = k \mid T = t \} = \ \left ( \begin{array}{c} t \\ k \end{array} \right ) \left ( \frac{1}{2} \right ) ^ {t} ,\ \ k = 0 , \dots, t . $$
Thus, in this case the problem of testing the composite hypothesis $ H _ {0} $ reduces to that of testing the simple hypothesis $ H _ {0} ^ {t} $, according to which the conditional distribution of $ X _ {1} $ (for a fixed sum $ X _ {1} + X _ {2} = t $) is binomial with parameters $ t $ and $ 1 / 2 $. For testing $ H _ {0} ^ {t} $ one can use, for example, the sign test.
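A minimal sketch of this conditional test in Python (illustrative only; the function name `conditional_test` and the two-sided p-value convention of summing the probabilities of all outcomes no more likely than the observed one are choices of this sketch, not prescribed by the article):

```python
from math import comb

def conditional_test(x1, x2, alpha=0.05):
    """Exact conditional test of H0: lambda1 == lambda2 for a Poisson pair.
    Given T = x1 + x2 = t, X1 is Binomial(t, 1/2) under H0, so we compute
    a two-sided binomial p-value (summing the probabilities of all outcomes
    no more likely than the observed x1) and compare it with alpha."""
    t = x1 + x2
    pmf = [comb(t, k) * 0.5 ** t for k in range(t + 1)]
    p_value = sum(p for p in pmf if p <= pmf[x1] + 1e-12)
    return p_value, p_value <= alpha

# a lopsided split is strong evidence against lambda1 == lambda2
p_value, reject = conditional_test(2, 14)
```

With $ x _ {1} = 2 $, $ x _ {2} = 14 $ the conditional p-value is $ 274 / 65536 \approx 0.004 $, so $ H _ {0} $ is rejected at level $ 0.05 $; a balanced split such as $ x _ {1} = x _ {2} = 8 $ gives p-value $ 1 $.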
The concept of a Neyman structure is of great significance in the problem of testing composite statistical hypotheses, since among the tests having Neyman structure there frequently is a most-powerful test. E. Lehmann and H. Scheffé have shown that a statistical test for testing a composite hypothesis $ H _ {0} $: $ \theta \in \Theta _ {0} $ has Neyman structure relative to a sufficient statistic $ T $ if and only if the family $ \{ {P _ \theta ^ {T} } : {\theta \in \Theta _ {0} } \} $ induced by $ T $ is boundedly complete. On the basis of the concept of a Neyman structure general methods have been worked out for the construction of similar tests. See Distributions, complete family of; Similar test.
References
[1] J. Neyman, "Current problems of mathematical statistics", Proc. Internat. Congress Mathematicians (Amsterdam, 1954), Vol. 1, Noordhoff & North-Holland (1957), pp. 349–370
[2] E.L. Lehmann, "Testing statistical hypotheses", Wiley (1986)
[3] Yu.V. Linnik, "Statistical problems with nuisance parameters", Amer. Math. Soc. (1968) (Translated from Russian)
Neyman structure. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Neyman_structure&oldid=14940