
Neyman structure

From Encyclopedia of Mathematics
Revision as of 08:02, 6 June 2020 by Ulf Rehmann (tex encoded by computer)


A structure determined by a statistic that is independent of a sufficient statistic. The concept was introduced by J. Neyman (see [1]) in connection with the problem of constructing similar tests (cf. Similar test) in the theory of statistical hypothesis testing; the term "Neyman structure" is applied to a statistical test when its critical function has Neyman structure. Suppose that, based on a realization of a random variable $ X $ taking values in a sample space $ ( \mathfrak X , \mathfrak B , P _ \theta ) $, $ \theta \in \Theta $, one has to test a composite hypothesis $ H _ {0} $: $ \theta \in \Theta _ {0} \subset \Theta $, and that for the family $ \{ {P _ \theta } : {\theta \in \Theta _ {0} } \} $ there is a sufficient statistic $ T $ whose distribution belongs to the family $ \{ {P _ \theta ^ {T} } : {\theta \in \Theta _ {0} } \} $. A statistical test of level $ \alpha $ for testing $ H _ {0} $ has Neyman structure if its critical function $ \phi $ satisfies the condition:

$$ \tag{1 } {\mathsf E} \{ \phi ( X) \mid T = t \} = \alpha $$

almost everywhere with respect to the measure $ P _ \theta ^ {T} $, $ \theta \in \Theta _ {0} $. Evidently, if a statistical test has Neyman structure, then it is similar (cf. Similar test) relative to the family $ \{ {P _ \theta } : {\theta \in \Theta _ {0} } \} $, since

$$ {\mathsf E} _ \theta \{ \phi ( X) \} = \ {\mathsf E} _ \theta \{ {\mathsf E} \{ \phi ( X) \mid T \} \} = \alpha $$

for all $ \theta \in \Theta _ {0} $.

The validity of (1) essentially reduces the problem of testing the composite hypothesis $ H _ {0} $ to that of testing $ H _ {0} $ as a simple hypothesis for every fixed value $ t $ of the sufficient statistic $ T $.

Example. Suppose that two independent random variables $ X _ {1} $ and $ X _ {2} $ are subject to Poisson laws with unknown parameters $ \lambda _ {1} $ and $ \lambda _ {2} $ (cf. Poisson distribution) and that the hypothesis $ H _ {0} $: $ \lambda _ {1} = \lambda _ {2} $ is to be tested against the alternative $ H _ {1} $: $ \lambda _ {1} \neq \lambda _ {2} $. Thanks to the independence of $ X _ {1} $ and $ X _ {2} $, the statistic $ T = X _ {1} + X _ {2} $ is subject to the Poisson law with parameter $ \lambda _ {1} + \lambda _ {2} $, and the conditional distributions of $ X _ {1} $ and $ X _ {2} $ under the condition $ T = t $ are binomial with parameters $ t $, $ \lambda _ {1} / ( \lambda _ {1} + \lambda _ {2} ) $ and $ t $, $ \lambda _ {2} / ( \lambda _ {1} + \lambda _ {2} ) $, respectively, that is,

$$ \tag{2 } {\mathsf P} \{ X _ {i} = k \mid T = t \} = \ \left ( \begin{array}{c} t \\ k \end{array} \right ) \left ( \frac{\lambda _ {i} }{\lambda _ {1} + \lambda _ {2} } \right ) ^ {k} \left ( 1 - \frac{\lambda _ {i} }{\lambda _ {1} + \lambda _ {2} } \right ) ^ {t - k} ,\ \ k = 0 \dots t . $$
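As a numerical sanity check (not part of the original article), the conditional law (2) can be verified by simulation. The sketch below, with illustrative values $ \lambda _ {1} = 2 $, $ \lambda _ {2} = 3 $ and $ t = 5 $, compares the empirical conditional frequencies of $ X _ {1} $ given $ X _ {1} + X _ {2} = t $ with the binomial probabilities.

```python
import random
from collections import Counter
from math import comb, exp

random.seed(1)

def poisson(lam):
    """Draw from a Poisson law by Knuth's multiplication method."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= L:
            return k - 1

lam1, lam2, t = 2.0, 3.0, 5           # illustrative parameter values
counts, n_cond = Counter(), 0
for _ in range(400_000):
    x1, x2 = poisson(lam1), poisson(lam2)
    if x1 + x2 == t:                   # condition on T = t
        counts[x1] += 1
        n_cond += 1

p = lam1 / (lam1 + lam2)               # success probability in (2)
for k in range(t + 1):
    theory = comb(t, k) * p**k * (1 - p)**(t - k)
    print(f"k={k}: empirical {counts[k]/n_cond:.4f}, binomial {theory:.4f}")
```

The empirical frequencies agree with (2) to within Monte Carlo error, independently of the particular values of $ \lambda _ {1} $ and $ \lambda _ {2} $ chosen.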

When $ H _ {0} $ holds, $ T $ is sufficient for the unknown common value $ \lambda = \lambda _ {1} = \lambda _ {2} $, and it follows from (2) that the conditional distribution of $ X _ {1} $ for a fixed value of the sufficient statistic $ T = t $ is binomial with parameters $ t $ and $ 1 / 2 $, that is, under $ H _ {0} $,

$$ {\mathsf P} \{ X _ {1} = k \mid T = t \} = \ \left ( \begin{array}{c} t \\ k \end{array} \right ) \left ( { \frac{1}{2} } \right ) ^ {t} ,\ \ k = 0 \dots t . $$

Thus, in this case the problem of testing the composite hypothesis $ H _ {0} $ reduces to that of testing the simple hypothesis $ H _ {0} ^ {t} $, according to which the conditional distribution of $ X _ {1} $ (for a fixed sum $ X _ {1} + X _ {2} = t $) is binomial with parameters $ t $ and $ 1 / 2 $. For testing $ H _ {0} ^ {t} $ one can use, for example, the sign test.
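The conditional test just described can be sketched in code. The helper below (a hypothetical illustration, not taken from the article) computes an exact two-sided p-value for $ H _ {0} $: $ \lambda _ {1} = \lambda _ {2} $ from the conditional law $ X _ {1} \mid T = t \sim \mathrm{Bin} ( t , 1/2 ) $, by summing the probabilities of all outcomes no more likely than the observed one.

```python
from math import comb

def conditional_p_value(x1, x2):
    """Exact conditional test of lambda1 == lambda2 for independent
    Poisson counts x1, x2, using X1 | X1 + X2 = t ~ Bin(t, 1/2)."""
    t = x1 + x2
    if t == 0:
        return 1.0                      # no information in the sample
    pmf = [comb(t, k) / 2**t for k in range(t + 1)]
    # two-sided p-value: total mass of the outcomes that are at most
    # as likely as the observed one (small tolerance guards ties)
    p_obs = pmf[x1]
    return min(1.0, sum(p for p in pmf if p <= p_obs + 1e-12))
```

For instance, a balanced split such as $ x _ {1} = x _ {2} = 6 $ gives p-value $ 1 $, while the very uneven split $ ( 12 , 0 ) $ gives $ 2 / 4096 \approx 0.0005 $, leading to rejection of $ H _ {0} $ at the usual levels.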

The concept of a Neyman structure is of great significance in the problem of testing composite statistical hypotheses, since among the tests having Neyman structure there is frequently a most powerful test. E. Lehmann and H. Scheffé have shown that every similar test for a composite hypothesis $ H _ {0} $: $ \theta \in \Theta _ {0} $ has Neyman structure relative to a sufficient statistic $ T $ if and only if the family $ \{ {P _ \theta ^ {T} } : {\theta \in \Theta _ {0} } \} $ induced by $ T $ is boundedly complete. On the basis of the concept of a Neyman structure, general methods have been worked out for the construction of similar tests. See Distributions, complete family of; Similar test.
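In the Poisson example, similarity can also be illustrated numerically: a randomized test built to have exact conditional level $ \alpha $ given $ T = t $ (i.e. a test with Neyman structure) rejects with total probability $ \alpha $ for every common value of $ \lambda $ under $ H _ {0} $. The sketch below (all names and parameter values are illustrative) constructs such a critical function for $ \mathrm{Bin} ( t , 1/2 ) $ and estimates the rejection probability by Monte Carlo.

```python
import random
from math import comb, exp

random.seed(0)
ALPHA = 0.05

def phi(x1, t):
    """Critical function with E[phi | T = t] = ALPHA exactly: reject the
    least likely outcomes of Bin(t, 1/2) outright and randomize on the
    boundary outcome.  Ties in the pmf are broken by the sort order,
    which keeps the conditional level exact (the test is then merely
    not symmetric in k and t - k)."""
    pmf = [comb(t, k) / 2**t for k in range(t + 1)]
    acc = 0.0
    for k in sorted(range(t + 1), key=lambda j: pmf[j]):  # least likely first
        if acc + pmf[k] <= ALPHA:
            if k == x1:
                return 1.0                      # reject with probability 1
            acc += pmf[k]
        else:
            gamma = (ALPHA - acc) / pmf[k]      # boundary randomization
            return gamma if k == x1 else 0.0
    return 0.0

def poisson(lam):
    """Knuth's multiplication method."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= L:
            return k - 1

def rejection_probability(lam, n=50_000):
    """Average of phi over simulated samples with lambda1 = lambda2 = lam."""
    total = 0.0
    for _ in range(n):
        x1, x2 = poisson(lam), poisson(lam)
        total += phi(x1, x1 + x2)
    return total / n

for lam in (1.0, 4.0, 10.0):
    print(lam, round(rejection_probability(lam), 3))   # all close to ALPHA
```

Whatever the common $ \lambda $, the estimated rejection probability stays at $ \alpha $ up to Monte Carlo error, which is exactly the similarity property established above.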

References

[1] J. Neyman, "Current problems of mathematical statistics" , Proc. Internat. Congress Mathematicians (Amsterdam, 1954) , 1 , Noordhoff & North-Holland (1957) pp. 349–370
[2] E.L. Lehmann, "Testing statistical hypotheses" , Wiley (1986)
[3] Yu.V. Linnik, "Statistical problems with nuisance parameters" , Amer. Math. Soc. (1968) (Translated from Russian)
How to Cite This Entry:
Neyman structure. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Neyman_structure&oldid=53127
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article