Neyman structure

From Encyclopedia of Mathematics

A structure determined by a statistic that is independent of a sufficient statistic. The concept was introduced by J. Neyman (see [1]) in connection with the problem of constructing similar tests (cf. Similar test) in the theory of statistical hypothesis testing, and the term "Neyman structure" is used when referring to the structure of a statistical test if its critical function has Neyman structure. Suppose that in the realization of a random variable $X$ taking values in a sample space $(\mathfrak{X}, \mathcal{B}, \mathsf{P}_\theta)$, $\theta \in \Theta$, it is required to verify a composite hypothesis $H_0$: $\theta \in \Theta_0 \subset \Theta$, and that for the family $\{\mathsf{P}_\theta,\ \theta \in \Theta_0\}$ there exists a sufficient statistic $T$ with distribution in the family $\{\mathsf{P}_\theta^T,\ \theta \in \Theta_0\}$. Then any statistical test of level $\alpha$ intended for testing $H_0$ has Neyman structure if its critical function $\varphi$ satisfies the condition

$$\mathsf{E}_\theta\{\varphi(X) \mid T = t\} = \alpha \tag{1}$$

almost everywhere with respect to the measure $\mathsf{P}_\theta^T$, $\theta \in \Theta_0$. Evidently, if a statistical test has Neyman structure, then it is similar (cf. Similar test) relative to the family $\{\mathsf{P}_\theta,\ \theta \in \Theta_0\}$, since

$$\mathsf{E}_\theta \varphi(X) = \mathsf{E}_\theta\bigl\{\mathsf{E}_\theta[\varphi(X) \mid T]\bigr\} = \alpha$$

for all $\theta \in \Theta_0$.

The validity of (1) essentially reduces the problem of testing the composite hypothesis $H_0$ to that of testing it as a simple hypothesis for every fixed value $t$ of the sufficient statistic $T$.

Example. Suppose that two independent random variables $X_1$ and $X_2$ are subject to Poisson laws with unknown parameters $\lambda_1$ and $\lambda_2$ (cf. Poisson distribution), and that the hypothesis $H_0$: $\lambda_1 = \lambda_2$ is to be tested against the alternative $H_1$: $\lambda_1 \neq \lambda_2$. Thanks to the independence of $X_1$ and $X_2$, the statistic $T = X_1 + X_2$ is subject to the Poisson law with parameter $\lambda_1 + \lambda_2$, and the conditional distributions of $X_1$ and $X_2$ under the condition $X_1 + X_2 = T$ are binomial with parameters $T$, $p = \lambda_1/(\lambda_1 + \lambda_2)$ and $T$, $q = \lambda_2/(\lambda_1 + \lambda_2)$, respectively, that is,

$$\mathsf{P}\{X_1 = k \mid X_1 + X_2 = T\} = \binom{T}{k}\, p^{k} (1 - p)^{T - k}, \qquad k = 0, 1, \dots, T. \tag{2}$$

When $H_0$ is valid, $T = X_1 + X_2$ is sufficient for the unknown common value $\lambda = \lambda_1 = \lambda_2$, and from (2) it follows that under $H_0$ the conditional distribution of $X_1$ for a fixed value of the sufficient statistic $T = X_1 + X_2$ is binomial with parameters $T$ and $p = 1/2$, that is, under $H_0$,

$$\mathsf{P}\{X_1 = k \mid X_1 + X_2 = T\} = \binom{T}{k}\left(\frac{1}{2}\right)^{T}, \qquad k = 0, 1, \dots, T.$$
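As a quick numerical check of this fact, the following Python sketch (an illustration added for this rewrite, not part of the original article; it assumes SciPy is available, and the helper name conditional_pmf is hypothetical) computes the conditional law of $X_1$ given $X_1 + X_2 = t$ directly from the two Poisson laws and compares it with the binomial law with parameters $t$ and $p = 1/2$ for several values of the common rate.

# Numerical check (illustrative, not from the original article): under a common
# rate lam, the conditional law of X1 given X1 + X2 = t is Binomial(t, 1/2),
# and in particular it does not depend on lam.
from scipy.stats import binom, poisson

def conditional_pmf(k, t, lam1, lam2):
    """P{X1 = k | X1 + X2 = t}, computed directly from the two Poisson laws."""
    joint = poisson.pmf(k, lam1) * poisson.pmf(t - k, lam2)
    total = poisson.pmf(t, lam1 + lam2)     # X1 + X2 is Poisson(lam1 + lam2)
    return joint / total

t = 7
for lam in (0.5, 3.0, 12.0):                # several hypothetical common rates
    for k in range(t + 1):
        assert abs(conditional_pmf(k, t, lam, lam) - binom.pmf(k, t, 0.5)) < 1e-12
print("conditional law given T = 7 is Binomial(7, 1/2) for every common rate tried")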

Thus, in this case the problem of testing the composite hypothesis $H_0$: $\lambda_1 = \lambda_2$ reduces to that of testing the simple hypothesis according to which the conditional distribution of $X_1$ (for a fixed sum $X_1 + X_2 = T$) is binomial with parameters $T$ and $p = 1/2$. For testing this simple hypothesis one can use, for example, the sign test.
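To make the reduction concrete, here is a hedged Python sketch (added for this rewrite; the observed counts are invented for illustration, and SciPy is assumed available) that carries out the conditional test with scipy.stats.binomtest: given observed counts $x_1$ and $x_2$, it tests whether $X_1$ is compatible with the binomial law with parameters $x_1 + x_2$ and $p = 1/2$, which by the reduction above amounts to testing $\lambda_1 = \lambda_2$.

# Conditional (exact binomial) test for H0: lam1 = lam2 -- a sketch based on the
# reduction described above; the counts below are hypothetical.
from scipy.stats import binomtest

x1, x2 = 17, 6                        # hypothetical observed Poisson counts
t = x1 + x2                           # observed value of the sufficient statistic T
result = binomtest(x1, n=t, p=0.5)    # conditional law of X1 under H0 is Binomial(t, 1/2)
print(f"two-sided conditional p-value given T = {t}: {result.pvalue:.4f}")

The non-randomized test obtained this way has conditional level at most $\alpha$; exact equality in (1), and hence exact similarity, requires randomization on the boundary of the critical region, as illustrated in the sketch further below.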

The concept of a Neyman structure is of great significance in the problem of testing composite statistical hypotheses, since among the tests having Neyman structure there frequently is a most powerful test. E. Lehmann and H. Scheffé have shown that every similar test for a composite hypothesis $H_0$: $\theta \in \Theta_0$ has Neyman structure relative to a sufficient statistic $T$ if and only if the family $\{\mathsf{P}_\theta^T,\ \theta \in \Theta_0\}$ induced by $T$ is boundedly complete. On the basis of the concept of a Neyman structure, general methods have been worked out for the construction of similar tests. See Distributions, complete family of; Similar test.
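To connect Neyman structure with similarity in the example above, the following Monte Carlo sketch (added for this rewrite and not part of the original article; the rates, sample size and seed are arbitrary) builds a randomized one-sided conditional test whose critical function satisfies $\mathsf{E}[\varphi \mid T = t] = \alpha$ for every $t$, and checks empirically that its unconditional rejection probability stays close to $\alpha$ for every common rate, i.e. that the test is similar.

# Monte Carlo check (illustrative): a randomized conditional test with
# E[phi | T = t] = alpha for every t has rejection probability alpha for
# every common rate lam, i.e. it is a similar test.
from functools import lru_cache

import numpy as np
from scipy.stats import binom

alpha = 0.05
rng = np.random.default_rng(0)

@lru_cache(maxsize=None)
def critical(t):
    """Critical value c and randomization probability gamma for T = t."""
    ks = np.arange(t + 1)
    sf = binom.sf(ks, t, 0.5)             # P{X1 > k | T = t} under H0
    c = int(ks[sf <= alpha][0])           # smallest k with P{X1 > k} <= alpha
    gamma = (alpha - sf[c]) / binom.pmf(c, t, 0.5)
    return c, gamma

def phi(x1, t):
    """Randomized one-sided conditional test: E[phi | T = t] = alpha exactly."""
    if t == 0:
        return alpha                      # degenerate case: X1 is forced to be 0
    c, gamma = critical(t)
    return 1.0 if x1 > c else (gamma if x1 == c else 0.0)

n = 100_000
for lam in (0.7, 4.0, 25.0):              # several hypothetical common rates under H0
    x1 = rng.poisson(lam, n)
    x2 = rng.poisson(lam, n)
    reject = [rng.random() < phi(a, a + b) for a, b in zip(x1, x2)]
    print(f"lam = {lam:>4}: empirical size = {np.mean(reject):.4f} (target {alpha})")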

References

[1] J. Neyman, "Current problems of mathematical statistics", Proc. Internat. Congress Mathematicians (Amsterdam, 1954), Vol. 1, Noordhoff & North-Holland (1957), pp. 349–370
[2] E.L. Lehmann, "Testing statistical hypotheses", Wiley (1986)
[3] Yu.V. Linnik, "Statistical problems with nuisance parameters", Amer. Math. Soc. (1968) (Translated from Russian)
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.