Similar statistic
A statistic having a fixed probability distribution under some compound hypothesis.
Let the statistic $ T $ map the sample space $ ( \mathfrak X , {\mathcal B} _ {\mathfrak X} , {\mathsf P} _ \theta ) $, $ \theta \in \Theta $, into a measurable space $ ( \mathfrak Y , {\mathcal B} _ {\mathfrak Y} ) $, and consider a compound hypothesis $ H _ {0} $: $ \theta \in \Theta _ {0} \subseteq \Theta $. If for every event $ B \in {\mathcal B} _ {\mathfrak Y} $ the probability
$$ \tag{* } {\mathsf P} _ \theta ( T ^ {-1} ( B)) \ \textrm{ is independent of } \theta \ \textrm{ for } \theta \in \Theta _ {0} , $$
one says that $ T $ is a similar statistic with respect to $ H _ {0} $, or simply that it is a similar statistic. Condition (*) is equivalent to saying that the distribution of the statistic $ T $ does not vary when $ \theta $ runs through $ \Theta _ {0} $. With this property in view, one frequently says that a similar statistic is independent of the parameter $ \theta $, $ \theta \in \Theta _ {0} $. Similar statistics play an important role in the construction of similar tests, and also in the solution of statistical problems with nuisance parameters.
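Spelled out as a worked identity (the measure $ Q $ below is merely notation, introduced here, for the common distribution of $ T $): condition (*) holds precisely when there is a single probability measure $ Q $ on $ ( \mathfrak Y , {\mathcal B} _ {\mathfrak Y} ) $ such that

$$ {\mathsf P} _ \theta ( T ^ {-1} ( B)) = {\mathsf P} _ \theta ( T \in B ) = Q ( B) \ \textrm{ for all } B \in {\mathcal B} _ {\mathfrak Y} \ \textrm{ and all } \theta \in \Theta _ {0} . $$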
Example 1. Let $ X _ {1} , \dots, X _ {n} $ be independent random variables with the same normal distribution $ N _ {1} ( a, \sigma ^ {2} ) $, where $ | a | < \infty $ and $ \sigma > 0 $. Then for any $ \alpha > 0 $ the statistic
$$ T = \left ( \sum _ { i=1 } ^ { n } ( X _ {i} - \overline{X} ) ^ {2} \right ) ^ {- \alpha } \sum _ { i=1 } ^ { n } ( X _ {i} - \overline{X} ) ^ {2 \alpha } , $$
where
$$ \overline{X} = \frac{1}{n} \sum _ { i=1 } ^ { n } X _ {i} , $$
is independent of the two-dimensional parameter $ ( a, \sigma ^ {2} ) $: indeed, $ T $ is unchanged when every $ X _ {i} $ is replaced by $ \sigma X _ {i} + a $, so its distribution under $ N _ {1} ( a, \sigma ^ {2} ) $ coincides with its distribution under $ N _ {1} ( 0, 1) $.
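A quick numerical illustration of this similarity (a minimal sketch in Python, assuming NumPy and SciPy are available; the sample size, exponent and parameter values are arbitrary choices made here for the example): the empirical distribution of $ T $ is simulated under two different pairs $ ( a, \sigma ^ {2} ) $ and the two samples of $ T $-values are compared with a two-sample Kolmogorov–Smirnov test.

```python
import numpy as np
from scipy.stats import ks_2samp

def T(x, alpha):
    """The statistic of Example 1, computed from one sample x."""
    d2 = (x - x.mean()) ** 2                      # (X_i - Xbar)^2 >= 0
    return d2.sum() ** (-alpha) * (d2 ** alpha).sum()

rng = np.random.default_rng(0)
n, alpha, reps = 20, 1.5, 5000

# Simulate T under (a, sigma) = (0, 1) and under (a, sigma) = (5, 3).
t1 = [T(rng.normal(0.0, 1.0, n), alpha) for _ in range(reps)]
t2 = [T(rng.normal(5.0, 3.0, n), alpha) for _ in range(reps)]

# A large p-value is consistent with the two empirical distributions
# coinciding, i.e. with T not depending on (a, sigma^2).
print(ks_2samp(t1, t2))
```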
Example 2. Let $ X _ {1} , \dots, X _ {n+m} $ be independent, identically distributed random variables whose distribution functions belong to the family $ {\mathcal F} = \{ F ( x) \} $ of all continuous distribution functions on $ ( - \infty , + \infty ) $. If $ F _ {n} ( x) $ and $ F _ {m} ( x) $ are the empirical distribution functions constructed from the observations $ X _ {1} , \dots, X _ {n} $ and $ X _ {n+1} , \dots, X _ {n+m} $, respectively, then the Smirnov statistic
$$ S _ {n,m} = \sup _ {| x| < \infty } | F _ {n} ( x) - F _ {m} ( x) | $$
is similar with respect to the family $ {\mathcal F} $.
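The same kind of check works for Example 2 (again a Python sketch under the same assumptions; the helper `smirnov` and all sample sizes are introduced here purely for illustration): $ S _ {n,m} $ is simulated once with all $ n + m $ observations drawn from a normal distribution and once from an exponential distribution, and the two resulting empirical distributions of $ S _ {n,m} $ are compared.

```python
import numpy as np
from scipy.stats import ks_2samp

def smirnov(x, y):
    """Smirnov statistic sup_x |F_n(x) - F_m(x)| of two samples."""
    z = np.sort(np.concatenate([x, y]))           # all jump points
    Fn = np.searchsorted(np.sort(x), z, side="right") / len(x)
    Fm = np.searchsorted(np.sort(y), z, side="right") / len(y)
    return np.abs(Fn - Fm).max()

rng = np.random.default_rng(1)
n, m, reps = 15, 25, 5000

# S_{n,m} when the n + m observations are normal ...
s_norm = [smirnov(rng.normal(size=n), rng.normal(size=m)) for _ in range(reps)]
# ... and when they are exponential.
s_expo = [smirnov(rng.exponential(size=n), rng.exponential(size=m)) for _ in range(reps)]

# Both empirical distributions of S_{n,m} should agree, since the statistic
# is similar with respect to the whole family of continuous distributions.
print(ks_2samp(s_norm, s_expo))
```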
References
[1] J.-L. Soler, "Basic structures in mathematical statistics", Moscow (1972) (in Russian; translated from French)
[2] Yu.V. Linnik, "Statistical problems with nuisance parameters", Amer. Math. Soc. (1968) (translated from Russian)
[3] J.-R. Barra, "Mathematical bases of statistics", Acad. Press (1981) (translated from French)
Comments
References
[a1] E.L. Lehmann, "Testing statistical hypotheses", Wiley (1986)
Similar statistic. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Similar_statistic&oldid=48700