Sign test
A non-parametric test for a hypothesis $ H _ {0} $, according to which a random variable $ \mu $ has a binomial distribution with parameters $ ( n ; p = 0.5 ) $. If the hypothesis $ H _ {0} $ is true, then
$$ {\mathsf P} \left \{ \mu \leq k \mid n , \tfrac{1}{2} \right \} = \sum _ {i = 0 } ^ { k } \binom{n}{i} \left ( \frac{1}{2} \right ) ^ {n} = I _ {0.5} ( n - k , k + 1 ) , \qquad k = 0 , 1 \dots n , $$
where
$$ I _ {z} ( a , b ) = \frac{1}{B ( a , b ) } \int\limits _ { 0 } ^ { z } t ^ {a - 1 } ( 1 - t ) ^ {b - 1 } \, dt ,\ \ 0 \leq z \leq 1 , $$
and $ B ( a , b ) $ is the beta-function. According to the sign test with significance level $ \alpha $, $ 0 < \alpha \leq 0.5 $, the hypothesis $ H _ {0} $ is rejected if
$$ \min \{ \mu , n - \mu \} \leq m , $$
where $ m = m ( \alpha , n ) $, the critical value of the test, is the integer solution of the inequalities
$$ \sum _ {i = 0 } ^ { m } \binom{n}{i} \left ( \frac{1}{2} \right ) ^ {n} \leq \frac \alpha {2} ,\ \ \sum _ {i = 0 } ^ { m + 1 } \binom{n}{i} \left ( \frac{1}{2} \right ) ^ {n} > \frac \alpha {2} . $$
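As an illustration (not part of the original article), the following minimal Python sketch computes the critical value $ m ( \alpha , n ) $ from these inequalities, applies the rejection rule $ \min \{ \mu , n - \mu \} \leq m $, and checks numerically that the binomial tail equals $ I _ {0.5} ( n - k , k + 1 ) $; the function names are illustrative and the SciPy dependency is an assumption.
<pre>
# Minimal sketch of the sign test described above (illustrative, not from the
# article); requires SciPy.
from scipy.special import betainc
from scipy.stats import binom


def critical_value(alpha, n):
    """Largest integer m with sum_{i<=m} C(n,i) (1/2)^n <= alpha/2."""
    m = -1                                    # -1 means "never reject"
    while binom.cdf(m + 1, n, 0.5) <= alpha / 2:
        m += 1
    return m


def sign_test_rejects(mu, n, alpha=0.05):
    """Reject H0: mu ~ Bin(n, 1/2) when min(mu, n - mu) <= m(alpha, n)."""
    return min(mu, n - mu) <= critical_value(alpha, n)


# The binomial tail coincides with the incomplete beta-function I_{0.5}(n-k, k+1):
n, k = 20, 6
assert abs(binom.cdf(k, n, 0.5) - betainc(n - k, k + 1, 0.5)) < 1e-12

print(critical_value(0.05, 20))       # m(0.05, 20) = 5
print(sign_test_rejects(4, 20))       # True: min(4, 16) = 4 <= 5
</pre>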
The sign test can be used to test a hypothesis $ H _ {0} $ according to which the unknown continuous distribution of independent identically-distributed random variables $ X _ {1} , \dots, X _ {n} $ is symmetric about zero, i.e. for any real $ x $,
$$ {\mathsf P} \{ X _ {i} < - x \} = {\mathsf P} \{ X _ {i} > x \} . $$
In this case the sign test is based on the statistic
$$ \mu = \sum _ {i = 1 } ^ { n } \delta ( X _ {i} ) ,\ \ \delta ( x ) = \begin{cases} 1 & \textrm{ if } x > 0 , \\ 0 & \textrm{ if } x < 0 , \end{cases} $$
which is governed by a binomial law with parameters $ ( n ; p = 0.5 ) $ if the hypothesis $ H _ {0} $ is true.
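In practice the symmetry test amounts to counting the positive observations and comparing the count with the binomial law. A short sketch under the assumption that NumPy and SciPy (version 1.7 or later, for scipy.stats.binomtest) are available; the sample values are made up for illustration:
<pre>
# Illustrative sketch: count positive signs and compare mu with Bin(n, 1/2)
# via a two-sided binomial test.
import numpy as np
from scipy.stats import binomtest

x = np.array([0.3, -1.2, 0.8, 2.1, -0.4, 1.7, 0.9, -0.1, 1.3, 0.6])

mu = int(np.sum(x > 0))                  # the statistic mu = sum of delta(X_i)
n = len(x)
result = binomtest(mu, n, p=0.5, alternative='two-sided')
print(mu, result.pvalue)                 # reject H0 (symmetry about zero) if p-value < alpha
</pre>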
Similarly, the sign test is used to test a hypothesis $ H _ {0} $ according to which the median of the unknown continuous distribution common to the independent random variables $ X _ {1} , \dots, X _ {n} $ is equal to $ \xi _ {0} $; to this end one simply replaces the given random variables by $ Y _ {1} = X _ {1} - \xi _ {0} , \dots, Y _ {n} = X _ {n} - \xi _ {0} $.
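Continuing the sketch above, the median test only shifts the sample before counting signs (again an illustration; the value $ \xi _ {0} = 0.5 $ is arbitrary):
<pre>
# Test H0: median = xi0 by applying the same sign test to Y_i = X_i - xi0.
xi0 = 0.5
y = x - xi0                              # x and binomtest as in the previous sketch
mu = int(np.sum(y > 0))
print(binomtest(mu, len(y), p=0.5).pvalue)
</pre>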
References
[1] L.N. Bol'shev, N.V. Smirnov, "Tables of mathematical statistics", Libr. math. tables, 46, Nauka (1983) (In Russian) (Processed by L.S. Bark and E.S. Kedrova)
[2] E.L. Lehmann, "Testing statistical hypotheses", Wiley (1986)
[3] B.L. van der Waerden, "Mathematische Statistik", Springer (1957)
[4] N.V. Smirnov, I.V. Dunin-Barkovskii, "Mathematische Statistik in der Technik", Deutsch. Verlag Wissenschaft. (1969) (Translated from Russian)
Sign test. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Sign_test&oldid=49582