Kolmogorov test

From Encyclopedia of Mathematics

Latest revision as of 22:14, 5 June 2020


2020 Mathematics Subject Classification: Primary: 62G10

A statistical test used for testing a simple non-parametric hypothesis $ H _ {0} $, according to which independent identically-distributed random variables $ X _ {1} , \dots, X _ {n} $ have a given distribution function $ F $, where the alternative hypothesis $ H _ {1} $ is taken to be two-sided:

$$ | {\mathsf E} F _ {n} ( x) - F ( x) | > 0 , $$

where $ {\mathsf E} F _ {n} $ is the mathematical expectation of the empirical distribution function $ F _ {n} $. The critical set of the Kolmogorov test is expressed by the inequality

$$ D _ {n} = \ \sup _ {| x | < \infty } \ | F _ {n} ( x) - F ( x) | > \lambda _ {n} $$

and is based on the following theorem, proved by A.N. Kolmogorov in 1933: If the hypothesis $ H _ {0} $ is true, then the distribution of the statistic $ D _ {n} $ does not depend on $ F $; also, as $ n \rightarrow \infty $,

$$ {\mathsf P} \{ \sqrt n D _ {n} < \lambda \} \rightarrow K ( \lambda ) ,\ \ \lambda > 0 , $$

where

$$ K ( \lambda ) = \ \sum _ {m = - \infty } ^ \infty ( - 1 ) ^ {m} e ^ {- 2 m ^ {2} \lambda ^ {2} } . $$
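The alternating series for $ K ( \lambda ) $ converges extremely fast, so it can be evaluated by simple truncation. A minimal numerical sketch (not part of the original article; the function name and truncation length are our own choices):

```python
import math

def kolmogorov_cdf(lam, terms=100):
    """K(lambda) = sum_{m=-inf}^{inf} (-1)^m exp(-2 m^2 lambda^2), truncated."""
    if lam <= 0:
        return 0.0
    # The m = 0 term is 1; the terms for +m and -m pair up with equal contributions.
    s = 1.0
    for m in range(1, terms + 1):
        s += 2.0 * (-1.0) ** m * math.exp(-2.0 * m * m * lam * lam)
    return s

print(round(kolmogorov_cdf(1.3581), 3))  # → 0.95, the asymptotic 5% point
```

A handful of terms already gives near machine-precision accuracy for $ \lambda $ of practical size, since the exponents grow like $ m ^ {2} $.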

In 1948 N.V. Smirnov [BS] tabulated the Kolmogorov distribution function $ K ( \lambda ) $. According to the Kolmogorov test with significance level $ \alpha $, $ 0 < \alpha < 0.5 $, the hypothesis $ H _ {0} $ must be rejected if $ D _ {n} \geq \lambda _ {n} ( \alpha ) $, where $ \lambda _ {n} ( \alpha ) $ is the critical value of the Kolmogorov test corresponding to the given significance level $ \alpha $ and is the root of the equation $ {\mathsf P} \{ D _ {n} \geq \lambda \} = \alpha $.

To determine $ \lambda _ {n} ( \alpha ) $ it is recommended to approximate the law of the Kolmogorov statistic $ D _ {n} $ by its limiting distribution; see [B], where it is shown that, as $ n \rightarrow \infty $ and $ 0 < \lambda _ {0} < \lambda = O ( n ^ {1/3} ) $,

$$ \tag{* } {\mathsf P} \left \{ \frac{1}{18n} ( 6 n D _ {n} + 1 ) ^ {2} \geq \lambda \right \} = \left [ 1 - K \left ( \sqrt { \frac \lambda {2} } \right ) \right ] \left [ 1 + O \left ( \frac{1}{n} \right ) \right ] . $$

The application of the approximation (*) gives the following approximation of the critical value:

$$ \lambda _ {n} ( \alpha ) \approx \ \sqrt { \frac{z}{2n} } - \frac{1}{6n} , $$

where $ z $ is the root of the equation $ 1 - K ( \sqrt {z/2 } ) = \alpha $.
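This recipe is easy to use numerically: solve $ 1 - K ( \sqrt {z/2 } ) = \alpha $ for $ z $, e.g. by bisection (the left side is decreasing in $ z $), and substitute into the approximation. A sketch under those assumptions, with our own function names:

```python
import math

def K(lam, terms=100):
    # Limiting Kolmogorov distribution function K(lambda), truncated series.
    if lam <= 0:
        return 0.0
    return 1.0 + 2.0 * sum((-1.0) ** m * math.exp(-2.0 * m * m * lam * lam)
                           for m in range(1, terms + 1))

def critical_value(n, alpha):
    # Solve 1 - K(sqrt(z/2)) = alpha for z by bisection, then apply
    # lambda_n(alpha) ~ sqrt(z/(2n)) - 1/(6n).
    lo, hi = 1e-9, 100.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if 1.0 - K(math.sqrt(mid / 2.0)) > alpha:
            lo = mid
        else:
            hi = mid
    z = 0.5 * (lo + hi)
    return math.sqrt(z / (2.0 * n)) - 1.0 / (6.0 * n)

# For alpha = 0.05, sqrt(z/2) is roughly 1.3581, so with n = 100 the
# critical value is about 1.3581/10 - 1/600 ~ 0.134.
print(round(critical_value(100, 0.05), 3))
```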

In practice, for the calculation of the value of the statistic $ D _ {n} $ one uses the fact that

$$ D _ {n} = \ \max ( D _ {n} ^ {+} , D _ {n} ^ {-} ) , $$

where

$$ D _ {n} ^ {+} = \max _ {1 \leq m \leq n } \left ( \frac{m}{n} - F ( X _ {(m)} ) \right ) , $$

$$ D _ {n} ^ {-} = \max _ {1 \leq m \leq n } \left ( F ( X _ {(m)} ) - \frac{m - 1}{n} \right ) , $$

and $ X _ {(1)} \leq \dots \leq X _ {(n)} $ is the variational series (or set of order statistics) constructed from the sample $ X _ {1} , \dots, X _ {n} $. The Kolmogorov test has the following geometric interpretation (see Fig.).
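The computation of $ D _ {n} $ through the order statistics translates directly into code. A sketch (the function name and the toy check against the uniform distribution function are our own, not part of the article):

```python
def kolmogorov_statistic(sample, cdf):
    """D_n = max(D_n^+, D_n^-), computed from the order statistics as in the text."""
    xs = sorted(sample)
    n = len(xs)
    d_plus = max(m / n - cdf(x) for m, x in enumerate(xs, start=1))
    d_minus = max(cdf(x) - (m - 1) / n for m, x in enumerate(xs, start=1))
    return max(d_plus, d_minus)

# Toy sample checked against the uniform(0,1) distribution function F(x) = x.
print(round(kolmogorov_statistic([0.1, 0.2, 0.5, 0.7, 0.9], lambda x: x), 6))  # → 0.2
```

Only the $ 2n $ one-sided deviations at the jump points need to be inspected, because $ F _ {n} $ is constant between consecutive order statistics.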

The graphs of the functions $ F _ {n} ( x) $ and $ F _ {n} ( x) \pm \lambda _ {n} ( \alpha ) $ are depicted in the $ xy $-plane. The shaded region is the confidence zone at level $ 1 - \alpha $ for the distribution function $ F $, since if the hypothesis $ H _ {0} $ is true, then according to Kolmogorov's theorem

$$ {\mathsf P} \{ F _ {n} ( x) - \lambda _ {n} ( \alpha ) < F ( x) < F _ {n} ( x) + \lambda _ {n} ( \alpha ) \} \approx 1 - \alpha . $$

If the graph of $ F $ does not leave the shaded region then, according to the Kolmogorov test, $ H _ {0} $ must be accepted with significance level $ \alpha $; otherwise $ H _ {0} $ is rejected.
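The geometric acceptance rule, namely that the graph of $ F $ stays inside the band $ F _ {n} \pm \lambda _ {n} ( \alpha ) $, is equivalent to $ D _ {n} < \lambda _ {n} ( \alpha ) $. A toy sketch of this rule, using the large-sample critical value $ \lambda _ {n} ( \alpha ) \approx \lambda _ \alpha / \sqrt n $ with $ \lambda _ \alpha = 1.3581 $ for $ \alpha = 0.05 $ (the sample and function names are our own):

```python
import math

def d_n(sample, cdf):
    # Kolmogorov statistic D_n = max(D_n^+, D_n^-).
    xs = sorted(sample)
    n = len(xs)
    dp = max(m / n - cdf(x) for m, x in enumerate(xs, 1))
    dm = max(cdf(x) - (m - 1) / n for m, x in enumerate(xs, 1))
    return max(dp, dm)

def in_confidence_band(sample, cdf, lam_n):
    # The graph of F stays inside F_n +- lam_n exactly when D_n < lam_n.
    return d_n(sample, cdf) < lam_n

sample = [0.11, 0.24, 0.41, 0.58, 0.72, 0.83, 0.94, 0.35, 0.66, 0.05]
lam = 1.3581 / math.sqrt(len(sample))          # large-sample 5% critical value
uniform_cdf = lambda x: min(max(x, 0.0), 1.0)  # F(x) = x on [0, 1]

print(in_confidence_band(sample, uniform_cdf, lam))  # → True: H_0 accepted
```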

The Kolmogorov test gave a strong impetus to the development of mathematical statistics, being the start of much research on new methods of statistical analysis lying at the foundations of non-parametric statistics.

References

[K] A.N. Kolmogorov, "Sulla determinazione empirica di una legge di distribuzione" Giorn. Ist. Ital. Attuari , 4 (1933) pp. 83–91
[S] N.V. Smirnov, "On estimating the discrepancy between empirical distribution curves for two independent samples" Byull. Moskov. Gos. Univ. Ser. A , 2 : 2 (1938) pp. 3–14 (In Russian)
[B] L.N. Bol'shev, "Asymptotically Pearson transformations" Theor. Probab. Appl. , 8 (1963) pp. 121–146 Teor. Veroyatnost. i Primenen. , 8 : 2 (1963) pp. 129–155 Zbl 0125.09103
[BS] L.N. Bol'shev, N.V. Smirnov, "Tables of mathematical statistics" , Libr. math. tables , 46 , Nauka (1983) (In Russian) (Processed by L.S. Bark and E.S. Kedrova) Zbl 0529.62099

Comments

Tests based on $ D _ {n} $ and $ \widetilde{D} _ {n} = \sup _ {x} ( F _ {n} ( x) - F ( x)) $, and similar tests for a two-sample problem based on $ D _ {m,n} = \sup _ {x} | F _ {m} ( x) - G _ {n} ( x) | $ and $ \widetilde{D} _ {m,n} = \sup _ {x} ( F _ {m} ( x) - G _ {n} ( x)) $, where $ G _ {n} $ is the empirical distribution function of a sample of size $ n $ from a population with distribution function $ G $, are also called Kolmogorov–Smirnov tests, cf. also Kolmogorov–Smirnov test.
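For the two-sample statistic $ D _ {m,n} $, the supremum of $ | F _ {m} - G _ {n} | $ over all $ x $ is attained at one of the pooled sample points, since both empirical distribution functions are right-continuous step functions. A sketch (function names are our own):

```python
from bisect import bisect_right

def two_sample_ks(xs, ys):
    """D_{m,n} = sup_x |F_m(x) - G_n(x)|, evaluated over the pooled sample points."""
    def ecdf(data):
        srt = sorted(data)
        # Empirical distribution function: fraction of points <= x.
        return lambda x: bisect_right(srt, x) / len(srt)
    fm, gn = ecdf(xs), ecdf(ys)
    return max(abs(fm(t) - gn(t)) for t in list(xs) + list(ys))

print(two_sample_ks([1, 2, 3, 4], [3, 4, 5, 6]))  # → 0.5
```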

References

[N] G.E. Noether, "A brief survey of nonparametric statistics" R.V. Hogg (ed.) , Studies in statistics , Math. Assoc. Amer. (1978) pp. 3–65 Zbl 0413.62023
[HW] M. Hollander, D.A. Wolfe, "Nonparametric statistical methods" , Wiley (1973) MR0353556 Zbl 0277.62030
How to Cite This Entry:
Kolmogorov test. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Kolmogorov_test&oldid=35108
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article