Invariance, principle of

Let $X_1, X_2, \dots$ be independent identically-distributed real-valued random variables with zero expectation and variance $\sigma^2$; consider the random polygonal line

$$ Y_n(t) = \frac{1}{\sigma \sqrt{n}} \left\{ S_{[nt]} + (nt - [nt]) X_{[nt]+1} \right\}, \qquad 0 \leq t \leq 1, $$

where $S_m = \sum_{i=1}^{m} X_i$. If $f$ is a real-valued continuous function on the space $C[0,1]$ of continuous functions on $[0,1]$ with the supremum norm (or merely continuous everywhere except on a set of Wiener measure zero), then $f(Y_n)$ converges in distribution to $f(W)$, where $W$ is a Wiener random function. Thus, the limiting distribution of $f(Y_n)$ does not depend on any special properties of the $X_1, X_2, \dots$.
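As an illustration of this distributional invariance, the following Monte Carlo sketch (an assumption of this note: Python with NumPy, the function name sup_polygonal, the sample sizes and the two particular zero-mean, unit-variance distributions are all chosen only for illustration) compares the empirical distribution of $\sup_{0 \leq t \leq 1} Y_n(t)$ for two quite different choices of the $X_i$; both approximate the same limit, namely the distribution of $\sup_t W(t)$ given at the end of the article.

import numpy as np

def sup_polygonal(X, sigma):
    # Supremum of the polygonal line Y_n built on the partial sums of X:
    # Y_n is piecewise linear with Y_n(0) = 0 and Y_n(m/n) = S_m / (sigma sqrt(n)),
    # so its supremum equals max(0, S_1, ..., S_n) / (sigma sqrt(n)).
    n = len(X)
    S = np.concatenate(([0.0], np.cumsum(X)))
    return S.max() / (sigma * np.sqrt(n))

rng = np.random.default_rng(0)
n, trials = 1000, 5000

# Two different zero-mean distributions, both with variance 1.
sup_unif = np.array([sup_polygonal(rng.uniform(-np.sqrt(3), np.sqrt(3), n), 1.0)
                     for _ in range(trials)])
sup_sign = np.array([sup_polygonal(rng.choice([-1.0, 1.0], n), 1.0)
                     for _ in range(trials)])

# Both empirical distribution functions approximate P{ sup W(t) <= a }.
for a in (0.5, 1.0, 2.0):
    print(a, (sup_unif <= a).mean(), (sup_sign <= a).mean())

For $a = 1$, for instance, both empirical frequencies should be close to $2 \Phi(1) - 1 \approx 0.68$.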

A typical scheme for the use of the invariance principle consists in finding the limiting distribution of $f(Y_n)$ by finding the limiting distribution of $f(Y_n')$, where $Y_n'$ is a random polygonal line constructed in the same way as $Y_n$ from some specially chosen sequence $X_1', X_2', \dots$. For example, if

$$ f(x) = \sup_{0 \leq t \leq 1} x(t), $$

then $ f $ is continuous on $ C $, and, since

$$ f(Y_n) = \frac{1}{\sigma \sqrt{n}} \max_{1 \leq m \leq n} S_m, $$

one has that

$$ \frac{1}{\sigma \sqrt{n}} \max_{1 \leq m \leq n} S_m $$

converges in distribution to $\sup_{t} W(t)$. To find the distribution of $\sup_{t} W(t)$, the sequence $\{X_n'\}$ with ${\mathsf P}\{X_n' = 1\} = {\mathsf P}\{X_n' = -1\} = 1/2$ is used, and as a result of the calculation one obtains

$$ {\mathsf P} \left\{ \sup_{t} W(t) \leq a \right\} = \sqrt{\frac{2}{\pi}} \int_0^a e^{-u^2/2} \, du, \qquad a \geq 0. $$
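The calculation leading to this formula can be sketched as follows (a standard reflection-principle argument; the intermediate steps are a reconstruction, not part of the original article). For the symmetric walk $S_m' = X_1' + \dots + X_m'$ and an integer $b \geq 1$, the reflection principle gives

$$ {\mathsf P} \left\{ \max_{1 \leq m \leq n} S_m' \geq b \right\} = 2 {\mathsf P} \{ S_n' > b \} + {\mathsf P} \{ S_n' = b \}. $$

Taking $b$ of order $a \sqrt{n}$ and applying the central limit theorem, the right-hand side tends to $2(1 - \Phi(a))$, where $\Phi$ is the standard normal distribution function, while by the invariance principle the left-hand side tends to ${\mathsf P} \{ \sup_t W(t) \geq a \}$; hence ${\mathsf P} \{ \sup_t W(t) \leq a \} = 2 \Phi(a) - 1$, which is the integral displayed above.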

References

[1] M. Donsker, "An invariance principle for certain probability limit theorems", Memoirs Amer. Math. Soc., 6 (1951), pp. 1–12
[2] Yu.V. Prokhorov, "Convergence of random processes and limit theorems in probability theory", Theor. Probab. Appl., 1 (1956), pp. 157–214; Teor. Veroyatnost. Prilozhen., 1:2 (1956), pp. 177–238
[3] P. Billingsley, "Convergence of probability measures", Wiley (1968)

Comments

References

[a1] L.P. Breiman, "Probability", Addison-Wesley (1968)
This article was adapted from an original article by V.V. Sazonov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.