Random variables, transformations of

From Encyclopedia of Mathematics
Revision as of 08:09, 6 June 2020


The determination of functions of given arbitrary random variables for which the probability distributions possess given properties.

Example 1. Let $X$ be a random variable having a continuous and strictly increasing distribution function $F$. Then the random variable $Y = F(X)$ has a uniform distribution on the interval $[0, 1]$, and the random variable $Z = \Phi^{-1}(F(X))$ (where $\Phi$ is the standard normal distribution function) has a normal distribution with parameters 0 and 1. Conversely, the formula $X = F^{-1}(\Phi(Z))$ enables one to obtain a random variable $X$ that has the given distribution function $F$ from a random variable $Z$ with a standard normal distribution.
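A minimal simulation sketch of this construction, assuming (purely for illustration) the exponential distribution function $F(x) = 1 - e^{-x}$, which is continuous and strictly increasing:

```python
import math
import random
from statistics import NormalDist, fmean, pstdev

rng = random.Random(0)
std_normal = NormalDist()

def F(x):
    # Exponential distribution function: continuous, strictly increasing
    return 1.0 - math.exp(-x)

def F_inv(u):
    # Inverse of F; maps a uniform variate to an F-distributed one
    return -math.log(1.0 - u)

# X with distribution function F, generated as X = F^{-1}(U)
xs = [F_inv(rng.random()) for _ in range(100_000)]

# Y = F(X) should be uniform on [0, 1] ...
ys = [F(x) for x in xs]
# ... and Z = Phi^{-1}(F(X)) should be standard normal
zs = [std_normal.inv_cdf(y) for y in ys]
```

Checking the sample mean of $Y$ against $1/2$ and the sample mean and standard deviation of $Z$ against $0$ and $1$ confirms both claims empirically.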

Transformations of random variables are often used in connection with limit theorems of probability theory. For example, let a sequence of random variables $Z_n$ be asymptotically normal with parameters $(0, 1)$. One then poses the problem of constructing simple (and simply invertible) functions $f_n$ such that the random variables $V_n = Z_n + f_n(Z_n)$ are "more normal" than $Z_n$.

Example 2. Let $X_1, \dots, X_n, \dots$ be independent random variables, each having a uniform distribution on $[-1, 1]$, and put

$$ Z_n = \frac{X_1 + \dots + X_n}{\sqrt{n/3}} . $$

By the central limit theorem,

$$ {\mathsf P} \{ Z_n < x \} - \Phi(x) = O \left( \frac{1}{n} \right) . $$

If one sets

$$ V_n = Z_n - \frac{1}{20n} \left( 3 Z_n - Z_n^3 \right) , $$

then

$$ {\mathsf P} \{ V_n < x \} - \Phi(x) = O \left( \frac{1}{n^2} \right) . $$
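The effect of the correction can be sketched by Monte Carlo simulation (the summand count $n$ and sample size $N$ below are assumed, chosen only for illustration): draw samples of $Z_n$, apply the transformation to obtain $V_n$, and compare the largest deviation of each empirical distribution function from $\Phi$ over a grid of points.

```python
import random
from bisect import bisect_left
from math import sqrt
from statistics import NormalDist

Phi = NormalDist().cdf
rng = random.Random(1)
n, N = 5, 400_000  # number of summands and Monte Carlo sample size (assumed)

def z_sample():
    # Z_n = (X_1 + ... + X_n) / sqrt(n/3), with X_i uniform on [-1, 1]
    return sum(rng.uniform(-1.0, 1.0) for _ in range(n)) / sqrt(n / 3)

zs = sorted(z_sample() for _ in range(N))
# V_n = Z_n - (3 Z_n - Z_n^3) / (20 n)
vs = sorted(z - (3 * z - z ** 3) / (20 * n) for z in zs)

def sup_dev(sample):
    # Largest |empirical CDF - Phi| over a grid of points in [-3, 3]
    grid = [i / 10 for i in range(-30, 31)]
    return max(abs(bisect_left(sample, x) / N - Phi(x)) for x in grid)

dev_z, dev_v = sup_dev(zs), sup_dev(vs)
```

With these settings the deviation for $V_n$ comes out noticeably smaller than that for $Z_n$, consistent with the $O(1/n^2)$ versus $O(1/n)$ rates above.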

Example 3. The random variables $\chi_n^2$, $\sqrt{2 \chi_n^2}$ and $(\chi_n^2 / n)^{1/3}$ are asymptotically normal as $n \rightarrow \infty$ (see Chi-squared distribution). The uniform deviation of the corresponding distribution functions from their normal approximations becomes less than $0.01$ for $\chi_n^2$ when $n \geq 354$, and for $\sqrt{2 \chi_n^2}$ (the Fisher transformation) when $n \geq 23$; for $(\chi_n^2 / n)^{1/3}$ (the Wilson–Hilferty transformation) with $n \geq 3$ this deviation does not exceed $0.0007$.
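A simulation sketch of the Wilson–Hilferty transformation: the standard approximation states that $(\chi_n^2 / n)^{1/3}$ is approximately normal with mean $1 - 2/(9n)$ and variance $2/(9n)$, so standardizing by these parameters should give a variable close to standard normal even for very small $n$ (the sample size $N$ below is an assumed illustrative choice, and Monte Carlo noise limits the check to far coarser precision than the $0.0007$ bound quoted above).

```python
import random
from bisect import bisect_left
from math import sqrt
from statistics import NormalDist

Phi = NormalDist().cdf
rng = random.Random(2)
n, N = 3, 200_000  # degrees of freedom and Monte Carlo sample size (assumed)

# chi^2_n samples, built as sums of n squared standard normals
chi2 = (sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(n)) for _ in range(N))

# Standardized Wilson-Hilferty transform:
# (chi^2_n / n)^{1/3} is approximately N(1 - 2/(9n), 2/(9n))
mu, sigma = 1.0 - 2.0 / (9 * n), sqrt(2.0 / (9 * n))
ws = sorted(((x / n) ** (1.0 / 3.0) - mu) / sigma for x in chi2)

# Largest |empirical CDF - Phi| over a grid of points in [-3, 3]
grid = [i / 10 for i in range(-30, 31)]
dev = max(abs(bisect_left(ws, x) / N - Phi(x)) for x in grid)
```

Even at $n = 3$ the measured deviation is dominated by sampling noise rather than by the approximation error, which illustrates how accurate the transformation is.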

Transformations of random variables have long been applied in problems of mathematical statistics as the basis for constructing simple asymptotic formulas of high precision. Transformations of random variables are also used in the theory of stochastic processes (for example, the method of the "single probability space").

References

[1] L.N. Bol'shev, "On transformations of random variables" Theory Probab. Appl. , 4 (1959) pp. 129–141 Teor. Veroyatnost. Primenen. , 4 : 2 (1959) pp. 136–149
[2] L.N. Bol'shev, "Asymptotically Pearson transformations" Theory Probab. Appl. , 8 : 2 (1963) pp. 121–146 Teor. Veroyatnost. Primenen. , 8 : 2 (1963) pp. 129–155
[3] L.N. Bol'shev, N.V. Smirnov, "Tables of mathematical statistics" , Libr. math. tables , 46 , Nauka (1983) (In Russian) (Processed by L.S. Bark and E.S. Kedrova)

Comments

Related to the transformations above are the Edgeworth expansions (see, e.g., [a1]; cf. also Edgeworth series).

References

[a1] V.V. Petrov, "Sums of independent random variables" , Springer (1975) (Translated from Russian)
How to Cite This Entry:
Random variables, transformations of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Random_variables,_transformations_of&oldid=28561
This article was adapted from an original article by V.I. Pagurova, Yu.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.