Cornish-Fisher expansion

From Encyclopedia of Mathematics
An asymptotic expansion of the quantiles of a distribution (close to the standard normal one) in terms of the corresponding quantiles of the standard normal distribution, in powers of a small parameter. It was studied by E.A. Cornish and R.A. Fisher [1]. If $ F(x, t) $ is a distribution function depending on $ t $ as a parameter, if $ \Phi(x) $ is the normal distribution function with parameters $ (0, 1) $, and if $ F(x, t) \rightarrow \Phi(x) $ as $ t \rightarrow 0 $, then, subject to certain assumptions on $ F(x, t) $, the Cornish–Fisher expansion of the function $ x = F^{-1}[\Phi(z), t] $ (where $ F^{-1} $ is the function inverse to $ F $) has the form

$$ \tag{1} x = z + \sum_{i=1}^{m-1} S_i(z) t^i + O(t^m), $$

where the $ S_i(z) $ are certain polynomials in $ z $. Similarly, one defines the Cornish–Fisher expansion of the function $ z = \Phi^{-1}[F(x, t)] $ ($ \Phi^{-1} $ being the function inverse to $ \Phi $) in powers of $ t $:

$$ \tag{2} z = x + \sum_{i=1}^{m-1} Q_i(x) t^i + O(t^m), $$

where the $ Q_i(x) $ are certain polynomials in $ x $. Formula (2) is obtained by expanding $ \Phi^{-1} $ in a Taylor series about the point $ \Phi(x) $ and using the Edgeworth expansion. Formula (1) is the inversion of (2).
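To illustrate the first-order step (a sketch added for clarity, not part of the original article; it assumes the leading Edgeworth correction $ F(x, t) \approx \Phi(x) - \phi(x) \frac{\gamma_1}{6} H_2(x) $ with $ \gamma_1 = O(t) $, in the notation introduced below): since $ \Phi^{-1} $ has derivative $ 1/\phi(x) $ at the point $ \Phi(x) $,

$$ z = \Phi^{-1}[F(x, t)] \approx x + \frac{F(x, t) - \Phi(x)}{\phi(x)} \approx x - \frac{\gamma_1}{6} H_2(x), $$

so that the leading term of (2) is $ Q_1(x) t = -\frac{\gamma_1}{6} H_2(x) $; inverting to the same order gives $ x \approx z + \frac{\gamma_1}{6} H_2(z) $, which is the first correction term written out explicitly below.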

If $ X $ is a random variable with distribution function $ F(x, t) $, then the variable $ Z = Z(X) = \Phi^{-1}[F(X, t)] $ is normally distributed with parameters $ (0, 1) $, and, as follows from (2), $ \Phi(x) $ approximates the distribution function of the variable

$$ \overline{Z} = X + \sum_{i=1}^{m-1} Q_i(X) t^i $$

as $ t \rightarrow 0 $ better than it approximates $ F(x, t) $ (a Monte Carlo sketch of this normalization property is given further below). If $ X $ has zero expectation and unit variance, then the first terms of the expansion (1) have the form

$$ x = z + [\gamma_1 h_1(z)] + [\gamma_2 h_2(z) + \gamma_1^2 h_3(z)] + \dots . $$

Here $ \gamma_1 = \kappa_3 / \kappa_2^{3/2} $, $ \gamma_2 = \kappa_4 / \kappa_2^2 $, with $ \kappa_r $ the $ r $-th cumulant of $ X $, $ h_1(z) = H_2(z)/6 $, $ h_2(z) = H_3(z)/24 $, $ h_3(z) = -[2H_3(z) + H_1(z)]/36 $, and with $ H_r(z) $ the Hermite polynomials, defined by the relation

$$ \phi(z) H_r(z) = (-1)^r \frac{d^r \phi(z)}{dz^r} \qquad (\phi(z) = \Phi'(z)). $$
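As a numerical illustration (a minimal Python sketch, not part of the original article; the Gamma example and the function name cornish_fisher_quantile are illustrative choices), the first terms above can be evaluated directly. For a standardized Gamma($k$) variable one has $ \gamma_1 = 2/\sqrt{k} $ and $ \gamma_2 = 6/k $, so the approximation can be checked against the exact quantiles:

```python
# Sketch: first terms of expansion (1) for a variable with zero mean, unit variance.
import numpy as np
from scipy.stats import norm, gamma


def cornish_fisher_quantile(p, gamma1, gamma2):
    """Cornish-Fisher quantile approximation from skewness and excess kurtosis."""
    z = norm.ppf(p)                        # standard normal quantile
    h1 = (z**2 - 1.0) / 6.0                # H_2(z)/6
    h2 = (z**3 - 3.0 * z) / 24.0           # H_3(z)/24
    h3 = -(2.0 * z**3 - 5.0 * z) / 36.0    # -[2 H_3(z) + H_1(z)]/36
    return z + gamma1 * h1 + gamma2 * h2 + gamma1**2 * h3


k = 10.0                                      # Gamma shape parameter (illustrative)
p = np.array([0.01, 0.05, 0.5, 0.95, 0.99])
g1, g2 = 2.0 / np.sqrt(k), 6.0 / k            # skewness and excess kurtosis of Gamma(k)

approx = cornish_fisher_quantile(p, g1, g2)
exact = (gamma.ppf(p, a=k) - k) / np.sqrt(k)  # exact standardized Gamma(k) quantiles
print(np.column_stack([p, norm.ppf(p), approx, exact]))
```

As $ k $ grows the small parameter (of order $ 1/\sqrt{k} $) decreases, and the Cornish–Fisher values should approach the exact standardized Gamma quantiles, while the plain normal quantiles remain noticeably off in the tails.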

Concerning expansions for random variables obeying limit laws from the family of Pearson distributions see [3]. See also Random variables, transformations of.
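The Monte Carlo sketch referred to above (not from the original article; the exponential example, the sample sizes and the Kolmogorov–Smirnov comparison are illustrative assumptions) uses only the first-order term $ Q_1(X) t = -\frac{\gamma_1}{6} H_2(X) $ of (2) to normalize the standardized mean of $ n $ i.i.d. exponential variables, for which $ \gamma_1 = 2/\sqrt{n} $:

```python
# Sketch: first-order Cornish-Fisher normalization of a standardized sample mean.
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(0)
n, replications = 20, 100_000

means = rng.exponential(scale=1.0, size=(replications, n)).mean(axis=1)
x = np.sqrt(n) * (means - 1.0)            # standardized mean: zero mean, unit variance
gamma1 = 2.0 / np.sqrt(n)                 # skewness of the standardized mean
zbar = x - (gamma1 / 6.0) * (x**2 - 1.0)  # first-order term of (2): Zbar = X + Q_1(X) t

print("KS distance of X    to N(0,1):", kstest(x, "norm").statistic)
print("KS distance of Zbar to N(0,1):", kstest(zbar, "norm").statistic)
```

The reported Kolmogorov–Smirnov distance to the standard normal law should be noticeably smaller for $ \overline{Z} $ than for $ X $, in line with the statement that $ \Phi(x) $ approximates the distribution function of $ \overline{Z} $ better than it approximates $ F(x, t) $.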

References

[1] E.A. Cornish, R.A. Fisher, "Moments and cumulants in the specification of distributions", Rev. Inst. Internat. Statist., 5 (1937), pp. 307–320
[2] M.G. Kendall, A. Stuart, "The advanced theory of statistics. Distribution theory", 3. Design and analysis, Griffin (1969)
[3] L.N. Bol'shev, "Asymptotically Pearson transformations", Theor. Probab. Appl., 8 (1963), pp. 121–146; Teor. Veroyatnost. i Primenen., 8:2 (1963), pp. 129–155

Comments

For methods of using an Edgeworth expansion to obtain (2) (see also Edgeworth series), see [a1].

References

[a1] P.J. Bickel, "Edgeworth expansions in nonparametric statistics", Ann. Statist., 2 (1974), pp. 1–20
[a2] N.L. Johnson, S. Kotz, "Distributions in statistics", 1, Houghton Mifflin (1970)
How to Cite This Entry:
Cornish-Fisher expansion. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Cornish-Fisher_expansion&oldid=46519
This article was adapted from an original article by V.I. Pagurova (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article