Lévy metric


2020 Mathematics Subject Classification: Primary: 60E05

A metric $ L $ in the space $ {\mathcal F} $ of distribution functions (cf. Distribution function) of one-dimensional random variables such that:

$$ L \equiv L ( F , G ) = \inf \{ \epsilon : F ( x - \epsilon ) - \epsilon \leq G ( x) \leq F ( x + \epsilon ) + \epsilon \textrm{ for all } x \} $$

for any $ F , G \in {\mathcal F} $. It was introduced by P. Lévy (see [Le]). If between the graphs of $ F $ and $ G $ one inscribes squares with sides parallel to the coordinate axes (at points of discontinuity of a graph, vertical segments are added), then the side of the largest such square is equal to $ L $.
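
For example, let $ E $ be the distribution function that is degenerate at zero and let $ U $ be the distribution function of the uniform distribution on $ [ 0 , 1 ] $. The inequalities in the definition hold for all $ x $ precisely when $ \epsilon \geq 1/2 $, so that $ L ( E , U ) = 1/2 $; the largest inscribed square has side $ 1/2 $, with its left-hand side on the vertical segment of the graph of $ E $ at zero and its lower right-hand corner at the point $ ( 1/2 , 1/2 ) $ on the graph of $ U $.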

The Lévy metric can be regarded as a special case of the Lévy–Prokhorov metric. The definition of the Lévy metric carries over to the set $ M $ of all non-decreasing functions on $ \mathbf R ^ {1} $ (infinite values of the metric being allowed).

Most important properties of the Lévy metric.

1) The Lévy metric induces the topology of weak convergence in $ {\mathcal F} $ (cf. Distributions, convergence of). The metric space $ ( {\mathcal F} , L ) $ is separable and complete. Convergence of a sequence of functions from $ M $ in the metric $ L $ is equivalent to complete convergence.

2) If $ F \in M $ and if

$$ F _ {-1} ( x) = \inf \{ t : F ( t) > x \} , $$

then for any $ F , G \in M $,

$$ L ( F , G ) = L ( F _ {-1} , G _ {-1} ) . $$
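
Geometrically, this expresses the fact that the Lévy distance is unchanged when the graphs of $ F $ and $ G $ (completed by vertical segments at the jumps) are reflected in the line $ y = x $: this reflection interchanges a function and its inverse and maps inscribed squares to inscribed squares.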

3) Regularity of the Lévy metric: For any $ F , G , H \in {\mathcal F} $,

$$ L ( F \star H , G \star H ) \leq L ( F , G ) $$

(here $ \star $ denotes convolution, cf. Convolution of functions). A consequence of this property is semi-additivity:

$$ L ( F _ {1} \star F _ {2} , G _ {1} \star G _ {2} ) \leq L ( F _ {1} , G _ {1} ) + L ( F _ {2} , G _ {2} ) $$

and the "smoothing inequality" :

$$ L ( F , G ) \leq L ( F \star H , G \star H ) + 2 L ( E , H ) $$

($ E $ being the distribution that is degenerate at zero).
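
Semi-additivity follows from the regularity property and the triangle inequality:

$$ L ( F _ {1} \star F _ {2} , G _ {1} \star G _ {2} ) \leq L ( F _ {1} \star F _ {2} , G _ {1} \star F _ {2} ) + L ( G _ {1} \star F _ {2} , G _ {1} \star G _ {2} ) \leq L ( F _ {1} , G _ {1} ) + L ( F _ {2} , G _ {2} ) , $$

and the smoothing inequality is obtained in the same way, using $ F \star E = F $ and $ G \star E = G $.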

4) If $ \alpha _ {k} \geq 0 $, $ F _ {k} , G _ {k} \in {\mathcal F} $, then

$$ L \left ( \sum \alpha _ {k} F _ {k} , \sum \alpha _ {k} G _ {k} \right ) \leq \max \left ( 1 , \sum \alpha _ {k} \right ) \max _ {k} L ( F _ {k} , G _ {k} ) . $$

5) If $ \beta _ {r} ( F ) $, $ r > 0 $, is the absolute moment of order $ r $ of the distribution $ F $, then

$$ L ( F , E ) \leq \{ \beta _ {r} ( F ) \} ^ {r / ( r+ 1 ) } . $$

6) The Lévy metric on $ M $ is related to the integral mean metric

$$ \rho _ {1} = \rho _ {1} ( F , G ) = \int | F ( x) - G ( x) | \, dx $$

by the inequality

$$ L ^ {2} \leq \rho _ {1} . $$
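
In the example above ($ E $ degenerate at zero, $ U $ uniform on $ [ 0 , 1 ] $) one has $ \rho _ {1} ( E , U ) = \int _ {0} ^ {1} ( 1 - x ) \, dx = 1/2 $, in agreement with $ L ^ {2} ( E , U ) = 1/4 \leq 1/2 $.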

7) The Lévy metric on $ M $ is related to the uniform metric

$$ \rho = \rho ( F , G ) = \sup _ { x } | F ( x) - G ( x) | $$

by the relations

$$ \tag{* } L \leq \rho \leq L + \min \{ Q _ {F} ( L) , Q _ {G} ( L) \} , $$

where

$$ Q _ {F} ( x) = \sup _ { t } | F ( t+ x ) - F ( t) | $$

($ Q _ {F} ( x) $ is the concentration function for $ F \in {\mathcal F} $). In particular, if one of the functions, for example $ G $, has a uniformly bounded derivative, then, since $ Q _ {G} ( L ) \leq L \sup _ { x } G ^ \prime ( x) $,

$$ \rho \leq \left ( 1 + \sup _ { x } G ^ \prime ( x) \right ) L . $$

A consequence of (*) is the Pólya–Glivenko theorem on the equivalence of weak and uniform convergence in the case when the limit distribution is continuous.
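
The relations (*) can also be checked numerically. The following minimal sketch in Python approximates $ L $ directly from the definition by bisection over $ \epsilon $ and compares it with $ \rho $; the Gaussian pair $ F ( x) = \Phi ( x) $, $ G ( x) = \Phi ( x - 0.3 ) $ (with $ \Phi $ the standard normal distribution function), the grid and the tolerance are arbitrary choices made for the illustration, and all values are grid approximations.

    import math

    def Phi(x):
        # standard normal distribution function
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    F = lambda x: Phi(x)          # distribution function of N(0, 1)
    G = lambda x: Phi(x - 0.3)    # distribution function of N(0.3, 1)

    xs = [-8.0 + 0.01 * k for k in range(1601)]   # grid covering both distributions

    def levy_distance(F, G, xs, tol=1e-4):
        # smallest eps (up to tol) with F(x-eps)-eps <= G(x) <= F(x+eps)+eps on the grid
        def ok(eps):
            return all(F(x - eps) - eps <= G(x) <= F(x + eps) + eps for x in xs)
        lo, hi = 0.0, 1.0                          # eps = 1 always satisfies the inequalities
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            lo, hi = (lo, mid) if ok(mid) else (mid, hi)
        return hi

    L = levy_distance(F, G, xs)
    rho = max(abs(F(x) - G(x)) for x in xs)        # uniform metric, approximated on the grid
    sup_g = 1.0 / math.sqrt(2.0 * math.pi)         # sup of the N(0.3, 1) density = sup G'
    print(L, rho, (1.0 + sup_g) * L)               # L <= rho <= (1 + sup G') * L

For this pair the computed value of $ L $ is close to $ 0.085 $ and $ \rho $ is close to $ 0.119 $; both inequalities of (*) hold, and the estimate $ \rho \leq ( 1 + \sup _ { x } G ^ \prime ( x) ) L $ is nearly sharp.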

8) If $ F _ {a , \sigma } ( x) = F ( \sigma x + a ) $, where $ a $ and $ \sigma > 0 $ are constants, then for any $ F , G \in {\mathcal F} $,

$$ L ( \sigma F , \sigma G ) \leq \sigma L ( F _ {a , \sigma } , G _ {a , \sigma } ) $$

(in particular, the Lévy metric is invariant with respect to a shift of the distributions) and

$$ \lim\limits _ {\sigma \rightarrow 0 } L ( F _ {a , \sigma } , G _ {a , \sigma } ) = \rho ( F , G ) . $$
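
Indeed, as $ \sigma \rightarrow 0 $ the graphs of $ F _ {a , \sigma } $ and $ G _ {a , \sigma } $ are stretched out horizontally, so that only the vertical distances between them remain significant, and the side of the largest inscribed square tends to $ \sup _ {x} | F ( x) - G ( x) | = \rho ( F , G ) $.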

9) If $ f $ and $ g $ are the characteristic functions (cf. Characteristic function) corresponding to the distributions $ F $ and $ G $, then for any $ T > e $,

$$ L ( F , G ) \leq \frac{1}{\pi} \int _ { 0 } ^ { T } | f ( t) - g ( t) | \frac{dt}{t} + 2 e \frac{ \ln T }{T} . $$

The concept of the Lévy metric can be extended to the case of distributions in $ \mathbf R ^ {n} $.

References

[Le] P. Lévy, "Théorie de l'addition des variables aléatoires", Gauthier-Villars (1937)
[Z] V.M. Zolotarev, "Estimates of the difference between distributions in the Lévy metric", Proc. Steklov Inst. Math., 112 (1973) pp. 232–240; Trudy Mat. Inst. Steklov., 112 (1971) pp. 224–231
[ZS] V.M. Zolotarev, V.V. Senatov, "Two-sided estimates of Lévy's metric", Theor. Probab. Appl., 20 (1975) pp. 234–245; Teor. Veroyatnost. i Primenen., 20:2 (1975) pp. 239–250
[LO] Yu.V. Linnik, I.V. Ostrovskii, "Decomposition of random variables and vectors", Amer. Math. Soc. (1977) (Translated from Russian) MR0428382 Zbl 0358.60020

Comments

A word of warning. In the Soviet mathematical literature (and in the main article above), distribution functions are usually left continuous, whereas in the West they are right continuous. So slight changes must be made in 2) or 7).

Let $ F $ be a distribution function or, more generally, a non-decreasing left-continuous function. Then $ F $ has an at most countable set of discontinuity points. The complement of this set is called the continuity set $ C ( F ) $ of $ F $. A sequence of distribution functions $ F _ {n} $ is said to converge weakly to a distribution function $ F $ if $ F _ {n} ( x) \rightarrow F ( x) $ for every $ x $ in the continuity set $ C ( F ) $ of $ F $. The sequence converges completely if moreover $ F _ {n} ( + \infty ) \rightarrow F ( + \infty ) $ and $ F _ {n} ( - \infty ) \rightarrow F ( - \infty ) $. Cf. also Convergence of distributions and Convergence, types of.

References

[B] P. Billingsley, "Convergence of probability measures", Wiley (1968) MR0233396 Zbl 0172.21201
[HT] W. Hengartner, R. Theodorescu, "Concentration functions", Acad. Press (1973)
[Lo] M. Loève, "Probability theory", Van Nostrand (1963) pp. 178 MR0203748 Zbl 0108.14202