
Lévy metric



2020 Mathematics Subject Classification: Primary: 60E05

A metric $ L $ in the space $ {\mathcal F} $ of distribution functions (cf. Distribution function) of one-dimensional random variables such that:

$$ L \equiv L ( F , G ) = \inf \{ \epsilon : F ( x - \epsilon ) - \epsilon \leq G ( x) \leq F ( x + \epsilon ) + \epsilon \textrm{ for all } x \} $$

for any $ F , G \in {\mathcal F} $. It was introduced by P. Lévy (see [Le]). Geometrically: if one inscribes squares with sides parallel to the coordinate axes between the graphs of $ F $ and $ G $ (adding vertical segments at the points of discontinuity of a graph), then the side of the largest such square is equal to $ L $.
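The infimum in the definition can be approximated numerically: the defining condition becomes weaker as $ \epsilon $ grows, so one may test it on a finite grid of points $ x $ and bisect over $ \epsilon $. The following Python sketch illustrates this (the grid bounds, the resolution, the tolerance and the name `levy_distance` are illustrative choices, and the result is only a grid approximation of the true infimum).

```python
import math

def levy_distance(F, G, lo=-10.0, hi=10.0, n=2001, tol=1e-6):
    """Grid/bisection approximation of the Levy metric L(F, G) for two
    distribution functions given as callables (a numerical sketch, not
    an exact evaluation of the infimum)."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]

    def admissible(eps):
        # F(x - eps) - eps <= G(x) <= F(x + eps) + eps at every grid point x
        return all(F(x - eps) - eps <= G(x) <= F(x + eps) + eps for x in xs)

    a, b = 0.0, 1.0          # eps = 1 is always admissible for distribution functions
    while b - a > tol:       # bisect for the smallest admissible eps
        mid = (a + b) / 2
        if admissible(mid):
            b = mid
        else:
            a = mid
    return b

# Example: the standard normal distribution function and its shift by 0.3
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
F = lambda x: Phi(x)              # N(0, 1)
G = lambda x: Phi(x - 0.3)        # N(0.3, 1)
print(levy_distance(F, G))        # approximately 0.086 for this pair
```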

The Lévy metric can be regarded as a special case of the Lévy–Prokhorov metric. The definition of the Lévy metric carries over to the set $ M $ of all non-decreasing functions on $ \mathbf R ^ {1} $ (infinite values of the metric being allowed).

The most important properties of the Lévy metric are the following.

1) The Lévy metric induces the weak topology in $ {\mathcal F} $ (cf. Distributions, convergence of). The metric space $ ( {\mathcal F} , L ) $ is separable and complete. Convergence of a sequence of functions from $ M $ in the metric $ L $ is equivalent to complete convergence.

2) If $ F \in M $ and if

$$ F ^ {-1} ( x) = \inf \{ t : F ( t) \geq x \} , $$

then for any $ F , G \in M $,

$$ L ( F , G ) = L ( F ^ {-1} , G ^ {-1} ) . $$
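Continuing the numerical sketch above, property 2) can be illustrated by approximating the generalized inverses on a bounded interval and comparing the two distances; the truncation interval, the number of bisection steps and the grid passed to `levy_distance` are again arbitrary choices, so the two values agree only up to the discretization error.

```python
def generalized_inverse(F, lo=-10.0, hi=10.0):
    """Return a callable approximating F^{-1}(x) = inf{t : F(t) >= x}
    by bisection on [lo, hi] (assumes essentially all mass of F lies in [lo, hi])."""
    def Finv(x):
        a, b = lo, hi
        for _ in range(60):
            mid = (a + b) / 2
            if F(mid) >= x:
                b = mid
            else:
                a = mid
        return b
    return Finv

Finv, Ginv = generalized_inverse(F), generalized_inverse(G)
# Property 2): L(F, G) and L(F^{-1}, G^{-1}) agree up to discretization error
print(levy_distance(F, G), levy_distance(Finv, Ginv, lo=-1.0, hi=2.0, n=601))
```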

3) Regularity of the Lévy metric: For any $ F , G , H \in {\mathcal F} $,

$$ L ( F \star H , G \star H ) \leq L ( F , G ) $$

(here $ \star $ denotes convolution, cf. Convolution of functions). A consequence of this property is the property of semi-additivity:

$$ L ( F _ {1} \star F _ {2} , G _ {1} \star G _ {2} ) \leq L ( F _ {1} ,\ G _ {1} ) + L ( F _ {2} , G _ {2} ) $$

and the "smoothing inequality" :

$$ L ( F , G ) \leq L ( F \star H , G \times H ) + 2L ( E , H ) $$

($ E $ being a distribution that is degenerate at zero).
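For the two normal distribution functions of the sketch above, the regularity property 3) can be checked in closed form, since convolving $ \mathrm N ( 0 , 1 ) $ and $ \mathrm N ( 0.3 , 1 ) $ with $ H = \mathrm N ( 0 , 1 ) $ again gives normal distribution functions; the following lines continue that sketch.

```python
# Regularity 3) with H = N(0, 1): F * H = N(0, 2), G * H = N(0.3, 2)
FH = lambda x: Phi(x / math.sqrt(2.0))
GH = lambda x: Phi((x - 0.3) / math.sqrt(2.0))
print(levy_distance(FH, GH) <= levy_distance(F, G))   # expected: True
```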

4) If $ \alpha _ {k} \geq 0 $, $ F _ {k} , G _ {k} \in {\mathcal F} $, then

$$ L \left ( \sum \alpha _ {k} F _ {k} , \sum \alpha _ {k} G _ {k} \right ) \leq \ \max \left ( 1 , \sum \alpha _ {k} \right ) \max L ( F _ {k} , G _ {k} ) . $$

5) If $ \beta _ {r} ( F ) $, $ r > 0 $, is the absolute moment of order $ r $ of the distribution $ F $, then

$$ L ( F , E ) \leq \{ \beta _ {r} ( F ) \} ^ {1 / ( r+ 1 ) } . $$

6) The Lévy metric on $ M $ is related to the integral mean metric

$$ \rho _ {1} = \rho _ {1} ( F , G ) = \int\limits | F ( x) - G ( x) | dx $$

by the inequality

$$ L ^ {2} \leq \rho _ {1} . $$

7) The Lévy metric on $ M $ is related to the uniform metric

$$ \rho = \rho ( F , G ) = \sup _ { x } | F ( x) - G ( x) | $$

by the relations

$$ \tag{* } L \leq \rho \leq L + \min \{ Q _ {F} ( L) , Q _ {G} ( L) \} , $$

where

$$ Q _ {F} ( x) = \sup _ { t } | F ( t+ x ) - F ( t) | $$

($ Q _ {F} ( x) $ is the concentration function for $ F \in {\mathcal F} $). In particular, if one of the functions, for example $ G $, has a uniformly bounded derivative, then

$$ \rho \leq \left ( 1 + \sup _ { x } G ^ \prime ( x) \right ) L . $$

A consequence of (*) is the Pólya–Glivenko theorem on the equivalence of weak and uniform convergence in the case when the limit distribution is continuous.
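Properties 6) and 7) can likewise be checked numerically for the pair of normal distribution functions from the sketch above; the uniform and integral mean metrics are approximated on the same kind of grid, so the comparison is only approximate.

```python
# Approximate rho and rho_1 on a grid and compare with the Levy distance
xs   = [-10.0 + 20.0 * i / 4000 for i in range(4001)]
step = xs[1] - xs[0]
rho   = max(abs(F(x) - G(x)) for x in xs)           # uniform metric
rho_1 = sum(abs(F(x) - G(x)) for x in xs) * step    # integral mean metric (rectangle rule)
L     = levy_distance(F, G)
print(L <= rho, L ** 2 <= rho_1)                    # expected: True True
```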

8) If $ F _ {a , \sigma } ( x) = F ( \sigma x + a ) $, where $ a $ and $ \sigma > 0 $ are constants, then for any $ F , G \in {\mathcal F} $,

$$ L ( \sigma F , \sigma G ) \leq \sigma L ( F _ {a , \sigma } , G _ {a , \sigma } ) $$

(here $ \sigma F $ denotes the function $ x \mapsto \sigma F ( x ) $, which lies in $ M $; in particular, the Lévy metric is invariant with respect to a shift of the distributions) and

$$ \lim\limits _ {\sigma \rightarrow 0 } L ( F _ {a , \sigma } , G _ {a , \sigma } ) = \rho ( F , G ) . $$

9) If $ f $ and $ g $ are the characteristic functions (cf. Characteristic function) corresponding to the distributions $ F $ and $ G $, then for any $ T > e $,

$$ L ( F , G ) \leq \frac{1}{\pi} \int\limits _ { 0 } ^ { T } | f ( t) - g ( t) | \frac{dt}{t} + 2 e \frac{ \mathop{\rm ln} T }{T} . $$
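As a rough numerical illustration of 9), the right-hand side can be evaluated for the two normal distribution functions of the sketch above, whose characteristic functions are known explicitly; the choice $ T = 50 $ and the midpoint rule for the integral are arbitrary, and the resulting bound is far from sharp.

```python
import cmath

f = lambda t: cmath.exp(-0.5 * t * t)               # characteristic function of N(0, 1)
g = lambda t: cmath.exp(0.3j * t - 0.5 * t * t)     # characteristic function of N(0.3, 1)

def cf_bound(T, n=20000):
    """Midpoint-rule evaluation of (1/pi) * int_0^T |f(t)-g(t)|/t dt + 2e ln(T)/T
    (the integrand extends continuously to t = 0)."""
    h = T / n
    integral = sum(abs(f(h * (k + 0.5)) - g(h * (k + 0.5))) / (h * (k + 0.5)) * h
                   for k in range(n))
    return integral / math.pi + 2.0 * math.e * math.log(T) / T

print(levy_distance(F, G), cf_bound(50.0))          # L(F, G) is well below the bound
```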

The concept of the Lévy metric can be extended to the case of distributions in $ \mathbf R ^ {n} $.

References

[Le] P. Lévy, "Théorie de l'addition des variables aléatoires" , Gauthier-Villars (1937)
[Z] V.M. Zolotarev, "Estimates of the difference between distributions in the Lévy metric", Proc. Steklov Inst. Math., 112 (1973) pp. 232–240 (translated from Trudy Mat. Inst. Steklov., 112 (1971) pp. 224–231)
[ZS] V.M. Zolotarev, V.V. Senatov, "Two-sided estimates of Lévy's metric", Theor. Probab. Appl., 20 (1975) pp. 234–245 (translated from Teor. Veroyatnost. i Primenen., 20 : 2 (1975) pp. 239–250)
[LO] Yu.V. Linnik, I.V. Ostrovskii, "Decomposition of random variables and vectors" , Amer. Math. Soc. (1977) (Translated from Russian) MR0428382 Zbl 0358.60020

Comments

A word of warning. In the Soviet mathematical literature (and in the main article above), distribution functions are usually left continuous, whereas in the West they are right continuous. So slight changes must be made in 2) or 7).

Let $ F $ be a distribution function or, more generally, a non-decreasing left-continuous function. Then $ F $ has at most countably many discontinuity points. The complement of this set is called the continuity set $ C ( F ) $ of $ F $. A sequence of distribution functions $ F _ {n} $ is said to converge weakly to a distribution function $ F $ if $ F _ {n} ( x ) \rightarrow F ( x ) $ for every $ x $ in the continuity set $ C ( F ) $ of $ F $. The sequence converges completely if, moreover, $ F _ {n} ( + \infty ) \rightarrow F ( + \infty ) $ and $ F _ {n} ( - \infty ) \rightarrow F ( - \infty ) $. Cf. also Convergence of distributions and Convergence, types of.
