Chebyshev inequality in probability theory
An inequality in probability theory that gives a bound on the probability of deviation of a given random variable from its mathematical expectation in terms of its variance. Let $X$ be a random variable with finite mathematical expectation $\mathsf{E}X$ and variance $\mathsf{D}X$. Chebyshev's inequality states that for any $\epsilon > 0$ the probability of the event $\{|X - \mathsf{E}X| \geq \epsilon\}$ does not exceed $\mathsf{D}X/\epsilon^2$, or

$$\mathsf{P}\{|X - \mathsf{E}X| \geq \epsilon\} \leq \frac{\mathsf{D}X}{\epsilon^2}. \tag{1}$$
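Inequality (1) can be checked numerically. The following sketch (an illustration, not part of the original article) compares the empirical frequency of the event $\{|X - \mathsf{E}X| \geq \epsilon\}$ with the bound $\mathsf{D}X/\epsilon^2$ for a uniform variable on $[0, 1]$, where $\mathsf{E}X = 1/2$ and $\mathsf{D}X = 1/12$:

```python
import random

# Empirical check of Chebyshev's inequality for X uniform on [0, 1]:
# E X = 1/2, D X = 1/12, so P{|X - 1/2| >= eps} <= (1/12) / eps**2.
random.seed(0)
n = 100_000
eps = 0.4
samples = [random.random() for _ in range(n)]
tail_freq = sum(abs(x - 0.5) >= eps for x in samples) / n
bound = (1 / 12) / eps ** 2
# The true tail probability here is exactly 0.2, while the bound is about 0.52:
# the inequality is valid but, as noted below, often far from tight.
assert tail_freq <= bound
```

The gap between the observed frequency and the bound illustrates the remark made later in the article: the value of the inequality lies in its universality rather than its exactness.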
This inequality was discovered independently by I. Bienaymé (1853) and P.L. Chebyshev (1866). In modern literature this inequality is usually referred to as Chebyshev's inequality, possibly because the name of Chebyshev is associated with an application of it in the proof of the law of large numbers (a theorem of Chebyshev).
Chebyshev's inequality is a representative of a whole class of inequalities of this type, the simplest of which asserts that for a non-negative random variable $X$ with finite mathematical expectation $\mathsf{E}X$,

$$\mathsf{P}\{X \geq \epsilon\} \leq \frac{\mathsf{E}X}{\epsilon} \tag{2}$$

(this is sometimes called Markov's inequality). This implies an inequality for arbitrary random variables, which depends on their moments:

$$\mathsf{P}\{|X| \geq \epsilon\} \leq \frac{\mathsf{E}|X|^r}{\epsilon^r}, \qquad r > 0$$

(for $r = 2$, applied to $X - \mathsf{E}X$, this is just the Chebyshev inequality), and also the more general inequality

$$\mathsf{P}\{|X| \geq \epsilon\} \leq \frac{\mathsf{E}f(X)}{f(\epsilon)} \tag{3}$$

for a non-negative even function $f$ that is non-decreasing for positive arguments. Inequality (3) indicates a way of obtaining new inequalities of the same type, for example the exponential inequality

$$\mathsf{P}\{X \geq \epsilon\} \leq e^{-c\epsilon}\,\mathsf{E}e^{cX}, \qquad c > 0.$$
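For a concrete instance of the exponential inequality, one can take $X$ standard normal, for which $\mathsf{E}e^{cX} = e^{c^2/2}$; minimizing $e^{-c\epsilon}\,\mathsf{E}e^{cX}$ over $c > 0$ gives $c = \epsilon$ and the bound $e^{-\epsilon^2/2}$. A small numerical check (an illustration, not from the original article):

```python
import math

# Exponential inequality P{X >= eps} <= exp(-c*eps) * E exp(c*X) for a
# standard normal X, where E exp(c*X) = exp(c**2 / 2).  The optimal choice
# c = eps yields the bound exp(-eps**2 / 2).
eps = 2.0
c = eps
bound = math.exp(-c * eps + c * c / 2)            # exp(-2), about 0.135
exact_tail = 0.5 * math.erfc(eps / math.sqrt(2))  # P{X >= 2}, about 0.023
assert exact_tail <= bound
```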
It has become traditional to consider all these inequalities to be of Chebyshev type, and even to call them Chebyshev inequalities. There is a general principle for obtaining Chebyshev inequalities by imposing conditions on the moments, based on the use of the system of Chebyshev polynomials (cf. [4]). For arbitrary random variables the Chebyshev inequalities give precise and best possible bounds, but in certain concrete situations these bounds can be improved. For example, if $X$ has a unimodal distribution with mode coinciding with the mathematical expectation, then Gauss' inequality holds:

$$\mathsf{P}\{|X - \mathsf{E}X| \geq \epsilon\} \leq \frac{4}{9} \cdot \frac{\mathsf{D}X}{\epsilon^2}, \qquad \epsilon^2 \geq \frac{4}{3}\,\mathsf{D}X.$$
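The improvement can be seen for a standard normal variable, which is unimodal with mode equal to its mean (a sketch assuming the Gauss bound in the form $4\mathsf{D}X/(9\epsilon^2)$ with $\epsilon^2 \geq 4\mathsf{D}X/3$):

```python
import math

# For a standard normal X (unimodal, mode = mean, D X = 1), compare the
# exact tail P{|X| >= eps} with the Gauss bound (4/9)/eps**2 and the
# Chebyshev bound 1/eps**2, for eps satisfying eps**2 >= 4/3.
for eps in (1.2, 1.5, 2.0, 3.0):
    exact_tail = math.erfc(eps / math.sqrt(2))  # P{|X| >= eps}
    gauss_bound = 4 / (9 * eps ** 2)
    chebyshev_bound = 1 / eps ** 2
    assert exact_tail <= gauss_bound < chebyshev_bound
```

For every admissible $\epsilon$ the Gauss bound is $4/9$ of the Chebyshev bound, yet still an overestimate of the exact normal tail.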
The importance of Chebyshev's inequality in probability theory lies not so much in its exactness as in its simplicity and universality. Chebyshev's inequality and its modifications, applied to sums of random variables, played a large part in the proofs of various forms of the law of large numbers and the law of the iterated logarithm. Chebyshev's inequality for sums of independent random variables has been subject to generalization and improvement in two different directions. The first of these is connected with the transition from the Chebyshev inequality

$$\mathsf{P}\{|S_n| \geq \epsilon\} \leq \frac{\mathsf{D}S_n}{\epsilon^2}, \qquad S_n = X_1 + \dots + X_n, \quad \mathsf{E}X_k = 0,$$

to the significantly stronger inequality

$$\mathsf{P}\Bigl\{\max_{1 \leq k \leq n} |S_k| \geq \epsilon\Bigr\} \leq \frac{\mathsf{D}S_n}{\epsilon^2}$$

(cf. Kolmogorov inequality).
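The gain in passing to the maximum of the partial sums can be seen in simulation. The sketch below (an illustration, not from the original article) uses symmetric $\pm 1$ summands, so that $\mathsf{E}X_k = 0$ and $\mathsf{D}S_n = n$:

```python
import random

# Kolmogorov's inequality: P{max_k |S_k| >= eps} <= D(S_n) / eps**2, the
# same right-hand side that Chebyshev's inequality gives for S_n alone.
random.seed(1)
n, trials, eps = 50, 20_000, 12.0
exceed = 0
for _ in range(trials):
    s, running_max = 0, 0
    for _ in range(n):
        s += random.choice((-1, 1))  # E X_k = 0, D X_k = 1, so D S_n = n
        running_max = max(running_max, abs(s))
    exceed += running_max >= eps
freq = exceed / trials
bound = n / eps ** 2  # D S_n / eps**2 = 50/144
assert freq <= bound
```

The event here controls the whole trajectory $S_1, \dots, S_n$, not just the endpoint $S_n$, at no cost in the bound.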
The second direction is concerned with replacing the power decay $\epsilon^{-2}$ in Chebyshev's inequality by an exponentially decaying bound, and leads to the Bernstein–Kolmogorov inequality:

$$\mathsf{P}\{S_n \geq \epsilon\} \leq \exp\Bigl\{-\frac{\epsilon^2}{2(B_n + M\epsilon/3)}\Bigr\},$$

where $S_n = X_1 + \dots + X_n$, $\mathsf{E}X_k = 0$, $B_n = \mathsf{D}S_n$ and $|X_k| \leq M$ (cf. Bernstein inequality). Such improvements of Chebyshev inequalities are obtained under additional restrictions on the summands $X_k$.
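The difference between the two kinds of decay is easy to quantify. The sketch below (assuming the Bernstein-type form $\exp\{-\epsilon^2/2(B_n + M\epsilon/3)\}$ for centred summands bounded by $M$) compares the two bounds for a sum of $n = 1000$ uniform variables on $[-1, 1]$:

```python
import math

# Chebyshev bound D(S_n)/eps**2 versus a Bernstein-type exponential bound
# exp(-eps**2 / (2 * (B_n + M * eps / 3))) for n centred independent
# summands with |X_k| <= M and B_n = D S_n.
n, M = 1000, 1.0
B_n = n / 3          # variance of a sum of n uniform variables on [-1, 1]
eps = 60.0
chebyshev = B_n / eps ** 2                                   # about 0.093
bernstein = math.exp(-eps ** 2 / (2 * (B_n + M * eps / 3)))  # about 0.006
assert bernstein < chebyshev
```

As $\epsilon$ grows the exponential bound improves dramatically on the power bound, which is what makes such refinements useful for laws of the iterated logarithm.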
Multi-dimensional analogues of some of the inequalities stated here have been obtained (cf. [5]).
References
[1] P.L. Chebyshev, Mat. Sb., 2 (1867) pp. 1–9
[2] A.A. Markov, "Wahrscheinlichkeitsrechnung", Teubner (1912) (Translated from Russian)
[3] A.N. Kolmogorov, "Foundations of the theory of probability", Chelsea, reprint (1950) (Translated from German)
[4] S. Karlin, W.J. Studden, "Tchebycheff systems: with applications in analysis and statistics", Interscience (1966)
[5] Yu.V. Prokhorov, "Multivariate distributions: inequalities and limit theorems", J. Soviet Math., 2 (1974) pp. 475–488; Itogi Nauk. i Tekhn. Teor. Veroyatnost. Mat. Stat. Teoret. Kibernet., 10 (1972) pp. 5–24
Chebyshev inequality in probability theory. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Chebyshev_inequality_in_probability_theory&oldid=20850