Convergence in probability

From Encyclopedia of Mathematics

Convergence of a sequence of random variables $X_1, \dots, X_n, \dots$, defined on a probability space $(\Omega, \mathcal{A}, \mathsf{P})$, to a random variable $X$, defined in the following way: $X_n \to X$ in probability if for any $\epsilon > 0$,

$$\mathsf{P}\{|X_n - X| > \epsilon\} \to 0 \quad \text{as} \quad n \to \infty.$$
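The definition can be illustrated numerically (a sketch added for illustration, not part of the original entry): by the weak law of large numbers, the sample mean $S_n$ of $n$ i.i.d. Bernoulli($p$) variables converges in probability to the constant $p$, so a Monte Carlo estimate of $\mathsf{P}\{|S_n - p| > \epsilon\}$ should shrink as $n$ grows. The function name and parameters below are chosen for this example only.

```python
import random

def tail_probability(n, p=0.5, eps=0.05, trials=2000, seed=0):
    """Monte Carlo estimate of P(|S_n - p| > eps), where S_n is the
    sample mean of n i.i.d. Bernoulli(p) random variables."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        # Simulate one sample mean S_n of n Bernoulli(p) draws.
        s = sum(rng.random() < p for _ in range(n)) / n
        if abs(s - p) > eps:
            exceed += 1
    return exceed / trials

# The estimated tail probability decreases toward 0 as n increases,
# which is exactly what convergence in probability asserts.
for n in (10, 100, 1000):
    print(n, tail_probability(n))
```

For fixed $\epsilon$, the printed estimates decrease with $n$, consistent with $\mathsf{P}\{|S_n - p| > \epsilon\} \to 0$.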
In mathematical analysis, this form of convergence is called convergence in measure. Convergence in distribution follows from convergence in probability.
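The implication to convergence in distribution can be seen by a standard sandwich argument (a sketch added here, not part of the original entry): for any $\epsilon > 0$ and any real $x$,

$$F_{X_n}(x) = \mathsf{P}\{X_n \le x\} \le \mathsf{P}\{X \le x + \epsilon\} + \mathsf{P}\{|X_n - X| > \epsilon\},$$

and symmetrically $F_X(x - \epsilon) - \mathsf{P}\{|X_n - X| > \epsilon\} \le F_{X_n}(x)$. Letting $n \to \infty$ and then $\epsilon \downarrow 0$ at each continuity point $x$ of $F_X$ yields $F_{X_n}(x) \to F_X(x)$. The converse implication fails in general, although it does hold when the limit $X$ is a constant.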


See also Weak convergence of probability measures; Convergence, types of; Distributions, convergence of.

How to Cite This Entry:
Convergence in probability. Encyclopedia of Mathematics. URL:
This article was adapted from an original article by V.I. Bityutskov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article