Convergence in probability
2020 Mathematics Subject Classification: Primary: 60-01 Secondary: 28A20
Convergence of a sequence of random variables $X_1,X_2,\ldots$, defined on a probability space $(\Omega,\mathcal{F},\mathbb{P})$, to a random variable $X$ defined on the same space, in the following sense: $X_n \stackrel{\mathrm{P}}{\rightarrow} X$ if for every $\epsilon > 0$, $$ \mathbb{P}\{ |X_n-X| > \epsilon \} \rightarrow 0 \ \ \text{as}\ \ n \rightarrow \infty \ . $$
In mathematical analysis, this form of convergence is called convergence in measure. Convergence in probability implies convergence in distribution.
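As an informal illustration (not part of the original entry), the probability in the definition can be estimated by simulation. The sketch below, in Python with NumPy, takes $X_n$ to be the sample mean of $n$ i.i.d. standard normal variables and $X = 0$, so that $X_n \stackrel{\mathrm{P}}{\rightarrow} 0$ by the weak law of large numbers; the particular choice of $X_n$, the tolerance $\epsilon = 0.1$, and the number of Monte Carlo trials are assumptions made only for this example.

```python
# A minimal Monte Carlo sketch of convergence in probability (illustrative only):
# X_n = mean of n i.i.d. N(0,1) variables, X = 0. By the weak law of large
# numbers, X_n -> 0 in probability, so the estimate of P(|X_n - X| > eps)
# should shrink toward 0 as n grows.
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1          # the epsilon in the definition (chosen for illustration)
trials = 2_000     # number of simulated realisations of X_n for each n

for n in [10, 100, 1_000, 10_000]:
    # Each row is one realisation of (X_1, ..., X_n); X_n is the row mean.
    samples = rng.standard_normal((trials, n))
    x_n = samples.mean(axis=1)
    # Fraction of realisations with |X_n - X| > eps estimates the probability.
    prob = np.mean(np.abs(x_n - 0.0) > eps)
    print(f"n = {n:6d}   estimated P(|X_n - X| > {eps}) = {prob:.4f}")
```

For fixed $\epsilon$, the printed estimates decrease with $n$, which is exactly the statement $\mathbb{P}\{|X_n - X| > \epsilon\} \rightarrow 0$.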
Comments
See also Weak convergence of probability measures; Convergence, types of; Distributions, convergence of.