Convergence, almost-certain
From Encyclopedia of Mathematics
almost-sure convergence, convergence with probability one
Convergence of a sequence of random variables $X_1, X_2, \ldots$ defined on a certain probability space $(\Omega, \mathcal{F}, \mathsf{P})$, to a random variable $X$, defined in the following way: $X_n \to X$ almost certainly (or $\mathsf{P}$-almost certain) if

$$\mathsf{P}\{\omega : X_n(\omega) \to X(\omega)\} = 1.$$
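A classical instance of almost-certain convergence is the strong law of large numbers: sample means of i.i.d. random variables converge almost certainly to the expectation. The sketch below (a simulation, not part of the original entry; the function name `sample_means` is an illustrative choice) tracks one simulated path of running means of Bernoulli draws, which illustrates, though of course cannot prove, this behaviour.

```python
import random

def sample_means(p=0.5, n=10_000, seed=0):
    """Running sample means of n i.i.d. Bernoulli(p) draws.

    By the strong law of large numbers, the sequence of sample means
    converges to p almost certainly as n grows.
    """
    rng = random.Random(seed)
    total = 0
    means = []
    for k in range(1, n + 1):
        total += 1 if rng.random() < p else 0
        means.append(total / k)
    return means

means = sample_means()
print(means[-1])  # close to 0.5 for large n
```

Almost-certain convergence is a statement about whole sample paths ($\omega$-by-$\omega$), which is why a single simulated trajectory, rather than a distributional summary, is the natural illustration.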
In mathematical analysis this form of convergence is called almost-everywhere convergence. Convergence in probability follows from almost-certain convergence.
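A short sketch of why convergence in probability follows (a standard argument, not spelled out in the original entry), written in LaTeX:

```latex
% Fix \epsilon > 0 and set A_n = \{\sup_{m \ge n} |X_m - X| > \epsilon\}.
% The events A_n decrease, and their intersection is contained in the
% null set \{X_n \not\to X\}; by continuity of \mathsf{P} from above,
% \mathsf{P}(A_n) \to 0. Hence
\[
  \mathsf{P}\{\, |X_n - X| > \epsilon \,\}
  \;\le\;
  \mathsf{P}\Bigl\{\, \sup_{m \ge n} |X_m - X| > \epsilon \,\Bigr\}
  \;\longrightarrow\; 0 .
\]
```

The converse fails in general: a sequence can converge in probability without converging almost certainly (the classical counterexample is the "typewriter" sequence of shrinking, sliding indicator functions on $[0,1]$).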
Comments
See also Convergence, types of; Weak convergence of probability measures; Distributions, convergence of.
How to Cite This Entry:
Convergence, almost-certain. V.I. Bityutskov (originator), Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Convergence,_almost-certain&oldid=15210
This text originally appeared in Encyclopedia of Mathematics - ISBN 1402006098
