
Markov moment



Markov time; stopping time

2020 Mathematics Subject Classification: Primary: 60G40 [MSN][ZBL]

A notion used in probability theory for random variables that do not depend on the "future". More precisely, let $ ( \Omega , {\mathcal F} ) $ be a measurable space with a non-decreasing family $ ( {\mathcal F} _ {t} ) $, $ t \in T $, of sub-$ \sigma $-algebras of $ {\mathcal F} $ ($ T = [ 0 , \infty ) $ in the case of continuous time and $ T = \{ 0 , 1 ,\dots \} $ in the case of discrete time). A random variable $ \tau = \tau ( \omega ) $ with values in $ T \cup \{ + \infty \} $ is called a Markov moment or Markov time (relative to the family $ ( {\mathcal F} _ {t} ) $, $ t \in T $) if for each $ t \in T $ the event $ \{ \tau ( \omega ) \leq t \} $ belongs to $ {\mathcal F} _ {t} $. In the case of discrete time this is equivalent to requiring that for each $ n \in \{ 0 , 1 ,\dots \} $ the event $ \{ \tau ( \omega ) = n \} $ belongs to $ {\mathcal F} _ {n} $.
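
In the case of discrete time the equivalence of the two conditions follows directly from the identities

$$ \{ \tau = n \} = \{ \tau \leq n \} \setminus \{ \tau \leq n - 1 \} ,\ \ \{ \tau \leq n \} = \bigcup _ { k= 0} ^ { n } \{ \tau = k \} $$

(with $ \{ \tau \leq - 1 \} = \emptyset $), since the family $ ( {\mathcal F} _ {n} ) $ is non-decreasing and each $ {\mathcal F} _ {n} $ is closed under finite unions and differences.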

Examples.

1) Let $ X ( t) $, $ t \in T $, be a real-valued right-continuous random process given on $ ( \Omega , {\mathcal F} ) $, and let $ {\mathcal F} _ {t} = \sigma \{ X ( s) : s \leq t \} $. Then the random variables

$$ \tau ( \omega ) = \inf \{ {t \geq 0 } : {X ( t) \in B } \} $$

and

$$ \sigma ( \omega ) = \inf \{ {t > 0 } : {X ( t) \in B } \} , $$

that is, the time of first hitting and the time of first hitting after $ 0 + $ of the (Borel) set $ B $, are Markov moments (with the convention $ \inf \emptyset = \infty $ when the corresponding set is empty).
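
In the discrete-time case, for instance, the fact that $ \tau $ is a Markov moment can be verified directly:

$$ \{ \tau \leq n \} = \bigcup _ { k= 0} ^ { n } \{ X ( k) \in B \} \in {\mathcal F} _ {n} , $$

since each event $ \{ X ( k) \in B \} $, $ k \leq n $, belongs to $ \sigma \{ X ( s) : s \leq n \} = {\mathcal F} _ {n} $. In continuous time the corresponding assertion uses the right-continuity of $ X ( t) $ and additional conditions on the family $ ( {\mathcal F} _ {t} ) $; cf. [BG], [Dy].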

2) If $ w ( t) $, $ t \geq 0 $, is a standard Wiener process, then the Markov moment

$$ \tau = \inf \{ {t \geq 0 } : {w ( t) \geq a } \} ,\ a > 0 , $$

has probability density

$$ P ( t) = \frac{a}{t ^ {3/2} \sqrt {2 \pi } } e ^ {- a ^ {2} / ( 2 t) } . $$

Here $ {\mathsf P} \{ \tau < \infty \} = 1 $, but $ {\mathsf E} \tau = \infty $.
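
One way to obtain this density is from the reflection principle for the Wiener process, which gives

$$ {\mathsf P} \{ \tau \leq t \} = 2 {\mathsf P} \{ w ( t) \geq a \} = \sqrt { \frac{2}{\pi } } \int\limits _ {a / \sqrt t } ^ \infty e ^ {- u ^ {2} / 2 } \, d u ; $$

differentiation with respect to $ t $ then yields the formula for $ P ( t) $. The divergence of $ {\mathsf E} \tau $ follows since $ t P ( t) \sim a / \sqrt {2 \pi t } $ as $ t \rightarrow \infty $, which is not integrable at infinity.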

3) The random variable

$$ \gamma = \inf \{ {t > 0 } : {X ( s) \in B , s \geq t } \} , $$

being the first time after which $ X ( t) $ remains in $ B $, is an example of a random variable that is not, in general, a Markov moment (it depends on the "future" ): whether the event $ \{ \gamma \leq t \} $ has occurred cannot be decided from the values $ X ( s) $, $ s \leq t $, alone.

Using the idea of a Markov moment one can formulate the strong Markov property of Markov processes (cf. Markov process). Markov moments and stopping times (that is, finite Markov moments) play a major role in the general theory of random processes and statistical sequential analysis.
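
In such formulations one associates with a Markov moment $ \tau $ the $ \sigma $-algebra of events observed up to time $ \tau $,

$$ {\mathcal F} _ \tau = \{ A \in {\mathcal F} : A \cap \{ \tau \leq t \} \in {\mathcal F} _ {t} \textrm{ for all } t \in T \} , $$

and, in one common formulation, the strong Markov property of a homogeneous Markov process asserts that on $ \{ \tau < \infty \} $

$$ {\mathsf E} [ f ( X ( \tau + s ) ) \mid {\mathcal F} _ \tau ] = {\mathsf E} _ {X ( \tau ) } [ f ( X ( s ) ) ] $$

for every $ s \geq 0 $ and every bounded measurable function $ f $ (see Markov process for precise statements).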

References

[GS] I.I. Gihman, A.V. Skorohod, "The theory of stochastic processes" , 2 , Springer (1975) (Translated from Russian) MR0375463 Zbl 0305.60027

Comments

References

[BG] R.M. Blumenthal, R.K. Getoor, "Markov processes and potential theory" , Acad. Press (1968) MR0264757 Zbl 0169.49204
[Do] J.L. Doob, "Classical potential theory and its probabilistic counterpart" , Springer (1984) pp. 390 MR0731258 Zbl 0549.31001
[Dy] E.B. Dynkin, "Markov processes" , 1 , Springer (1965) (Translated from Russian) MR0193671 Zbl 0132.37901
[W] A.D. Wentzell, "A course in the theory of stochastic processes" , McGraw-Hill (1981) (Translated from Russian) MR0781738 MR0614594 Zbl 0502.60001
[B] L.P. Breiman, "Probability" , Addison-Wesley (1968) MR0229267 Zbl 0174.48801
This article was adapted from an original article by A.N. Shiryaev (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.