Markov time; stopping time
2020 Mathematics Subject Classification: Primary: 60G40 [MSN][ZBL]
A notion used in probability theory for random variables having the property of independence of the "future". More precisely, let $(\Omega, {\mathcal F})$ be a measurable space with a non-decreasing family $({\mathcal F}_{t})$, $t \in T$, of sub-$\sigma$-algebras of ${\mathcal F}$ ($T = [0, \infty)$ in the case of continuous time and $T = \{0, 1, \dots\}$ in the case of discrete time). A random variable $\tau = \tau(\omega)$ with values in $T \cup \{+\infty\}$ is called a Markov moment or Markov time (relative to the family $({\mathcal F}_{t})$, $t \in T$) if for each $t \in T$ the event $\{\tau(\omega) \leq t\}$ belongs to ${\mathcal F}_{t}$. In the case of discrete time this is equivalent to saying that for any $n \in \{0, 1, \dots\}$ the event $\{\tau(\omega) = n\}$ belongs to ${\mathcal F}_{n}$.
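The equivalence of the two conditions in the discrete-time case follows from the identities
$$ \{ \tau = n \} = \{ \tau \leq n \} \setminus \{ \tau \leq n - 1 \} , \qquad \{ \tau \leq n \} = \bigcup_{k=0}^{n} \{ \tau = k \} , $$
together with the inclusions ${\mathcal F}_{k} \subseteq {\mathcal F}_{n}$ for $k \leq n$.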
Examples.
1) Let $X(t)$, $t \in T$, be a real-valued right-continuous random process given on $(\Omega, {\mathcal F})$, and let ${\mathcal F}_{t} = \sigma \{ \omega : X(s),\ s \leq t \}$. Then the random variables
$$ \tau (\omega) = \inf \{ t \geq 0 : X(t) \in B \} $$
and
$$ \sigma (\omega) = \inf \{ t > 0 : X(t) \in B \} , $$
that is, the first and the first after $+0$ times of hitting the (Borel) set $B$, are Markov moments (in the case $\{\cdot\} = \emptyset$ it is assumed that $\inf \emptyset = \infty$).
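In discrete time the defining condition has a simple computational reading: whether $\tau \leq n$ holds can be decided from the path up to time $n$ alone. The following minimal Python sketch (not part of the article; the random-walk path, the function names and the choice $B = [2, \infty)$ are illustrative assumptions) computes such a first hitting time and uses the convention $\inf \emptyset = \infty$ when the path never enters $B$.

import numpy as np

# Minimal sketch: first hitting time of a set B along a discrete-time path
# X(0), X(1), ..., X(N-1).  Whether tau = n is decided from X(0), ..., X(n)
# only, which is the defining property of a Markov moment.
def first_hitting_time(path, in_B):
    for n, x in enumerate(path):
        if in_B(x):
            return n               # tau = n: decided without looking ahead
    return float('inf')            # inf over the empty set: the path never hits B

# Hypothetical example: a simple +1/-1 random walk and B = [2, infinity).
rng = np.random.default_rng(0)
walk = np.cumsum(rng.choice([-1, 1], size=50))
print(first_hitting_time(walk, lambda x: x >= 2))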
2) If $w(t)$, $t \geq 0$, is a standard Wiener process, then the Markov moment
$$ \tau = \inf \{ t \geq 0 : w(t) \geq a \} , \qquad a > 0 , $$
has probability density
$$ p(t) = \frac{a}{t^{3/2} \sqrt{2 \pi}} e^{-a^{2}/(2t)} . $$
Here ${\mathsf P} \{ \tau < \infty \} = 1$, but ${\mathsf E} \tau = \infty$.
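A small simulation can illustrate Example 2. The Python sketch below (an illustrative assumption, not part of the article; it relies on NumPy) simulates discretized Brownian paths and compares the empirical fraction of paths that reach the level $a$ by a finite horizon $T$ with $\int_{0}^{T} p(t)\, dt = \mathop{\mathrm{erfc}} ( a / \sqrt{2T} )$, the value given by the reflection principle; the paths that have not yet reached $a$ reflect the heavy tail responsible for ${\mathsf E} \tau = \infty$.

import numpy as np
from math import erfc, sqrt

# Minimal Monte Carlo sketch for Example 2 (illustrative, not from the article):
# estimate P(tau <= T) for tau = inf{t >= 0 : w(t) >= a} by simulating Brownian
# paths on a time grid; the grid slightly underestimates the continuous-time
# probability, since the path can cross the level between grid points.
rng = np.random.default_rng(0)
a, T, n_steps, n_paths = 1.0, 10.0, 1_000, 5_000
dt = T / n_steps

increments = rng.normal(0.0, sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)        # w(dt), w(2*dt), ..., w(T)
hit_by_T = (paths >= a).any(axis=1)          # did the path reach level a by time T?

print("Monte Carlo estimate of P(tau <= T):", hit_by_T.mean())
print("Integral of p(t) over [0, T]:       ", erfc(a / sqrt(2 * T)))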
3) The random variable
$$ \gamma = \inf \{ t > 0 : X(s) \in B ,\ s \geq t \} , $$
being the first time after which $X(t)$ remains in $B$, is an example of a non-Markov moment (a random variable depending on the "future").
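Indeed, for each $t \geq 0$ the event $\{\gamma \leq t\}$ can be written as
$$ \{ \gamma \leq t \} = \{ \omega : X(s) \in B \textrm{ for all } s > t \} , $$
so it is determined by the behaviour of the process after time $t$ and in general does not belong to ${\mathcal F}_{t}$.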
Using the idea of a Markov moment one can formulate the strong Markov property of Markov processes (cf. Markov process). Markov moments and stopping times (that is, finite Markov moments) play a major role in the general theory of random processes and statistical sequential analysis.
References
[GS] I.I. Gihman, A.V. Skorohod, "The theory of stochastic processes", 2, Springer (1975) (Translated from Russian) MR0375463 Zbl 0305.60027
Comments
References
[BG] R.M. Blumenthal, R.K. Getoor, "Markov processes and potential theory", Acad. Press (1968) MR0264757 Zbl 0169.49204
[Do] J.L. Doob, "Classical potential theory and its probabilistic counterpart", Springer (1984) pp. 390 MR0731258 Zbl 0549.31001
[Dy] E.B. Dynkin, "Markov processes", 1, Springer (1965) (Translated from Russian) MR0193671 Zbl 0132.37901
[W] A.D. Wentzell, "A course in the theory of stochastic processes", McGraw-Hill (1981) (Translated from Russian) MR0781738 MR0614594 Zbl 0502.60001
[B] L.P. Breiman, "Probability", Addison-Wesley (1968) MR0229267 Zbl 0174.48801