Markov property

From Encyclopedia of Mathematics
Revision as of 21:26, 1 January 2021


for a real-valued stochastic process $X(t)$, $t \in T \subset \mathbf{R}$

2020 Mathematics Subject Classification: Primary: 60Jxx

The property that for any collection of times $t_1 < \dots < t_{n+1}$ from $T$ and any Borel set $B$,

$$ \tag{*} {\mathsf P} \{ X(t_{n+1}) \in B \mid X(t_n), \dots, X(t_1) \} = {\mathsf P} \{ X(t_{n+1}) \in B \mid X(t_n) \} $$
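As an illustration (not part of the original article), property (*) can be checked empirically for a simple two-state chain: Monte-Carlo estimates of the conditional probability given the whole past and given the present alone should agree up to sampling error. The chain, its transition matrix and all function names below are hypothetical, chosen only for the sketch.

```python
import random

# Transition matrix of a hypothetical two-state chain on {0, 1};
# row x gives the distribution of the next state given the current state x.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def simulate(n_steps, seed):
    """Simulate one trajectory (X(1), ..., X(n_steps)), started at state 0."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n_steps - 1):
        x = 0 if rng.random() < P[x][0] else 1
        path.append(x)
    return path

def cond_prob(paths, t, b, past):
    """Estimate P{X(t+1) = b | (X(t-k+1), ..., X(t)) = past}, k = len(past).

    Times are 1-based, so X(t) is stored at index t - 1."""
    hits = [p for p in paths if tuple(p[t - len(past):t]) == past]
    return sum(p[t] == b for p in hits) / len(hits)

paths = [simulate(5, seed=s) for s in range(200_000)]
# Property (*): conditioning on the whole past should agree, up to
# Monte-Carlo error, with conditioning on the present alone.
p_full = cond_prob(paths, t=3, b=1, past=(0, 0))   # given X(2) = 0, X(3) = 0
p_markov = cond_prob(paths, t=3, b=1, past=(0,))   # given X(3) = 0 only
```

Both estimates approximate the one-step transition probability $P(0, 1)$, so their near-equality is exactly the content of (*) for this chain.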

with probability 1; that is, the conditional probability distribution of $ X(t_{n+1}) $ given $ X(t_n), \dots, X(t_1) $ coincides (almost surely) with the conditional distribution of $ X(t_{n+1}) $ given $ X(t_n) $ alone. This can be interpreted as independence of the "future" $ X(t_{n+1}) $ and the "past" $ ( X(t_{n-1}), \dots, X(t_1) ) $ given the fixed "present" $ X(t_n) $. Stochastic processes satisfying property (*) are called Markov processes (cf. Markov process). Under certain additional assumptions the Markov property has a stronger version, known as the "strong Markov property".

In discrete time $ T = \{ 1, 2, \dots \} $ the strong Markov property, which always holds for (Markov) sequences satisfying (*), means that for each stopping time $ \tau $ (relative to the family of $ \sigma $-algebras $ ( F_n, n \geq 1 ) $, where $ F_n = \sigma \{ X(1), \dots, X(n) \} $), with probability one

$$ {\mathsf P} \{ X(\tau + 1) \in B \mid X(\tau), \dots, X(1) \} = {\mathsf P} \{ X(\tau + 1) \in B \mid X(\tau) \} . $$
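The discrete-time statement can likewise be illustrated numerically (a hypothetical sketch, not from the article): for a two-state chain, take $\tau$ to be the first hitting time of state 1, a stopping time since $\{\tau = n\}$ is determined by $X(1), \dots, X(n)$, and compare conditioning on $X(\tau)$ with conditioning on the entire pre-$\tau$ history.

```python
import random

# Hypothetical two-state chain used only for this illustration.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def run(seed):
    """Run the chain from state 0 until tau (first visit to 1), plus one step."""
    rng = random.Random(seed)
    x, path = 0, [0]
    while x == 0:                                # stop at the first visit to 1
        x = 0 if rng.random() < P[x][0] else 1
        path.append(x)
    nxt = 0 if rng.random() < P[x][0] else 1     # observe X(tau + 1)
    return path, nxt

samples = [run(s) for s in range(100_000)]
# P{X(tau+1) = 1 | X(tau)}: here X(tau) = 1 by construction, so this is the
# overall frequency of nxt == 1, which should be close to P(1, 1).
p_present = sum(nxt for _, nxt in samples) / len(samples)
# Conditioning additionally on the whole pre-tau history (here tau = 3,
# i.e. the path (0, 0, 1)) must not change the answer.
sub = [nxt for path, nxt in samples if path == [0, 0, 1]]
p_past = sum(sub) / len(sub)
```

The agreement of the two estimates, for every fixed pre-$\tau$ history, is what the displayed identity asserts.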

References

[GS] I.I. Gihman, A.V. Skorohod, "The theory of stochastic processes", 2, Springer (1975) (Translated from Russian) MR0375463 Zbl 0305.60027

Comments

References

[C] K.L. Chung, "Markov chains with stationary transition probabilities", Springer (1960) MR0116388 Zbl 0092.34304
[Do] J.L. Doob, "Stochastic processes", Wiley (1953) MR1570654 MR0058896 Zbl 0053.26802
[Dy] E.B. Dynkin, "Markov processes", 1, Springer (1965) (Translated from Russian) MR0193671 Zbl 0132.37901
[K] T.G. Kurtz, "Markov processes", Wiley (1986) MR0838085 Zbl 0592.60049
[F] W. Feller, "An introduction to probability theory and its applications", 1–2, Wiley (1966) MR0210154 Zbl 0138.10207
[Le] P. Lévy, "Processus stochastiques et mouvement Brownien", Gauthier-Villars (1965) MR0190953 Zbl 0137.11602
[Lo] M. Loève, "Probability theory", II, Springer (1978) MR0651017 MR0651018 Zbl 0385.60001
How to Cite This Entry:
Markov property. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_property&oldid=23628
This article was adapted from an original article by A.N. Shiryaev (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article