Markov property
2020 Mathematics Subject Classification: Primary: 60Jxx [MSN][ZBL]

for a real-valued stochastic process $X(t)$, $t \in T \subseteq \mathbb{R}$,
The property that for any set of times $t_1 < \dots < t_n$ from $T$ and any Borel set $B$,

$$\mathsf{P}\{X(t_n) \in B \mid X(t_1), \dots, X(t_{n-1})\} = \mathsf{P}\{X(t_n) \in B \mid X(t_{n-1})\} \tag{*}$$
with probability 1, that is, the conditional probability distribution of $X(t_n)$ given $X(t_1), \dots, X(t_{n-1})$ coincides (almost surely) with the conditional distribution of $X(t_n)$ given $X(t_{n-1})$ alone. This can be interpreted as independence of the "future" $X(t_n)$ and the "past" $(X(t_1), \dots, X(t_{n-2}))$, given the fixed "present" $X(t_{n-1})$. Stochastic processes satisfying the property (*) are called Markov processes (cf. Markov process). The Markov property has (under certain additional assumptions) a stronger version, known as the "strong Markov property". In discrete time ($T = \{1, 2, \dots\}$) the strong Markov property, which always holds for (Markov) sequences satisfying (*), means that for each stopping time $\tau$ (relative to the family of $\sigma$-algebras $\mathcal{F}_n = \sigma\{X(1), \dots, X(n)\}$, $n \in T$), with probability one

$$\mathsf{P}\{X(\tau+1) \in B \mid X(1), \dots, X(\tau)\} = \mathsf{P}\{X(\tau+1) \in B \mid X(\tau)\}.$$
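A standard illustration (the increments $\xi_i$, the level $a$ and the time $\tau$ below are introduced only for this example) is the simple random walk in discrete time: put $X(n) = \xi_1 + \dots + \xi_n$, where $\xi_1, \xi_2, \dots$ are independent and identically distributed random variables. Since $\xi_{n+1}$ is independent of $(X(1), \dots, X(n))$, conditioning on the whole past gives no more information about $X(n+1) = X(n) + \xi_{n+1}$ than conditioning on $X(n)$ alone:

$$\mathsf{P}\{X(n+1) \in B \mid X(1), \dots, X(n)\} = \mathsf{P}\{x + \xi_{n+1} \in B\}\big|_{x = X(n)} = \mathsf{P}\{X(n+1) \in B \mid X(n)\},$$

which is (*). For the strong Markov property, a typical stopping time is the first passage time $\tau = \min\{n \colon X(n) \geq a\}$ of a level $a$: on the event $\{\tau < \infty\}$ the sequence $(X(\tau+1), X(\tau+2), \dots)$ behaves, given $X(\tau)$, as a new random walk started at $X(\tau)$, independently of $(X(1), \dots, X(\tau))$.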
References
[1] I.I. Gihman, A.V. Skorohod, "The theory of stochastic processes", 2, Springer (1975) (Translated from Russian) MR0375463 Zbl 0305.60027
Comments
References
[a1] K.L. Chung, "Markov chains with stationary transition probabilities", Springer (1960) MR0116388 Zbl 0092.34304
[a2] J.L. Doob, "Stochastic processes", Wiley (1953) MR1570654 MR0058896 Zbl 0053.26802
[a3] E.B. Dynkin, "Markov processes", 1, Springer (1965) (Translated from Russian) MR0193671 Zbl 0132.37901
[a4] S.N. Ethier, T.G. Kurtz, "Markov processes", Wiley (1986) MR0838085 Zbl 0592.60049
[a5] W. Feller, "An introduction to probability theory and its applications", 1–2, Wiley (1966) MR0210154 Zbl 0138.10207
[a6] P. Lévy, "Processus stochastiques et mouvement Brownien", Gauthier-Villars (1965) MR0190953 Zbl 0137.11602
[a7] M. Loève, "Probability theory", II, Springer (1978) MR0651017 MR0651018 Zbl 0385.60001
Markov property. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_property&oldid=21658