Markov chain, generalized
2020 Mathematics Subject Classification: Primary: 60J10 [MSN][ZBL]
A sequence of random variables $\xi_n$ with the properties:
1) the set of values of each $\xi_n$ is finite or countable;
2) for any $n$ and any $i_0, \dots, i_n$,
$$ \tag{*} {\mathsf P} \{ \xi_n = i_n \mid \xi_0 = i_0, \dots, \xi_{n-s} = i_{n-s}, \dots, \xi_{n-1} = i_{n-1} \} = {\mathsf P} \{ \xi_n = i_n \mid \xi_{n-s} = i_{n-s}, \dots, \xi_{n-1} = i_{n-1} \} . $$
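For example, taking $s = 2$ and $n = 3$ in (*) gives
$$ {\mathsf P} \{ \xi_3 = i_3 \mid \xi_0 = i_0, \xi_1 = i_1, \xi_2 = i_2 \} = {\mathsf P} \{ \xi_3 = i_3 \mid \xi_1 = i_1, \xi_2 = i_2 \} , $$
i.e. the conditional distribution of $\xi_3$ depends on the preceding history only through the two most recent values.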
A generalized Markov chain satisfying (*) is called $s$-generalized. For $s = 1$, (*) is the usual Markov property. The study of $s$-generalized Markov chains can be reduced to the study of ordinary Markov chains. Consider the sequence of random variables $\eta_n$ whose values are in one-to-one correspondence with the values of the vector
$$ ( \xi_{n-s+1} , \xi_{n-s+2} , \dots, \xi_n ) . $$
The sequence $\eta_n$ forms an ordinary Markov chain.
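For instance, a sketch of the reduction in the simplest non-trivial case $s = 2$: one may take $\eta_n = ( \xi_{n-1} , \xi_n )$, and then, by (*),
$$ {\mathsf P} \{ \eta_{n+1} = ( j ^ \prime , k ) \mid \eta_n = ( i , j ) , \eta_{n-1} , \dots \} = \begin{cases} {\mathsf P} \{ \xi_{n+1} = k \mid \xi_{n-1} = i , \xi_n = j \} , & j ^ \prime = j , \\ 0 , & j ^ \prime \neq j , \end{cases} $$
so the conditional distribution of $\eta_{n+1}$ given the past depends only on the current value $\eta_n$, which is exactly the ordinary Markov property.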
References
[D] J.L. Doob, "Stochastic processes", Wiley (1953) MR1570654 MR0058896 Zbl 0053.26802
Comments
References
[F] D. Freedman, "Markov chains", Holden-Day (1975) MR0686269 MR0681291 MR0556418 MR0428472 MR0292176 MR0237001 MR0211464 MR0164375 MR0158435 MR0152015 Zbl 0501.60071 Zbl 0501.60069 Zbl 0426.60064 Zbl 0325.60059 Zbl 0322.60057 Zbl 0212.49801 Zbl 0129.30605
[KS] J.G. Kemeny, J.L. Snell, "Finite Markov chains", v. Nostrand (1960) MR1531032 MR0115196 Zbl 0089.13704
[Re] D. Revuz, "Markov chains", North-Holland (1975) MR0415773 Zbl 0332.60045
[Ro] V.I. Romanovsky, "Discrete Markov chains", Wolters-Noordhoff (1970) (Translated from Russian) MR0266312 Zbl 0201.20002
[S] E. Seneta, "Non-negative matrices and Markov chains", Springer (1981) MR2209438 Zbl 0471.60001
[BF] A. Blanc-Lapierre, R. Fortet, "Theory of random functions", 1–2, Gordon & Breach (1965–1968) (Translated from French) Zbl 0185.44502 Zbl 0159.45802