# Markov chain, generalized

2010 Mathematics Subject Classification: Primary: 60J10 [MSN][ZBL]

A sequence of random variables $\xi _ {n}$ with the properties:

1) the set of values of each $\xi _ {n}$ is finite or countable;

2) for any $n$ and any $i_0, \dots, i_n$,

$$\tag{*} {\mathsf P} \{ \xi_n = i_n \mid \xi_0 = i_0, \dots, \xi_{n-s} = i_{n-s}, \dots, \xi_{n-1} = i_{n-1} \} = {\mathsf P} \{ \xi_n = i_n \mid \xi_{n-s} = i_{n-s}, \dots, \xi_{n-1} = i_{n-1} \}.$$

A generalized Markov chain satisfying (*) is called $s$-generalized. For $s = 1$, (*) is the usual Markov property. The study of $s$-generalized Markov chains can be reduced to the study of ordinary Markov chains: consider the sequence of random variables $\eta_n$ whose values are in one-to-one correspondence with the values of the vector

$$( \xi_{n-s+1}, \xi_{n-s+2}, \dots, \xi_n ).$$

The sequence $\eta _ {n}$ forms an ordinary Markov chain.
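The reduction can be illustrated with a small sketch. Assume a hypothetical $2$-generalized chain on the states $\{0, 1\}$, with made-up conditional probabilities $p_2[(a, b)][c] = {\mathsf P}\{\xi_n = c \mid \xi_{n-2} = a,\ \xi_{n-1} = b\}$; the code below builds the transition matrix of the ordinary chain $\eta_n = (\xi_{n-1}, \xi_n)$ on pairs of states:

```python
from itertools import product

# Hypothetical order-2 (s = 2) chain on the states {0, 1}.
# p2[(a, b)][c] = P(xi_n = c | xi_{n-2} = a, xi_{n-1} = b); the numbers
# are illustrative assumptions, not taken from the article.
p2 = {
    (0, 0): {0: 0.9, 1: 0.1},
    (0, 1): {0: 0.5, 1: 0.5},
    (1, 0): {0: 0.3, 1: 0.7},
    (1, 1): {0: 0.2, 1: 0.8},
}

# Transition probabilities of the ordinary chain eta_n = (xi_{n-1}, xi_n).
# A step (a, b) -> (b', c) is possible only when b' == b, in which case it
# has probability p2[(a, b)][c]; all other transitions have probability 0.
states = list(product([0, 1], repeat=2))
P = {
    (a, b): {(b, c): p2[(a, b)][c] for c in (0, 1)}
    for (a, b) in states
}

# Each row sums to 1, so P is a genuine stochastic matrix and eta_n is an
# ordinary Markov chain carrying the same information as xi_n.
for row in P.values():
    assert abs(sum(row.values()) - 1.0) < 1e-12
```

The next value $\xi_n$ depends only on the current pair $\eta_{n-1} = (\xi_{n-2}, \xi_{n-1})$, so $\eta_n$ satisfies the ordinary ($s = 1$) Markov property; the same construction works for any $s$, with $\eta_n$ ranging over $s$-tuples.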

How to Cite This Entry:
Markov chain, generalized. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_chain,_generalized&oldid=47768
This article was adapted from an original article by V.P. Chistyakov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article