Markov chain, class of positive states of a

From Encyclopedia of Mathematics
Revision as of 17:06, 7 February 2011 by (talk) (Importing text file)

A set $K$ of states of a homogeneous Markov chain $\xi(t)$ with state space $S$ such that the transition probabilities

$$p_{ij}(t) = \mathsf{P}\{\xi(t) = j \mid \xi(0) = i\}$$

of $\xi(t)$ satisfy

$$p_{ij}(t) > 0$$

for any $i, j \in K$ and some $t = t(i,j) > 0$, and

$$\mathsf{E}\tau_i < \infty,$$

where $\tau_i$ is the return time to the state $i$:

$$\tau_i = \min\{t > 0 : \xi(t) = i \mid \xi(0) = i\}$$

for a discrete-time Markov chain, and

$$\tau_i = \inf\{t > 0 : \xi(t) = i,\ \xi(s) \neq i \text{ for some } s \in (0, t) \mid \xi(0) = i\}$$

for a continuous-time Markov chain. When $\mathsf{E}\tau_i = \infty$, $K$ is called a zero class of states (class of zero states).
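As a concrete illustration (the two-state chain and its parameters below are an assumed example, not part of the article), the condition $\mathsf{E}\tau_i < \infty$ can be checked by first-step analysis:

```python
from fractions import Fraction

# Assumed example: a two-state chain on {0, 1} with transition matrix
# P = [[1-a, a], [b, 1-b]], 0 < a, b <= 1.
a, b = Fraction(1, 3), Fraction(1, 4)

# First-step analysis: the expected hitting time h1 of state 0 from
# state 1 satisfies h1 = 1 + (1-b)*h1, hence h1 = 1/b.
h1 = 1 / b

# Return time to 0: one step, then either already back (prob. 1-a)
# or at state 1 (prob. a), whence h1 further steps on average.
E_tau_0 = 1 + a * h1

# E tau_0 is finite, so {0, 1} is a positive class; the closed form
# here is (a+b)/b.
assert E_tau_0 == (a + b) / b
print(E_tau_0)  # 7/3
```

Since $a, b > 0$, both states communicate and both expected return times are finite, so the whole state space is a single positive class.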

States in the same positive class $K$ have a number of common properties. For example, in the case of discrete time, for any $i, j \in K$ the limit relation

$$\lim_{n \to \infty} \frac{1}{n} \sum_{m=1}^{n} p_{ij}(m) = \frac{1}{\mathsf{E}\tau_j} > 0$$

holds; if

$$d_i = \gcd\{n \geq 1 : p_{ii}(n) > 0\}$$

is the period of state $i$, then $d_i = d_j = d(K)$ for any $i, j \in K$, and $d(K)$ is called the period of the class $K$; for any $i \in K$ the limit relation

$$\lim_{n \to \infty} p_{ii}(n\,d(K)) = \frac{d(K)}{\mathsf{E}\tau_i}$$

holds. A discrete-time Markov chain all of whose states form a single positive class of period 1 serves as an example of an ergodic Markov chain (cf. Markov chain, ergodic).
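The limit relations can be observed numerically. The following sketch (again with an assumed two-state chain, not taken from the article) powers the transition matrix and compares $p_{ij}(n)$ for large $n$ with $1/\mathsf{E}\tau_j$; the chain has period 1, so the plain limit exists:

```python
# Assumed example: P = [[2/3, 1/3], [1/4, 3/4]] forms one positive
# class of period 1, so p_ij(n) -> 1 / E tau_j as n -> infinity.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[2/3, 1/3], [1/4, 3/4]]
Pn = P
for _ in range(200):          # compute P^201, far into the limit regime
    Pn = matmul(Pn, P)

# The stationary distribution is pi = (3/7, 4/7), i.e. E tau_0 = 7/3
# and E tau_1 = 7/4; every row of P^n tends to pi.
assert abs(Pn[0][0] - 3/7) < 1e-9
assert abs(Pn[1][1] - 4/7) < 1e-9
```

Note that the rows of $P^n$ become identical in the limit: the limit $1/\mathsf{E}\tau_j$ depends only on the target state $j$, exactly as the first limit relation above states.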


References

[1] K.L. Chung, "Markov chains with stationary transition probabilities", Springer (1967)
[2] J.L. Doob, "Stochastic processes", Wiley (1953)


Cf. also Markov chain, class of zero states of a for additional references.

How to Cite This Entry:
Markov chain, class of positive states of a. Encyclopedia of Mathematics. URL:,_class_of_positive_states_of_a&oldid=14075
This article was adapted from an original article by A.M. Zubkov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article