Latest revision as of 18:38, 27 April 2024
2020 Mathematics Subject Classification: Primary: 60J10 Secondary: 60J27
A Markov chain whose transition probabilities $p_{ij}(t)$ have the following property: There are states $i,j$ such that $p_{ij}(t) = 0$ for all $t \ge 0$. Decomposability of a Markov chain is equivalent to decomposability of its matrix of transition probabilities $P = \left( {p_{ij}} \right)$ for a discrete-time Markov chain, and of its matrix of transition probability densities $Q = \left( {p'_{ij}(0)} \right)$ for a continuous-time Markov chain. The state space of a decomposable Markov chain consists either of inessential states or of more than one class of communicating states (cf. Markov chain).
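The equivalence above can be checked computationally: a discrete-time chain is decomposable exactly when some state $j$ is unreachable from some state $i$, i.e. $p_{ij}(t) = 0$ for every $t \ge 0$, which is a reachability question in the directed graph with an edge $i \to j$ whenever $p_{ij} > 0$. A minimal sketch (the function name and example matrices are illustrative, not from the article):

```python
def is_decomposable(P):
    """Return True if the chain with transition matrix P is decomposable,
    i.e. some state j is unreachable from some state i."""
    n = len(P)
    for start in range(n):
        # Breadth-first search over states reachable from `start`.
        seen = {start}          # p_{ii}(0) = 1, so a state reaches itself
        frontier = [start]
        while frontier:
            i = frontier.pop()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    frontier.append(j)
        if len(seen) < n:       # some j has p_{start,j}(t) = 0 for all t
            return True
    return False

# Two closed communicating classes {0,1} and {2,3}: decomposable.
P1 = [[0.5, 0.5, 0.0, 0.0],
      [0.4, 0.6, 0.0, 0.0],
      [0.0, 0.0, 0.3, 0.7],
      [0.0, 0.0, 0.8, 0.2]]

# A single communicating class (a cycle on 3 states): not decomposable.
P2 = [[0.0, 1.0, 0.0],
      [0.0, 0.0, 1.0],
      [1.0, 0.0, 0.0]]

print(is_decomposable(P1))  # True
print(is_decomposable(P2))  # False
```

Note that the test uses only the zero pattern of $P$, not the actual probability values, which matches the statement that decomposability of the chain is equivalent to decomposability (reducibility) of the matrix $P$.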
References
[F] W. Feller, "An introduction to probability theory and its applications", 1–2, Wiley (1966)
[Fr] D. Freedman, "Markov chains", Holden-Day (1975) MR0556418 Zbl 0501.60071
[I] M. Iosifescu, "Finite Markov processes and their applications", Wiley (1980) MR0587116 Zbl 0436.60001
[KS] J.G. Kemeny, J.L. Snell, "Finite Markov chains", Van Nostrand (1960) MR0115196 Zbl 0089.13704
[KSK] J.G. Kemeny, J.L. Snell, A.W. Knapp, "Denumerable Markov chains", Springer (1976) MR0407981 Zbl 0348.60090
[Re] D. Revuz, "Markov chains", North-Holland (1975) MR0415773 Zbl 0332.60045
[Ro] V.I. Romanovsky, "Discrete Markov chains", Wolters-Noordhoff (1970) (Translated from Russian) MR0266312 Zbl 0201.20002
[S] E. Seneta, "Non-negative matrices and Markov chains", Springer (1981) MR2209438 Zbl 0471.60001
Markov chain, decomposable. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_chain,_decomposable&oldid=39530