of a Markov chain $\xi(t)$
2020 Mathematics Subject Classification: Primary: 60J10 [MSN][ZBL]
A state $i$ such that

$$ \mathsf{P}\{\xi(t+1)=i \mid \xi(t)=i\} = 1. $$
An example of a Markov chain with an absorbing state is a branching process, in which the state $0$ (extinction) is absorbing.
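In terms of the transition probabilities, a state $i$ is absorbing exactly when $p_{ii}=1$, i.e. the $i$-th row of the transition matrix is concentrated on $i$. A minimal sketch (the matrix below is an arbitrary illustration, not taken from the article):

```python
# Minimal illustration (assumed example data): find the absorbing states of a
# finite chain by checking which diagonal entries of the transition matrix equal 1.
import numpy as np

P = np.array([
    [0.5, 0.3, 0.2],   # state 0
    [0.1, 0.6, 0.3],   # state 1
    [0.0, 0.0, 1.0],   # state 2: p_{22} = 1, so the chain never leaves state 2
])

absorbing = [i for i in range(len(P)) if P[i, i] == 1.0]
print(absorbing)   # -> [2]
```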
The introduction of additional absorbing states is a convenient technique that enables one to examine the properties of trajectories of a Markov chain that are associated with hitting some set.
Example. Consider the set of states $S=\{0,1,2,\dots\}$ of a homogeneous Markov chain $\xi(t)$ with discrete time and transition probabilities

$$ p_{ij} = \mathsf{P}\{\xi(t+1)=j \mid \xi(t)=i\}, $$

in which a subset $H \subset S$ is distinguished, and suppose one has to find the probabilities

$$ q_{iH} = \mathsf{P}\{\tau_H < \infty \mid \xi(0)=i\}, $$

where $\tau_H$ is the moment of first hitting the set $H$. If one introduces the auxiliary Markov chain $\tilde\xi(t)$ differing from $\xi(t)$ only in that all states $j \in H$ are absorbing in $\tilde\xi(t)$, then for any state $i$ the probabilities

$$ \tilde q_{iH}(t) = \mathsf{P}\{\tilde\xi(t) \in H \mid \tilde\xi(0)=i\} $$

are monotonically non-decreasing in $t$ and

$$ \lim_{t\to\infty} \tilde q_{iH}(t) = q_{iH}. \tag{*} $$

By virtue of the basic definition of a Markov chain, for $i \notin H$,

$$ \tilde q_{iH}(t+1) = \sum_{j \in H} p_{ij} + \sum_{j \notin H} p_{ij}\, \tilde q_{jH}(t), \qquad \tilde q_{iH}(0) = 0. $$

The passage to the limit as $t \to \infty$, taking into account (*), gives a system of linear equations for the $q_{iH}$, $i \notin H$:

$$ q_{iH} = \sum_{j \in H} p_{ij} + \sum_{j \notin H} p_{ij}\, q_{jH}. $$
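For a finite chain this scheme is easy to carry out numerically: pass to the auxiliary chain in which the states of $H$ are made absorbing, observe the monotone convergence (*) by iterating, and compare with the solution of the linear system. The sketch below uses an assumed illustrative chain (a small ruin-type walk); none of the numbers or variable names come from the article.

```python
import numpy as np

# Assumed example: states {0,1,2,3}; state 0 is absorbing ("ruin"), and we want
# q_i = P(tau_H < infinity | xi(0) = i) for H = {3}.  State 3 is not absorbing
# in the original chain, so the auxiliary chain really differs from it.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.5, 0.5],
])
H = [3]

# Auxiliary chain xi~(t): make every state of H absorbing.
P_tilde = P.copy()
for j in H:
    P_tilde[j, :] = 0.0
    P_tilde[j, j] = 1.0

# (*): q~_{iH}(t) = P(xi~(t) in H | xi~(0) = i) is non-decreasing in t and
# converges to q_{iH}; here we simply iterate the auxiliary chain.
dist = np.eye(len(P))                  # row i = distribution of xi~(t) started at i
for _ in range(100):
    dist = dist @ P_tilde
q_iterated = dist[:, H].sum(axis=1)    # approximately [0, 1/3, 2/3, 1]

# Limit equations: q_i = sum_{j in H} p_ij + sum_{j not in H} p_ij q_j, i not in H.
# State 0 never reaches H, so q_0 = 0 and its term drops out; solve over T = {1, 2}.
T = [1, 2]
A = np.eye(len(T)) - P[np.ix_(T, T)]
b = P[np.ix_(T, H)].sum(axis=1)
q_solved = np.linalg.solve(A, b)       # exactly [1/3, 2/3]

print(q_iterated, q_solved)
```

In general the hitting probabilities are the minimal non-negative solution of this system; in the sketch the states that cannot reach $H$ (here state $0$, where $q_{0H}=0$) are handled separately before solving for the remaining states.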
Absorbing state. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Absorbing_state&oldid=20600