A Markov chain is a discrete state space process in which the
next state depends only on the present state.
For a discrete-time system, if $X_n$ is the state of the system at time $n$, then $\{X_n,\ n = 0, 1, 2, \ldots\}$ is a Markov chain if:

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)$$

i.e., the state of the system at time $n+1$ depends only on the state of the system at time $n$, and does not depend on any state before time $n$.
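The Markov property above can be illustrated with a short simulation sketch: to draw the next state we need only the current state and its row of the transition matrix, nothing from earlier in the history. The two-state transition matrix below is a made-up example for illustration, not one taken from the text.

```python
import random

def simulate_markov_chain(P, state, steps, rng=random.Random(0)):
    """Simulate a discrete-time Markov chain.

    P is a transition matrix: P[i][j] is the probability of
    moving from state i to state j. Note that the next state is
    drawn using only the current state -- the Markov property.
    """
    path = [state]
    for _ in range(steps):
        r = rng.random()
        cum = 0.0
        for j, p in enumerate(P[state]):
            cum += p
            if r < cum:
                state = j
                break
        path.append(state)
    return path

# Hypothetical chain with states 0 and 1; each row sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(simulate_markov_chain(P, state=0, steps=10))
```

Because only the current state enters the draw, the simulator needs no memory of the path beyond `state`; the `path` list is kept purely for output.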