
Discrete Time Markov Chains

A Markov chain is a stochastic process with a discrete state space in which the next state depends only on the present state.

For a discrete time system, if $X_{n}$ is the state of the system at time $n$, then $\{X_{n}: n \geq 0\}$ is a Markov chain if:

\begin{displaymath}
Pr\left[X_{n}=j \vert X_{n-1}=i_{n-1}, X_{n-2}=i_{n-2}, \ldots, X_{0}=i_{0}\right] = Pr\left[X_{n}=j \vert X_{n-1}=i_{n-1}\right],
\end{displaymath}

i.e., the state of the system at time $n$ depends only on the state at time $n-1$, not on any earlier state.
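To make the definition concrete, the following is a minimal Python sketch of a simulation of such a chain. The 3-state transition matrix $P$ and the function name simulate_chain are illustrative assumptions, not taken from the text; entry P[i][j] plays the role of $Pr\left[X_{n}=j \vert X_{n-1}=i\right]$.

\begin{verbatim}
import numpy as np

# Hypothetical 3-state transition matrix (an assumption for
# illustration): P[i][j] = Pr[X_n = j | X_{n-1} = i].
# Each row sums to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

def simulate_chain(P, x0, n_steps, rng=None):
    """Simulate a discrete time Markov chain with transition
    matrix P, starting from state x0."""
    rng = np.random.default_rng() if rng is None else rng
    states = [x0]
    for _ in range(n_steps):
        current = states[-1]
        # The next state is drawn using only the present state:
        # row `current` of P. This is the Markov property.
        states.append(int(rng.choice(len(P), p=P[current])))
    return states

path = simulate_chain(P, x0=0, n_steps=10,
                      rng=np.random.default_rng(42))
print(path)
\end{verbatim}

Note that inside the loop only states[-1] is consulted; the earlier history in states has no influence on the next draw, which is exactly the property stated in the equation above.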


