As the time index approaches infinity, a Markov chain
may settle down and exhibit steady-state behavior.
If the following limit exists:

$$\pi_j = \lim_{n \to \infty} P(X_n = j)$$

for all values of $j$, then the $\pi_j$ are the limiting or steady-state probabilities.

Looking at the state probability $\pi_j(n) = \sum_i \pi_i(n-1) P_{ij}$ as $n$ approaches infinity, we see that:

$$\pi_j = \sum_i \pi_i P_{ij} \qquad (1)$$
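Equation (1) says the steady-state distribution is a fixed point of multiplication by the transition matrix. A small numerical sketch (the two-state matrix `P` below is an assumed example, not from the text) shows that repeatedly applying $\pi(n) = \pi(n-1)\mathbf{P}$ drives any starting distribution toward this fixed point:

```python
import numpy as np

# Assumed two-state transition matrix for illustration (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

pi_n = np.array([1.0, 0.0])  # start in state 0 with certainty

# Iterate pi(n) = pi(n-1) P; the distribution settles to the steady state.
for _ in range(100):
    pi_n = pi_n @ P

print(pi_n)  # approximately [5/6, 1/6] for this P
```

Starting from the other state, or from any distribution over the two states, yields the same limit, which is what makes the $\pi_j$ "steady-state" probabilities.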

When the limiting probabilities exist, then $\boldsymbol{\pi}$ can be found using the
following equations:

$$\boldsymbol{\pi} = \boldsymbol{\pi}\mathbf{P}$$

and

$$\sum_j \pi_j = 1,$$

where $\boldsymbol{\pi} = (\pi_0, \pi_1, \pi_2, \dots)$ is the row vector of steady-state probabilities and $\mathbf{P} = [P_{ij}]$ is the transition probability matrix.
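The balance equations $\boldsymbol{\pi} = \boldsymbol{\pi}\mathbf{P}$ are linearly dependent, so one of them must be replaced by the normalization condition $\sum_j \pi_j = 1$ to get a solvable linear system. A minimal sketch, again using an assumed two-state matrix `P`:

```python
import numpy as np

# Assumed two-state transition matrix for illustration.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
n = P.shape[0]

# pi (I - P) = 0 gives n dependent balance equations; drop one of them
# and append the normalization row sum(pi) = 1 in its place.
A = np.vstack([(np.eye(n) - P).T[:-1], np.ones(n)])
b = np.zeros(n)
b[-1] = 1.0  # right-hand side: zeros for balance, 1 for normalization

pi = np.linalg.solve(A, b)
print(pi)  # steady-state probabilities summing to 1
```

This direct solve gives the same answer as iterating $\pi(n) = \pi(n-1)\mathbf{P}$ to convergence, but in a single step.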