For some states in a Markov chain, there is a positive probability that once the chain transitions out of that state, it will never come back. For instance, in the chain with transition matrix

$$P = \begin{pmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\ 0 & 1 \end{pmatrix}$$
Once the chain transitions out of state one, it will never again enter state one, as it will stay in state two forever. There are times, however, when the chain will (with probability one) re-enter that state. For example, consider the chain

$$P = \begin{pmatrix} 0 & 1 \\ \tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix}$$
We note that the probability the chain first re-enters state one at time $n \geq 2$ is given by $(1/2)^{n-2} \cdot (1/2) = (1/2)^{n-1}$, as the chain must stay in state two for $n-2$ transitions, each of which has probability $1/2$, and then must finally transition to state one with probability $1/2$. Now, the event that the chain ever re-enters state one is the union of the events where it first re-enters at time $n = 2$, or at $n = 3$, or $n = 4$, … These are all disjoint events, though, so we have

$$\sum_{n=2}^{\infty} \left(\frac{1}{2}\right)^{n-1} = \sum_{k=1}^{\infty} \left(\frac{1}{2}\right)^{k} = 1.$$
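We can sanity-check this by simulation. Below is a minimal sketch, assuming the second example chain behaves as described: state one always jumps to state two, and state two returns to state one with probability $1/2$ (the dictionary encoding of the transition probabilities is just an illustrative choice).

```python
import random

# Transition probabilities for the assumed two-state chain:
# state 1 always moves to state 2; state 2 returns to 1 w.p. 1/2.
P = {1: [(2, 1.0)], 2: [(1, 0.5), (2, 0.5)]}

def first_return_time(start=1):
    """Run the chain from `start` until it first re-enters `start`."""
    state, t = start, 0
    while True:
        next_states, probs = zip(*P[state])
        state = random.choices(next_states, weights=probs)[0]
        t += 1
        if state == start:
            return t

random.seed(0)
times = [first_return_time() for _ in range(100_000)]
frac = sum(t == 2 for t in times) / len(times)
print(frac)  # close to (1/2)^{2-1} = 1/2: first return at time n has probability (1/2)^{n-1}
```

Every simulated run terminates, which is consistent with the chain re-entering state one with probability one.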
We say that state $i$ is recurrent if, given $X_0 = i$, the probability the chain will ever re-enter state $i$ is one. If a state is not recurrent, it is called transient. If a state is recurrent, then it will (with probability one) be entered infinitely often, assuming it's entered at all, as the chain essentially "restarts" itself each time it enters that state, due to the Markov property.

Now suppose states $i$ and $j$ belong to the same class and that state $i$ is recurrent. Since $i$ and $j$ belong to the same class, we have $P_{ij}^{(n)} > 0$ for some $n$. Note that $1 - P_{ij}^{(n)}$ (the probability of not transitioning to state $j$ from state $i$ in $n$ steps) is less than one. Each time the chain is in state $i$ there is a chance it will not transition to state $j$. However, since $i$ is recurrent, it will be entered infinitely often, so the chance of never entering state $j$ is

$$\lim_{k \to \infty} \left(1 - P_{ij}^{(n)}\right)^{k} = 0.$$
So, if the chain enters state $i$, it will (with probability one) also enter state $j$ at some point. This shows that if $i$ is recurrent, then $j$ must be as well. Thus recurrence (and transience) is a class property: in any given class, either all states are transient or all states are recurrent.

An easier way to test for recurrence is to note that state $i$ is recurrent if and only if

$$\sum_{n=1}^{\infty} P_{ii}^{(n)} = \infty.$$
(This follows by setting $I_n$ to be the Bernoulli random variable that is one if $X_n = i$ and zero otherwise. The expected number of times state $i$ is entered is infinite, since a recurrent state is entered infinitely often, so $E\left[\sum_{n=1}^{\infty} I_n\right]$ is infinite. By linearity, though, this expectation is simply $\sum_{n=1}^{\infty} E[I_n] = \sum_{n=1}^{\infty} P_{ii}^{(n)}$.)
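This criterion is easy to probe numerically by summing the diagonal entries of powers of a transition matrix. Here is a small sketch, using an illustrative transient example (state zero leaks into an absorbing state) and an illustrative recurrent example (the two states keep exchanging mass); the specific matrices are assumed for demonstration.

```python
import numpy as np

P_t = np.array([[0.5, 0.5],
                [0.0, 1.0]])   # state 0 is transient: state 1 absorbs
P_r = np.array([[0.0, 1.0],
                [0.5, 0.5]])   # both states are recurrent

def partial_sum(P, i, N):
    """Sum of the n-step return probabilities P_ii^(n) for n = 1..N."""
    total, Pn = 0.0, np.eye(len(P))
    for _ in range(N):
        Pn = Pn @ P          # Pn is now the (n)-step transition matrix
        total += Pn[i, i]
    return total

for N in (10, 100, 1000):
    print(N, partial_sum(P_t, 0, N), partial_sum(P_r, 0, N))
```

For the transient chain the partial sums converge (here $\sum_n (1/2)^n = 1$), while for the recurrent chain they grow roughly linearly in $N$, in line with the criterion.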

Consider the simple random walk where the probability of stepping to the left or right at each transition is 1/2. We will suppose that our states are the integers, like we’re walking along an axis, and that we start in state zero. Note that all states communicate with all other states, so the chain is irreducible (there is only one class of states).

Now, supposing we're in state zero, what's the probability we ever re-enter state zero? To return to state zero at time $n$, we must have taken the same number of steps left as right. The number of rightward steps is binomially distributed, so the probability that $X_n = 0$ is

$$P(X_n = 0) = \binom{n}{n/2} \left(\frac{1}{2}\right)^{n}.$$
Of course, this depends on $n$ being even, so we can rewrite this as

$$P(X_{2n} = 0) = \binom{2n}{n} \left(\frac{1}{2}\right)^{2n} = \frac{(2n)!}{n! \, n!} \left(\frac{1}{2}\right)^{2n}.$$
Using Stirling's approximation to the factorial, $n! \sim n^{n+1/2} e^{-n} \sqrt{2\pi}$, to rewrite the above binomial, we have that

$$P(X_{2n} = 0) \sim \frac{(2n)^{2n+1/2} e^{-2n} \sqrt{2\pi}}{n^{2n+1} e^{-2n} \, 2\pi \, 2^{2n}} = \frac{1}{\sqrt{\pi n}},$$
and so summing over all $n$ we have that $\sum_{n=1}^{\infty} P(X_{2n} = 0) = \infty$ (since $\sum_n 1/\sqrt{\pi n}$ diverges), which implies zero is a recurrent state. This in turn implies all states are recurrent, since the chain is irreducible and recurrence is a class property.
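The computation above can be checked numerically: the exact return probabilities computed from the binomial coefficient track the Stirling estimate $1/\sqrt{\pi n}$, and their partial sums grow without bound. A short sketch (the helper name is illustrative):

```python
from math import comb, pi, sqrt

def p_return(n):
    """Exact probability the walk is back at zero after 2n steps."""
    return comb(2 * n, n) / 4 ** n

# Exact value vs. Stirling approximation 1/sqrt(pi*n):
for n in (1, 10, 100, 1000):
    print(n, p_return(n), 1 / sqrt(pi * n))

# Partial sums of the return probabilities keep growing (roughly like
# 2*sqrt(N/pi)), so the series diverges and zero is recurrent.
partial = [sum(p_return(n) for n in range(1, N + 1)) for N in (10, 100, 1000)]
print(partial)
```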
