# Mathematics Prelims

## December 22, 2008

### Recurrence of States

Filed under: Probability Theory,Stochastics — cjohnson @ 9:25 pm

For some states in a Markov chain, there is a positive probability that once the chain transitions out of that state, it will never come back.  For instance, in the chain with transition matrix

$\displaystyle \left[ \begin{array}{cc} 0 & 1 \\ 0 & 1 \end{array} \right]$

Once the chain transitions out of state one, it will never again enter state one, as it will stay in state two forever.  There are times, however, when the chain will (with probability one) re-enter that state.  For example, consider the chain

$\displaystyle \left[ \begin{array}{cc} 1/2 & 1/2 \\ 1/2 & 1/2 \end{array} \right]$

We note that the probability the chain first re-enters state one at time $n$ is given by $(1/2)^n$, as the chain must stay in state two for the first $n-1$ transitions, each of which has probability $1/2$, and then must finally transition to state one, again with probability $1/2$.  Now, the event that the chain ever re-enters state one is the union of the events where it first returns at $n=1$, or at $n=2$, or $n=3$, … These events are disjoint, though, so we have

$\displaystyle \sum_{n=1}^\infty \left(\frac{1}{2}\right)^n = 1$
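As a quick sanity check on this, here is a small Monte Carlo sketch (the function name and trial count are just for illustration) that simulates the symmetric two-state chain and estimates the distribution of the first return time to state one, which should match $(1/2)^n$:

```python
import random

# Symmetric two-state chain with P = [[1/2, 1/2], [1/2, 1/2]]: since both
# rows are identical, every step lands in state one with probability 1/2
# regardless of the current state, so the first return time is geometric.

def first_return_time(max_steps=1000):
    """Starting from state one, return the first n >= 1 with X_n = 1."""
    for n in range(1, max_steps + 1):
        if random.random() < 0.5:  # step into state one
            return n
    return None  # no return within max_steps (probability (1/2)^1000)

random.seed(0)
trials = 100_000
counts = {}
for _ in range(trials):
    n = first_return_time()
    counts[n] = counts.get(n, 0) + 1

for n in range(1, 6):
    print(f"P(first return at n={n}) ~ {counts.get(n, 0) / trials:.4f}"
          f"  (claimed: {0.5 ** n:.4f})")
```

The empirical frequencies land close to $1/2, 1/4, 1/8, \dots$, and their total tends to one, matching the geometric sum above.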

We say that state $i$ is recurrent if, given $X_0 = i$, the probability the chain will ever re-enter state $i$ is one.  If a state is not recurrent, it is called transient.  If a state is recurrent, then it will (with probability one) be entered infinitely often, assuming it’s entered at all, as the chain essentially “restarts” itself each time it enters that state, due to the Markov property.

Now suppose states $i$ and $j$ belong to the same class and that state $i$ is recurrent.  Since $i$ and $j$ communicate, there is some $m$ with $P_{ij}^{(m)} > 0$, and so $1 - P_{ij}^{(m)}$ (the probability of not being in state $j$ exactly $m$ steps after a visit to state $i$) is less than one.  Each time the chain visits state $i$ there is a fresh chance of at least $P_{ij}^{(m)}$ of reaching state $j$.  However, since $i$ is recurrent, it will be visited infinitely often, so the probability of never entering state $j$ is at most

$\displaystyle \prod_{k=1}^\infty \left(1 - P_{ij}^{(m)}\right) = 0$

So, if the chain enters state $i$, it will (with probability one) also enter state $j$ at some point; hence if $i$ is recurrent, then $j$ must be as well.  It follows that recurrence (and transience) is a class property: in any given class, either all states are transient or all states are recurrent.

An easier way to test for recurrence is to note that state $i$ is recurrent if and only if

$\displaystyle \sum_{n=1}^\infty P_{ii}^{(n)} = \infty$

(This follows by letting $I_n$ be the Bernoulli random variable that is one if $X_n = i$ and zero otherwise, so that $\sum_n I_n$ counts the number of visits to state $i$.  If $i$ is recurrent, then starting from $i$ this count is infinite with probability one, so $\mathbb{E}\left[\sum I_n \right]$ is infinite; if $i$ is transient, the count has a geometric distribution and the expectation is finite.  By linearity, though, this expectation is simply $\sum \mathbb{E}[I_n] = \sum P_{ii}^{(n)}$.)
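This criterion is easy to see numerically on the two example chains above.  A minimal sketch (the helper names are just for illustration) computes the partial sums $\sum_{n=1}^N P_{11}^{(n)}$ by repeated matrix multiplication; for the first chain $P_{11}^{(n)} = 0$ for all $n \geq 1$, while for the second $P_{11}^{(n)} = 1/2$ for every $n$, so the partial sums grow without bound:

```python
# Partial sums of P_11^(n) for the two example 2x2 chains: bounded for
# the transient chain, unbounded for the recurrent one.

def mat_mult(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def partial_sum_P11(P, N):
    """Return sum_{n=1}^{N} P_11^(n): the expected number of visits to
    state one in the first N steps, starting from state one."""
    total, Pn = 0.0, [row[:] for row in P]
    for _ in range(N):
        total += Pn[0][0]
        Pn = mat_mult(Pn, P)
    return total

transient = [[0.0, 1.0], [0.0, 1.0]]
recurrent = [[0.5, 0.5], [0.5, 0.5]]

print(partial_sum_P11(transient, 100))  # stays at 0: state one is transient
print(partial_sum_P11(recurrent, 100))  # grows like N/2: state one is recurrent
```

Of course, with finitely many matrix powers this only suggests divergence; the criterion itself is about the full infinite sum.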

Consider the simple random walk where the probability of stepping to the left or right at each transition is 1/2.  We will suppose that our states are the integers, like we’re walking along an axis, and that we start in state zero.  Note that all states communicate with all other states, so the chain is irreducible (there is only one class of states).

Now, supposing we’re in state zero, what’s the probability we ever re-enter state zero?  To enter state zero, we must have had the same number of steps left as right.  This is binomially distributed, so the probability that $X_n = 0$ is

$\displaystyle \binom{n}{n/2} \left(\frac{1}{2}\right)^{n/2} \left(\frac{1}{2}\right)^{n/2}$

Of course, this depends on $n$ being even, so we can rewrite this as

$\displaystyle P_{00}^{(2n)} = \binom{2n}{n} \left( \frac{1}{2} \right)^{2n}$

Using Stirling’s approximation to the factorial, we have $\binom{2n}{n} \sim \frac{4^n}{\sqrt{\pi n}}$, and so

$\displaystyle P_{00}^{(2n)} \sim \frac{1}{\sqrt{\pi n}}$

and so summing over all $n$ we have that $\sum P_{00}^{(2n)} = \infty$ (since $\sum 1/\sqrt{n}$ diverges), which implies zero is a recurrent state.  Since the chain is irreducible, this in turn implies all states are recurrent.