Suppose we’re given each one-step transition probability $P_{ij}$ for a Markov chain. Calculating the $n$-step transition probabilities leads us to the Chapman-Kolmogorov equations. We will let $P_{ij}^{(n)}$ denote the probability that we arrive at state $j$ after $n$ transitions, given that we start in state $i$:

$$P_{ij}^{(n)} = P(X_{m+n} = j \mid X_m = i).$$

We begin by considering $n = 2$. To find $P_{ij}^{(2)}$, we consider all of the paths that start at $i$ and end at $j$; that is, we look at all possibilities for the intermediate state $k$. Since we have each one-step probability, the chance that we first transition to state $k$, then from $k$ transition to $j$, is $P_{ik} P_{kj}$. Summing over all intermediate states gives

$$P_{ij}^{(2)} = \sum_k P_{ik} P_{kj}.$$

Note that this sum is precisely the $(i, j)$ entry of the product $P \cdot P$, so the two-step transition probabilities form the matrix $P^2$. Extending this result inductively, the $n$-step transition probability matrix is simply $P^n$.
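As a quick sanity check, here is a short sketch verifying that the Chapman-Kolmogorov sum agrees with the entries of $P^2$. The two-state chain below is made up for illustration; it is not the weather example from the post.

```python
import numpy as np

# Hypothetical two-state chain (made up for illustration).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Two-step probabilities via the Chapman-Kolmogorov sum:
#   P_ij^(2) = sum_k P_ik * P_kj
two_step_sum = np.array([[sum(P[i, k] * P[k, j] for k in range(2))
                          for j in range(2)]
                         for i in range(2)])

# Two-step probabilities via the matrix power P^2.
two_step_power = np.linalg.matrix_power(P, 2)

# The two computations agree entrywise.
assert np.allclose(two_step_sum, two_step_power)
print(two_step_power)
```

Either computation works; the matrix-power form is just the same sum organized as matrix multiplication, which is why the inductive step to $P^n$ goes through.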

Continuing with the weather example from last time, we have that the two-step transition probability matrix is

So, according to our model, if it’s rainy today, there is an 11/32 chance it will be rainy two days from now, a 1/4 chance it will snow, and a 13/32 chance it will be sunny.
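The same kind of two-day forecast can be computed directly from the recursion. Since the one-step weather matrix from the previous post isn’t reproduced here, the rain/snow/sun matrix below is hypothetical (so its two-step numbers differ from the 11/32, 1/4, 13/32 above); the point is the mechanics of the computation, done exactly with `fractions.Fraction`.

```python
from fractions import Fraction as F

# Hypothetical one-step matrix for a rain/snow/sun chain
# (made up; not the matrix from the previous post).
states = ["rain", "snow", "sun"]
P = {
    "rain": {"rain": F(1, 2), "snow": F(1, 4), "sun": F(1, 4)},
    "snow": {"rain": F(1, 4), "snow": F(1, 2), "sun": F(1, 4)},
    "sun":  {"rain": F(1, 4), "snow": F(1, 4), "sun": F(1, 2)},
}

def n_step(P, i, j, n):
    """P_ij^(n) via the Chapman-Kolmogorov recursion."""
    if n == 1:
        return P[i][j]
    return sum(P[i][k] * n_step(P, k, j, n - 1) for k in P)

# Two-day forecast given rain today.
for j in states:
    print(j, n_step(P, "rain", j, 2))
```

Using exact fractions makes it easy to check that each row of the two-step matrix still sums to 1, as any transition probability matrix must.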


> In the first display and the first line of the second display, you need `P_{ij}^{(n)}` instead of `P_{ij}^(n)`, which is what I think you have.
>
> Comment by Siddhartha — April 1, 2011 @ 12:38 pm