Consider a Markov chain X0, X1, X2, … described by the transition probability graph shown below. The chain starts at state 1; that is, X0 = 1.

1 → 1: p = 0.75
1 → 2: p = 0.25
2 → 1: p = 0.375
2 → 2: p = 0.25
2 → 3: p = 0.375
3 → 2: p = 0.25
3 → 3: p = 0.75
Find the probability that X2=3.

P(X2=3)= 5/8 - incorrect
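A quick numerical check (a sketch, assuming the transition probabilities listed above): the only two-step path from state 1 to state 3 is 1 → 2 → 3, so P(X2 = 3) = 0.25 · 0.375 = 3/32, not 5/8.

```python
import numpy as np

# Transition matrix from the graph (rows/cols: states 1, 2, 3)
P = np.array([
    [0.75,  0.25, 0.0],
    [0.375, 0.25, 0.375],
    [0.0,   0.25, 0.75],
])

P2 = P @ P          # two-step transition probabilities
print(P2[0, 2])     # P(X2 = 3 | X0 = 1) = 0.25 * 0.375 = 3/32 = 0.09375
```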
Find the probability that the process is in state 3 immediately after the second change of state. (A "change of state" is a transition that is not a self-transition.)

1 - incorrect
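Because self-transitions are ignored, the relevant object is the embedded (jump) chain: from state 1 the first change must go to state 2, and from state 2 a change goes to 1 or 3 each with probability 0.375 / (0.375 + 0.375) = 1/2. A sketch of that computation, assuming the transition probabilities above:

```python
import numpy as np

P = np.array([
    [0.75,  0.25, 0.0],
    [0.375, 0.25, 0.375],
    [0.0,   0.25, 0.75],
])

# Embedded ("jump") chain: drop self-transitions, renormalize each row
J = P - np.diag(np.diag(P))
J = J / J.sum(axis=1, keepdims=True)

print((J @ J)[0, 2])   # state 3 after two changes, starting from state 1 -> 0.5
```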
Find (approximately) P(X1000=2∣X1000=X1001).

P(X1000=2∣X1000=X1001) ≈ 1/4 - incorrect
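For large n the chain is essentially in steady state. The balance equations give π = (3/8, 1/4, 3/8), and by Bayes' rule the conditional probability is π2·p22 divided by Σi πi·pii. A sketch of the check, assuming the transition probabilities above:

```python
import numpy as np

P = np.array([
    [0.75,  0.25, 0.0],
    [0.375, 0.25, 0.375],
    [0.0,   0.25, 0.75],
])

# Steady state: every row of a high matrix power converges to pi
pi = np.linalg.matrix_power(P, 200)[0]   # pi = (3/8, 1/4, 3/8)
self_loop = np.diag(P)                   # (p11, p22, p33)

ans = pi[1] * self_loop[1] / (pi @ self_loop)
print(ans)                               # ≈ 1/10
```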
Let T be the first time that the state is equal to 3.

E[T]= 5 - incorrect
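The mean first-passage time to state 3 follows from the standard first-step equations: with ti = E[T | X0 = i], t1 = 1 + 0.75·t1 + 0.25·t2 and t2 = 1 + 0.375·t1 + 0.25·t2, which solve to t1 = 32/3. A sketch of the linear solve, assuming the transition probabilities above:

```python
import numpy as np

# First-passage equations to state 3 (t3 = 0):
#   t1 = 1 + 0.75*t1  + 0.25*t2
#   t2 = 1 + 0.375*t1 + 0.25*t2
A = np.array([[1 - 0.75,  -0.25],
              [-0.375,    1 - 0.25]])
b = np.array([1.0, 1.0])

t1, t2 = np.linalg.solve(A, b)
print(t1)   # 32/3 ≈ 10.667
```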
Suppose for this part of the problem that the process starts instead at state 2, i.e., X0=2. Let S be the first time by which both states 1 and 3 have been visited.

E[S]= 6 - incorrect
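One way to see the answer, assuming the transition probabilities above: starting from state 2, the time until the chain first leaves 2 (hitting either 1 or 3) is geometric with success probability 0.75, so its mean is 4/3; by the symmetry of the graph, the remaining time to reach the other endpoint equals the 1 → 3 first-passage time 32/3. Hence E[S] = 4/3 + 32/3 = 12.

```python
# E[S] = (mean time to leave state 2) + (by symmetry, first-passage time 1 -> 3)
time_leave_2 = 1 / (1 - 0.25)   # geometric: leave state 2 each step w.p. 0.75
t1_to_3 = 32 / 3                # first-passage time from the previous part
E_S = time_leave_2 + t1_to_3
print(E_S)                      # 12.0
```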

Answers:

1. 3/32
2. 0.5
3. 1/10
4. 32/3
5. 12
Thank you, all of the above answers are correct.