Consider a Markov chain X0, X1, X2, … described by the transition probabilities listed below. The chain starts at state 1; that is, X0 = 1.

1 → 1: p = 0.75
1 → 2: p = 0.25
2 → 1: p = 0.375
2 → 2: p = 0.25
2 → 3: p = 0.375
3 → 2: p = 0.25
3 → 3: p = 0.75
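For anyone who wants to check the answers below numerically, here is the same information as a transition matrix (a minimal numpy sketch; the row/column ordering 1, 2, 3 is this snippet's own convention, not part of the problem):

```python
import numpy as np

# Rows/columns ordered as states 1, 2, 3 (the snippet's own convention).
P = np.array([
    [0.75,  0.25,  0.0  ],   # from state 1
    [0.375, 0.25,  0.375],   # from state 2
    [0.0,   0.25,  0.75 ],   # from state 3
])

assert np.allclose(P.sum(axis=1), 1.0)  # each row of probabilities sums to 1
```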
1) Find the probability that X2 = 3.

P(X2 = 3) = ?

2) Find the probability that the process is in state 3 immediately after the second change of state. (A "change of state" is a transition that is not a self-transition.)

3) Find (approximately) P(X1000 = 2 | X1000 = X1001).

P(X1000 = 2 | X1000 = X1001) ≈ ?

4) Let T be the first time that the state is equal to 3.

E[T] = ?

5) Suppose for this part of the problem that the process starts instead at state 2, i.e., X0=2. Let S be the first time by which both states 1 and 3 have been visited.

E[S] = ?


3) Interesting, what does "P(X1000 = 2 | X1000 = X1001)" mean? I did not know you could condition on a future event. Normally I would have approached this using:
a) Bayes' rule: P(A|B) = P(A) P(B|A) / P(B)
b) the steady-state probability of being in state 2, since 1000 steps should be more than enough to converge.
Can you please answer?
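Conditioning on {X1000 = X1001} is legitimate: it is just an event, so your Bayes approach works, with the joint probability P(X1000 = 2, X1000 = X1001) = π2 p22 in the numerator. A quick numerical check (my own sketch, not from the problem set):

```python
import numpy as np

P = np.array([[0.75,  0.25,  0.0  ],
              [0.375, 0.25,  0.375],
              [0.0,   0.25,  0.75 ]])

# Distribution of X_1000 given X_0 = 1: row 0 of P^1000
# (essentially the steady state by then).
dist = np.linalg.matrix_power(P, 1000)[0]

# Joint P(X_1000 = i, X_1001 = i) = dist[i] * P[i, i]; condition on the
# union over i, i.e. on the event {X_1000 = X_1001}.
joint = dist * np.diag(P)
print(joint[1] / joint.sum())   # P(X_1000 = 2 | X_1000 = X_1001) = 0.1
```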
2) 0.5
And the rest? Thanx
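For 2), the 0.5 follows from looking only at the changes of state (the embedded jump chain), unless I am misreading the question: the first change from state 1 must go to 2, and from 2 the two possible changes are equally likely:

```latex
P(\text{second change lands in } 3)
  = P(2 \to 3 \mid \text{leave } 2)
  = \frac{0.375}{0.375 + 0.375} = \frac{1}{2}
```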

Please give me at least one more...
1) 3/32 is correct.
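For the record, 1) can be read off directly: state 3 is only reachable from 2, so the only two-step path is 1 → 2 → 3:

```latex
P(X_2 = 3) = p_{12}\,p_{23} = 0.25 \times 0.375 = \frac{3}{32}
```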
3) 0.0625
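Unless I have slipped up somewhere, 0.0625 is only the joint probability π2 p22; dividing by P(X1000 = X1001) gives 0.1. With the steady state π = (3/8, 1/4, 3/8):

```latex
P(X_{1000}=2 \mid X_{1000}=X_{1001})
  = \frac{\pi_2\,p_{22}}{\sum_i \pi_i\,p_{ii}}
  = \frac{(1/4)(1/4)}{(3/8)(3/4) + (1/4)(1/4) + (3/8)(3/4)}
  = \frac{1/16}{5/8} = 0.1
```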
Anyone know 5)?
5) 8
5) 8 is false!
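Agreed. A first-step analysis (my own working, so please double-check) gives 12 for 5), and answers 4) along the way. With t_i the expected time to reach state 3 starting from state i:

```latex
t_1 = 1 + 0.75\,t_1 + 0.25\,t_2, \qquad
t_2 = 1 + 0.375\,t_1 + 0.25\,t_2
\quad\Rightarrow\quad t_2 = \frac{20}{3}, \quad E[T] = t_1 = \frac{32}{3}
```

For 5), starting at 2 the chain stays put for a Geometric(3/4) number of steps (mean 4/3) and then jumps to 1 or 3 with equal probability; by the symmetry of the chain, the expected time from either end to the opposite end equals t_1 = 32/3, so E[S] = 4/3 + 32/3 = 12.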