Consider a Markov chain X_0, X_1, X_2, … described by the transition probability graph shown below. The chain starts at state 1; that is, X_0 = 1.

Find the probability that X_2 = 3.

P(X_2 = 3) = ?
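Since the graph itself was never reproduced in this thread, here is a minimal numerical sketch of how part (1) can be checked: square the transition matrix and read off the entry from state 1 to state 3. The matrix P below is a placeholder, chosen only so that the path 1 → 2 → 3 has probability 1/4 * 3/8 = 3/32 (the product quoted in the replies); substitute the actual values from the graph.

```python
import numpy as np

# Placeholder transition matrix -- the graph was never posted in this thread,
# so these values are assumptions, picked only so that the two-step path
# 1 -> 2 -> 3 has probability (1/4) * (3/8) = 3/32, the product quoted below.
P = np.array([
    [3/4, 1/4, 0.0],   # rows/columns 0, 1, 2 stand for states 1, 2, 3
    [3/8, 1/4, 3/8],
    [0.0, 1/2, 1/2],
])

P2 = np.linalg.matrix_power(P, 2)  # two-step transition probabilities
print(P2[0, 2])                    # P(X_2 = 3 | X_0 = 1) = 0.09375 = 3/32 here
```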

Find the probability that the process is in state 3 immediately after the second change of state. (A “change of state” is a transition that is not a self-transition.)

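For this part, one standard approach is the embedded (jump) chain: remove self-transitions, renormalize each row, and take two steps of the resulting chain. A sketch, again with the placeholder matrix from above:

```python
import numpy as np

# Same placeholder matrix as in the previous sketch (not the actual graph).
P = np.array([[3/4, 1/4, 0.0],
              [3/8, 1/4, 3/8],
              [0.0, 1/2, 1/2]])

# Embedded ("jump") chain: zero the diagonal and renormalize each row, so
# Q[i, j] is the probability that the next change of state takes i to j.
Q = P.copy()
np.fill_diagonal(Q, 0.0)
Q = Q / Q.sum(axis=1, keepdims=True)

# Probability of being in state 3 right after the second change of state.
print(np.linalg.matrix_power(Q, 2)[0, 2])  # 0.5 for this placeholder
```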

Find (approximately) P(X_1000 = 2 | X_1000 = X_1001).

P(X_1000 = 2 | X_1000 = X_1001) ≈ ?
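For n as large as 1000 the chain is essentially in steady state, so by Bayes' rule P(X_1000 = 2 | X_1000 = X_1001) ≈ pi_2 p_22 / (sum over i of pi_i p_ii), where pi is the stationary distribution. A sketch with the same placeholder matrix:

```python
import numpy as np

# Same placeholder matrix as above (not the actual graph).
P = np.array([[3/4, 1/4, 0.0],
              [3/8, 1/4, 3/8],
              [0.0, 1/2, 1/2]])

# Steady-state distribution pi: the left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

# P(X_n = 2 | X_n = X_{n+1}) ~= pi_2 * p_22 / sum_i pi_i * p_ii for large n.
print(pi[1] * P[1, 1] / np.dot(pi, np.diag(P)))
```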
Let T be the first time that the state is equal to 3.

E[T] = ?
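E[T] is a mean first-passage time, which first-step analysis reduces to a small linear system: t_i = 1 + sum over j ≠ 3 of p_ij t_j, with t_3 = 0. A sketch with the same placeholder matrix; note that if the graph allows a return from state 2 to state 1, simply adding 1/(1/4) + 1/(3/8) (as tried later in the thread) undercounts:

```python
import numpy as np

# Same placeholder matrix as above (not the actual graph).
P = np.array([[3/4, 1/4, 0.0],
              [3/8, 1/4, 3/8],
              [0.0, 1/2, 1/2]])

# First-step analysis with state 3 absorbing: t_i = 1 + sum_{j != 3} p_ij t_j
# for i in {1, 2} and t_3 = 0, i.e. the linear system (I - P_sub) t = 1.
A = np.eye(2) - P[:2, :2]   # drop the row and column of state 3
t = np.linalg.solve(A, np.ones(2))
print(t[0])  # E[T] from state 1; 32/3 ~ 10.67 for this placeholder, not 20/3
```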

Suppose for this part of the problem that the process starts instead at state 2, i.e., X_0 = 2. Let S be the first time by which both states 1 and 3 have been visited.

E[S] = ?
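One way to sanity-check E[S] without more algebra is plain Monte Carlo: run the chain from state 2 and count steps until both 1 and 3 have been seen. A sketch, once more with the placeholder matrix (an exact answer would instead chain two first-passage computations):

```python
import numpy as np

rng = np.random.default_rng(0)

# Same placeholder matrix as above (not the actual graph).
P = np.array([[3/4, 1/4, 0.0],
              [3/8, 1/4, 3/8],
              [0.0, 1/2, 1/2]])

def sample_S():
    """One realization of S: steps until states 1 and 3 (indices 0 and 2)
    have both been visited, starting from state 2 (index 1)."""
    state, visited, steps = 1, {1}, 0
    while not {0, 2}.issubset(visited):
        state = int(rng.choice(3, p=P[state]))
        visited.add(state)
        steps += 1
    return steps

print(np.mean([sample_S() for _ in range(50_000)]))  # Monte Carlo estimate of E[S]
```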


I need the solution to the above. If anybody knows it, please post the answers...
You did not provide the transition probability graph/matrix when you first posted the problem.
I posted the matrix in my post. I also posted the answers I got wrong when I tried.
I am not sure, and I can't check until I am, but:
1) I have 3/32, i.e., 1/4 * 3/8.
2) I have 1/2, but I'm really not sure.
3) could be 1/3.
4) should be 20/3, which is the sum of the expected times 1/(1/4) + 1/(3/8) = 4 + 8/3.
5) ???
Can you please answer? Thanks.
I only have one answer left.
@gugget
Only (2) is correct, which is 1/2.
1) 3/32: correct
2) 0.5: correct
3) 1/3: incorrect
4) 20/3: incorrect
5) ?
Thanks