Consider a Markov chain $X_0, X_1, X_2, \dots$ described by the transition probability graph shown below. The chain starts at state 1; that is, $X_0 = 1$.

Find the probability that $X_2 = 3$.

$P(X_2 = 3) =$
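A minimal numerical sketch of this computation, assuming a hypothetical transition matrix `P` (placeholder entries, since the transition graph is not reproduced in this text): $P(X_2 = 3)$ is the third entry of the two-step distribution $\pi_0 P^2$.

```python
import numpy as np

# Hypothetical transition matrix -- placeholder entries only, since the
# transition probability graph is not reproduced here.
# States 1, 2, 3 are indexed 0, 1, 2.
P = np.array([
    [0.6, 0.4, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

# X0 = 1, so the initial distribution puts all mass on index 0.
pi0 = np.array([1.0, 0.0, 0.0])

# Two-step distribution pi0 @ P^2; P(X2 = 3) is its last entry.
dist2 = pi0 @ np.linalg.matrix_power(P, 2)
print("P(X2 = 3) =", dist2[2])
```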
Find the probability that the process is in state 3 immediately after the second change of state. (A “change of state” is a transition that is not a self-transition.)
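One way to handle changes of state is the embedded jump chain: zero out the self-transition probabilities and renormalize each row, then take two steps. A sketch under the same placeholder matrix (this renormalization assumes every state has at least one non-self transition):

```python
import numpy as np

# Same hypothetical placeholder matrix as in the previous sketch.
P = np.array([
    [0.6, 0.4, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

# Jump chain: condition each row on an actual change of state by
# zeroing the diagonal and renormalizing the rows.
Q = P - np.diag(np.diag(P))
Q = Q / Q.sum(axis=1, keepdims=True)

# Distribution after two changes of state, starting from state 1.
pi0 = np.array([1.0, 0.0, 0.0])
print("P(state 3 after 2nd change) =", (pi0 @ Q @ Q)[2])
```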
Find (approximately) $P(X_{1000} = 2 \mid X_{1000} = X_{1001})$.

$P(X_{1000} = 2 \mid X_{1000} = X_{1001}) \approx$
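By $n = 1000$ the chain is essentially in steady state, so by Bayes' rule $P(X_{1000} = 2 \mid X_{1000} = X_{1001}) \approx \pi_2\, p_{22} / \sum_j \pi_j\, p_{jj}$. A sketch with the same placeholder matrix:

```python
import numpy as np

# Hypothetical placeholder matrix, as in the earlier sketches.
P = np.array([
    [0.6, 0.4, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(evals - 1))
pi = np.real(evecs[:, k])
pi = pi / pi.sum()

# P(X1000 = 2 | X1000 = X1001) ~= pi_2 p_22 / sum_j pi_j p_jj,
# since a self-transition at time 1000 is the conditioning event.
num = pi[1] * P[1, 1]
den = sum(pi[j] * P[j, j] for j in range(3))
print("P(X1000 = 2 | X1000 = X1001) ~=", num / den)
```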
Let $T$ be the first time that the state is equal to 3.

$E[T] =$
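Mean first-passage times to state 3 satisfy $t_3 = 0$ and $t_i = 1 + \sum_j p_{ij}\, t_j$ for $i \neq 3$. Restricting to the non-target states gives a small linear system; a sketch solving it for the placeholder matrix:

```python
import numpy as np

# Hypothetical placeholder matrix, as in the earlier sketches.
P = np.array([
    [0.6, 0.4, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

# First-passage equations t_i = 1 + sum_j p_ij t_j with t_3 = 0
# reduce to (I - P_sub) t = 1 over the non-target states {1, 2}.
A = np.eye(2) - P[:2, :2]
t = np.linalg.solve(A, np.ones(2))
print("E[T] =", t[0])  # the chain starts at state 1
```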
Suppose for this part of the problem that the process starts instead at state 2, i.e., $X_0 = 2$. Let $S$ be the first time by which both states 1 and 3 have been visited.

$E[S] =$
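$E[S]$ can be computed exactly by splitting on which of the two states is reached first and adding the appropriate first-passage time from there, but a short Monte Carlo estimate is an easy sanity check. A sketch with the same placeholder matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical placeholder matrix, as in the earlier sketches.
P = np.array([
    [0.6, 0.4, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.5, 0.5],
])

def expected_time_to_visit_both(n_runs=20_000):
    """Monte Carlo estimate of E[S]: steps until both states 1 and 3
    (indices 0 and 2) have been visited, starting from X0 = 2 (index 1)."""
    total = 0
    for _ in range(n_runs):
        state, visited, steps = 1, set(), 0
        while not {0, 2} <= visited:
            state = rng.choice(3, p=P[state])
            visited.add(state)
            steps += 1
        total += steps
    return total / n_runs

print("E[S] ~=", expected_time_to_visit_both())
```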