A machine is either working (state 1) or not working (state 2). If it is working one day, the probability that it will be broken the next day is 0.1. If it is not working one day, the probability that it will be working the next day is 0.8. Let Tn be the state of the machine n days from now. Assume the Markov assumption is satisfied, so that Tn is a Markov chain.

a) Give the transition matrix P for Tn.


If the machine is working on day 0:
prob. working on day 1 = .9
prob. broken on day 1 = .1

If the machine is broken on day 0:
prob. working on day 1 = .8
prob. broken on day 1 = .2

so

[W1  B1] = [W0  B0] * P,  where

P = | .9  .1 |
    | .8  .2 |
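
A minimal NumPy sketch of that matrix, if you want to check the arithmetic by computer; the name P and the (working, broken) row/column order just follow the convention used above:

import numpy as np

# Transition matrix: row = today's state, column = tomorrow's state,
# in the order (working, broken).
P = np.array([[0.9, 0.1],
              [0.8, 0.2]])

# Sanity check: each row of a stochastic matrix sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)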

For example, if the machine is working on day 0:

[W1  B1] = [1  0] * | .9  .1 |
                    | .8  .2 |
         = [.9  .1]
then

[W2  B2] = [.9  .1] * | .9  .1 |
                      | .8  .2 |
         = [.81+.08  .09+.02]
         = [.89  .11]
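
Continuing the sketch (P is the array defined above; d0 is just a name for the day-0 distribution), propagating the day-0 row vector forward reproduces both of these results:

d0 = np.array([1.0, 0.0])   # day 0: machine known to be working

d1 = d0 @ P                 # [0.9, 0.1]
d2 = d1 @ P                 # [0.89, 0.11]
print(d1, d2)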
and in general

[Wn  Bn] = [1  0] * | .9  .1 |^n
                    | .8  .2 |
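
In code (again assuming the arrays above), the n-step distribution is just a matrix power; n = 5 here is an arbitrary example value:

n = 5
dn = d0 @ np.linalg.matrix_power(P, n)   # [Wn, Bn]
print(dn)   # already very close to the long-run values [8/9, 1/9] ~ [.889, .111]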