Problem 2: Oscar's running shoes

Oscar goes for a run each morning. When he leaves his house for his run, he is equally likely to use either the front or the back door; and similarly, when he returns, he is equally likely to use either the front or the back door. Assume that his choice of the door through which he leaves is independent of his choice of the door through which he returns, and also assume that these choices are independent across days.

Oscar owns only five pairs of running shoes, each pair placed at one of the two doors. If there is at least one pair of shoes at the door through which he leaves, he wears a pair for his run; otherwise, he runs barefoot. When he returns from his run, if he wore shoes for that run, he takes off the shoes after the run and leaves them at the door through which he returns.

We wish to determine the long-term proportion of time that Oscar runs barefoot.

We consider a Markov chain with states {0,1,2,3,4,5}, where state i indicates that there are i pairs of shoes available at the front door in the morning, before Oscar leaves for his run. Specify the numerical values of the following transition probabilities.

For i∈{0,1,2,3,4},

p_{i,i+1} =

0.25 - correct

For i∈{1,2,3,4,5},

p_{i,i-1} =

0.25 - correct

For i∈{1,2,3,4},

p_{i,i} =

0.5 - correct

p_{0,0} =

0.75 - correct

p_{5,5} =

0.75 - correct
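Not part of the original post, but the transition probabilities above can be checked by enumerating Oscar's four equally likely (leave, return) door combinations. This is a small sketch (the state update function `next_state` is my own helper, not from the problem set): from an interior state i, the shoe count at the front door goes down when he leaves by the front and returns by the back, up when he leaves by the back and returns by the front, and is unchanged otherwise, each combination having probability 1/4.

```python
from itertools import product

# State i = number of shoe pairs at the front door (0 <= i <= 5).
# Oscar picks his exit and entry doors uniformly and independently.
def next_state(i, leave, ret):
    if leave == 'front' and i >= 1:          # takes a pair from the front door
        i -= 1
        if ret == 'front':
            i += 1                           # drops it back at the front
    elif leave == 'back' and (5 - i) >= 1:   # takes a pair from the back door
        if ret == 'front':
            i += 1                           # the pair migrates to the front
    # If the exit door has no shoes, he runs barefoot and i is unchanged.
    return i

# From an interior state like i = 3, the four outcomes are 2, 3, 4, 3,
# giving p_{i,i-1} = 1/4, p_{i,i+1} = 1/4, p_{i,i} = 1/2.
for leave, ret in product(['front', 'back'], repeat=2):
    print(f"leave {leave}, return {ret}: 3 -> {next_state(3, leave, ret)}")
```

At the boundary states the same enumeration gives p_{0,0} = 3/4 (he stays at 0 unless he leaves by the back and returns by the front) and, by symmetry, p_{5,5} = 3/4.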

Determine the steady-state probability that Oscar runs barefoot.

1/6 ≈ 0.1667 - correct
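As a numerical check (not part of the original answer), the chain's transition matrix is doubly stochastic, so the stationary distribution is uniform, pi_i = 1/6 for every state. A short numpy sketch that solves pi P = pi and computes the barefoot probability:

```python
import numpy as np

# Transition matrix for the number of shoe pairs at the front door (states 0..5).
P = np.zeros((6, 6))
for i in range(5):
    P[i, i + 1] = 0.25        # leave by the back (shoes there), return by the front
for i in range(1, 6):
    P[i, i - 1] = 0.25        # leave by the front (shoes there), return by the back
for i in range(6):
    P[i, i] = 1.0 - P[i].sum()  # remaining probability: the count stays the same

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.isclose(w, 1)].ravel())
pi /= pi.sum()

# Oscar runs barefoot when his exit door has no shoes: state 0 and he leaves
# by the front (prob 1/2), or state 5 and he leaves by the back (prob 1/2).
p_barefoot = 0.5 * pi[0] + 0.5 * pi[5]
print(pi)          # stationary distribution (uniform, 1/6 each)
print(p_barefoot)  # ≈ 1/6
```

Since pi_0 = pi_5 = 1/6, the barefoot probability is (1/2)(1/6) + (1/2)(1/6) = 1/6, matching the checked answer above.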

"answered in full"

3 answers

answers to "Problem 1: Steady-state convergence" , belonging to this very same problem set....

1.a. False

1.b. False

2.a. False

2.b. False

3.a. True

3.b. True
1. 0.44
2. 1/9, 2/9, 6/9
3. 1/9
Any hint on how to solve it, and not only the answers?