There is a mathematical theory called queuing theory that studies how computer jobs are fed to CPUs and how the resulting queues can be kept to a minimum. Show how a computer can estimate the average number of jobs waiting in a queue.
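(One common estimate, and presumably what this exercise is after, is the time average: record the number of jobs in line during each sampling interval and divide the total by the number of intervals, i.e. mean = (n_1 + n_2 + ... + n_T) / T.)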
Suppose that in a 5-second interval jobs arrive as indicated in the table below (arrival is assumed to occur at the beginning of each second). In the first second, jobs A and B arrive. During the second second, B moves to the head of the line (A has been completed, since each job takes 1 second to serve), and jobs C and D arrive, and so on.
Find:
a. The mean number of jobs in line
b. The mode of the number of jobs in line
Time in seconds | Jobs arriving
1               | A, B
2               | C, D
3               | -
4               | E, F
5               | -
Please, everyone, I need a step-by-step solution so that I will understand how to solve this kind of problem next time. Thanks in advance.
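In case it helps to check a by-hand solution, here is a minimal simulation sketch in Python. It assumes first-in-first-out service, one job completed per second, and that the job at the head of the line still counts as "in line" while it is being served, which is what the description above implies. The `arrivals` dictionary is just the table transcribed.

```python
from collections import deque
from statistics import mean, mode

# Arrivals at the start of each second, transcribed from the table above.
arrivals = {1: ["A", "B"], 2: ["C", "D"], 3: [], 4: ["E", "F"], 5: []}

queue = deque()   # the waiting line, head of the line first
counts = []       # number of jobs in line during each second

for second in range(1, 6):
    queue.extend(arrivals[second])  # new jobs join the back of the line
    counts.append(len(queue))       # jobs in line during this second
    if queue:
        queue.popleft()             # head-of-line job finishes (1 s of service)

print("Jobs in line each second:", counts)  # [2, 3, 2, 3, 2]
print("Mean:", mean(counts))                # 2.4
print("Mode:", mode(counts))                # 2
```

Under those assumptions the line holds 2, 3, 2, 3, 2 jobs during the five seconds, so the mean is (2 + 3 + 2 + 3 + 2) / 5 = 2.4 jobs and the mode (the most frequent count) is 2 jobs.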