Asked by Nancy
The distribution of cash withdrawals from the automatic teller machine at a
certain bank has a mean of $500 and a standard deviation of $70. To reduce the
incentive for robbery, the bank puts money into the machine every 12 hours and
keeps the amount deposited fairly close to the expected total withdrawals for a
12-hour period. If 100 independent withdrawals are expected in each 12-hour
period, how much should the bank put into the machine so that the probability
of running out of money is 0.05?
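A sketch of the standard approach (this is a reader's outline, not a submitted answer): by the central limit theorem, the total of 100 independent withdrawals is approximately normal with mean 100 × $500 = $50,000 and standard deviation √100 × $70 = $700. The machine runs out with probability 0.05 if the bank stocks the 95th percentile of that total. The computation, assuming the figures stated in the question:

```python
from statistics import NormalDist

n, mu, sigma = 100, 500, 70          # withdrawals per 12 h; mean and SD per withdrawal
total_mean = n * mu                  # E[total] = 50,000
total_sd = (n ** 0.5) * sigma        # SD[total] = sqrt(100) * 70 = 700, by independence

z = NormalDist().inv_cdf(0.95)       # 95th percentile of the standard normal, ~1.645
amount = total_mean + z * total_sd   # stock level giving a 5% chance of shortfall

print(round(amount, 2))              # roughly $51,151.40
```

So the bank should put in about $51,151; a z-table value of 1.645 gives 50,000 + 1.645 × 700 ≈ $51,151.50, the same answer to the nearest dollar.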