By the argument used in the proof of the weak law of large numbers, if the X_i are i.i.d. with mean mu and variance sigma^2, and if M_n = (X_1 + ... + X_n)/n, then we have an inequality of the form P(|M_n - mu| >= epsilon) <= (a*sigma^2)/n, for a suitable value of a.

1. If epsilon = 0.1, what is the value of a?
2. If we change epsilon = 0.1 to epsilon = 0.1/k, for k >= 1 (i.e., if we are interested in k times higher accuracy), how should we change n so that the value of the upper bound does not change from the value calculated in part 1? n should:
a. stay the same
b. increase by a factor of k
c. increase by a factor of k^2
d. decrease by a factor of k
e. none of the above

Answer:

For part 1, take epsilon = 0.1. Applying Chebyshev's inequality to M_n, which has mean mu and variance sigma^2/n, gives

P(|M_n - mu| >= epsilon) <= Var(M_n)/epsilon^2 = sigma^2/(n*epsilon^2).

Matching this with the form (a*sigma^2)/n shows that a = 1/epsilon^2. With epsilon = 0.1, this gives a = 1/(0.1)^2 = 100.

So, for part 1, the value of a is 100.
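As a quick sanity check, here is a short Python sketch of this bound. The Exponential(1) distribution, the sample size n = 500, and the number of trials are arbitrary illustrative choices, not part of the problem; the point is only that the simulated tail probability stays below (100*sigma^2)/n.

```python
import numpy as np

# Minimal sketch (illustrative setup, not part of the problem): estimate
# P(|M_n - mu| >= 0.1) by simulation and compare it with the Chebyshev-type
# bound (100 * sigma^2) / n.  Exponential(1) has mean 1 and variance 1.
rng = np.random.default_rng(0)

mu, sigma2 = 1.0, 1.0
n, eps, trials = 500, 0.1, 20_000

samples = rng.exponential(scale=1.0, size=(trials, n))
M_n = samples.mean(axis=1)                      # one sample mean per trial
empirical = np.mean(np.abs(M_n - mu) >= eps)    # estimated tail probability
bound = (100 * sigma2) / n                      # bound with a = 100

print(f"empirical P(|M_n - mu| >= 0.1) ~ {empirical:.4f}")
print(f"upper bound (100*sigma^2)/n    = {bound:.4f}")
```

With these illustrative numbers the empirical frequency comes out far below the bound of 0.2, as expected, since Chebyshev's inequality is usually conservative.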

For part 2, if we change epsilon to epsilon/k, the same Chebyshev argument gives

P(|M_n - mu| >= 0.1/k) <= sigma^2/(n*(0.1/k)^2) = (k^2 * 100 * sigma^2)/n.

The upper bound has been multiplied by k^2 relative to part 1. To bring it back down to the value (100*sigma^2)/n obtained in part 1, the sample size must grow by the same factor: replacing n by k^2*n gives

P(|M_{k^2*n} - mu| >= 0.1/k) <= (k^2 * 100 * sigma^2)/(k^2*n) = (100*sigma^2)/n,

which is exactly the bound from part 1.

Therefore, for part 2, n should increase by a factor of k^2. The answer is (c).
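A two-line check of this scaling (again just a sketch; sigma^2 = 1, n = 500, and k = 4 are arbitrary values): shrinking epsilon by a factor of k while multiplying n by k^2 leaves the bound sigma^2/(n*epsilon^2) unchanged.

```python
# Sketch: the bound sigma^2 / (n * eps^2) is unchanged when eps shrinks by a
# factor k and n grows by a factor k^2 (sigma^2 = 1 here for illustration).
def chebyshev_bound(n, eps, sigma2=1.0):
    return sigma2 / (n * eps**2)

n, eps, k = 500, 0.1, 4
print(chebyshev_bound(n, eps))             # 0.2, i.e. 100 * sigma^2 / n
print(chebyshev_bound(k**2 * n, eps / k))  # 0.2 again: same upper bound
```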
In summary, a = 100: for epsilon = 0.1 and any n,

P(|M_n - mu| >= 0.1) <= (100 * sigma^2) / n,

which is the inequality obtained from Chebyshev's inequality in the proof of the weak law of large numbers. Keeping this bound while tightening the accuracy to 0.1/k requires increasing n by a factor of k^2.