To compare the two scenarios, we calculate the mean squared error of the posterior-mean estimator in each and then compare the results.
In the first scenario, we have X = Θ + W, where Θ ~ N(0,1) and W ~ N(0,1) is independent of Θ. The relevant quantity is the error of the posterior-mean estimator Θ-hat = E[Θ | X]:

MSE = E[(Θ-hat - Θ)^2]

The formula from the exercise, with prior variance σ0^2 = 1 and noise variance σ1^2 = 1, gives:

MSE = 1/(1/σ0^2 + 1/σ1^2) = 1/(1/1 + 1/1) = 1/2

We can check this directly: with equal prior and noise variances the posterior mean is Θ-hat = X/2, so the error is Θ-hat - Θ = (W - Θ)/2, and since Θ and W are independent:

MSE = E[((W - Θ)/2)^2] = (1/4)(Var(W) + Var(Θ)) = (1/4)(1 + 1) = 1/2
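As a quick sanity check (not part of the exercise), a short Monte Carlo simulation of the first scenario can estimate this error empirically; the variable names below are illustrative, and the estimator Θ-hat = X/2 is the posterior mean for a N(0,1) prior with N(0,1) observation noise:

```python
import random

random.seed(0)
n_trials = 200_000
total_sq_error = 0.0
for _ in range(n_trials):
    theta = random.gauss(0.0, 1.0)   # Θ ~ N(0, 1)
    w = random.gauss(0.0, 1.0)       # W ~ N(0, 1), independent of Θ
    x = theta + w                    # observation X = Θ + W
    theta_hat = x / 2.0              # posterior mean E[Θ | X]
    total_sq_error += (theta_hat - theta) ** 2

mse = total_sq_error / n_trials
print(f"Scenario 1 MSE ≈ {mse:.3f}")  # close to 1/2
```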
In the second scenario, we have two observations X1 and X2 of the form X_i = Θ + W_i, where the W_i are standard normals, independent of each other and of Θ. Again the mean squared error is that of the posterior mean:

MSE = E[(Θ-hat - Θ)^2]

Since Θ ~ N(0, σ0^2) with σ0^2 treated as infinite, the posterior distribution of Θ given X1 and X2 approaches N(X-bar, 1/2), where X-bar = (X1 + X2)/2 is the sample mean; the posterior variance is 1/(1/σ0^2 + 1/1 + 1/1), which tends to 1/2 as σ0^2 → ∞.
Using this posterior distribution, the mean squared error is exactly the posterior variance:

MSE = E[(Θ - X-bar)^2] = 1/(1/σ0^2 + 1/1 + 1/1) → 1/(0 + 1 + 1) = 1/2

We can verify this directly. Using X_i = Θ + W_i:

X-bar = (X1 + X2)/2 = (Θ + W1 + Θ + W2)/2 = Θ + (W1 + W2)/2

so the estimation error is Θ - X-bar = -(W1 + W2)/2, which does not involve Θ at all. Since W1 and W2 are independent with unit variance:

MSE = E[(Θ - X-bar)^2] = Var((W1 + W2)/2) = (1/4)(Var(W1) + Var(W2)) = (1/4)(1 + 1) = 1/2

Therefore, the mean squared error in the second scenario is also 1/2.
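A Monte Carlo sketch of the second scenario confirms this numerically; here a large but finite prior standard deviation (sigma0 = 1000, an arbitrary stand-in for the "infinite" prior) is assumed, and the estimator is the sample mean of the two observations:

```python
import random

random.seed(1)
sigma0 = 1000.0                      # large prior std dev, stand-in for a "flat" prior
n_trials = 200_000
total_sq_error = 0.0
for _ in range(n_trials):
    theta = random.gauss(0.0, sigma0)      # Θ ~ N(0, σ0^2)
    x1 = theta + random.gauss(0.0, 1.0)    # X1 = Θ + W1
    x2 = theta + random.gauss(0.0, 1.0)    # X2 = Θ + W2
    x_bar = (x1 + x2) / 2.0                # posterior mean in the flat-prior limit
    total_sq_error += (x_bar - theta) ** 2

mse = total_sq_error / n_trials
print(f"Scenario 2 MSE ≈ {mse:.3f}")  # close to 1/2
```

Note that the error (x_bar - theta) reduces to (W1 + W2)/2, so the result does not depend on the value of sigma0.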
Comparing the mean squared error in the first and second scenarios, we have:

MSE in first scenario = 1/2
MSE in second scenario = 1/2

In the formula 1/(Σ 1/σi^2), a prior with variance 1 contributes exactly the same term as one extra observation with noise variance 1, so trading the informative prior for a second observation leaves the error unchanged. Therefore, the answer is:

c) the same in both scenarios.
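The exercise's formula 1/(Σ 1/σi^2) can also be evaluated directly for both scenarios; `mse_formula` below is a hypothetical helper name, with the prior variance passed as the first entry:

```python
def mse_formula(variances):
    """1 / sum_i (1 / sigma_i^2): the exercise's MSE formula.

    variances[0] is the prior variance; the rest are observation
    noise variances.
    """
    return 1.0 / sum(1.0 / v for v in variances)

# Scenario 1: prior variance 1, one observation with noise variance 1.
print(mse_formula([1.0, 1.0]))                # 0.5

# Scenario 2: "infinite" prior variance, two observations.
print(mse_formula([float("inf"), 1.0, 1.0]))  # 0.5
```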
In this exercise we want to understand a little better the formula
\frac{1}{\displaystyle{\sum_{i=0}^{n} \frac{1}{\sigma_i^2}}}
for the mean squared error by considering two alternative scenarios.
In the first scenario, \Theta \sim N(0,1) and we observe X = \Theta + W, where W \sim N(0,1) is independent of \Theta.
In the second scenario, the prior information on \Theta is extremely inaccurate: \Theta \sim N(0,\sigma_0^2), where \sigma_0^2 is so large that it can be treated as infinite. But in this second scenario we obtain two observations of the form X_i = \Theta + W_i, where the W_i are standard normals, independent of each other and of \Theta.
The mean squared error is:
a) smaller in the first scenario.
b) smaller in the second scenario.
c) the same in both scenarios.