a) The statement {\bf E}[\widetilde\Theta \mid \Theta =\theta ]=0 for all \theta is not necessarily true. It would say that the LMS estimator is conditionally unbiased, i.e. that {\bf E}[\widehat\Theta \mid \Theta =\theta ]=\theta for every \theta. Although the LMS error satisfies {\bf E}[\widetilde\Theta \mid X=x]=0 for every x, conditioning on \Theta rather than on X is a different statement, and it can fail: the LMS estimator is generally pulled toward the prior mean of \Theta, so its conditional expectation given \Theta =\theta need not equal \theta.
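As a minimal sketch, assume (purely for illustration) that the observation X is independent of \Theta and write \mu ={\bf E}[\Theta ]. Then the LMS estimator collapses to a constant and the conditional expectation of the error is nonzero for almost every \theta:

\[
\widehat\Theta = {\bf E}[\Theta \mid X] = {\bf E}[\Theta ] = \mu
\quad\Longrightarrow\quad
{\bf E}[\widetilde\Theta \mid \Theta =\theta ] = {\bf E}[\widehat\Theta -\Theta \mid \Theta =\theta ] = \mu -\theta \neq 0 \ \text{ whenever } \theta \neq \mu .
\]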
b) The property \textsf{Var}(\Theta )=\textsf{Var}(\widehat\Theta )+\textsf{Var}(\widetilde\Theta ) is not true for every estimator \widehat\Theta. It does hold for the LMS estimator, because the LMS error \widetilde\Theta is uncorrelated with \widehat\Theta, so the cross term in the variance decomposition vanishes. For an arbitrary estimator, however, \widehat\Theta and \widetilde\Theta can be correlated, and the identity can fail.
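To make the missing cross term explicit, expand the variance using the identity \Theta =\widehat\Theta -\widetilde\Theta:

\[
\textsf{Var}(\Theta ) = \textsf{Var}(\widehat\Theta ) + \textsf{Var}(\widetilde\Theta ) - 2\,\textsf{Cov}(\widehat\Theta ,\widetilde\Theta ).
\]

For the LMS estimator, \textsf{Cov}(\widehat\Theta ,\widetilde\Theta )=0, which recovers the stated decomposition. As a hypothetical counterexample for a general estimator, suppose X=\Theta (a perfect observation) and take \widehat\Theta =2X; then \widetilde\Theta =\Theta, so \textsf{Var}(\widehat\Theta )+\textsf{Var}(\widetilde\Theta )=5\,\textsf{Var}(\Theta )\neq \textsf{Var}(\Theta ) whenever \textsf{Var}(\Theta )>0.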
Let \widehat\Theta be an estimator of a random variable \Theta, and let \widetilde\Theta =\widehat\Theta -\Theta be the estimation error.
a) In this part of the problem, let \widehat\Theta be specifically the LMS estimator of \Theta. We have seen that for the case of the LMS estimator, {\bf E}[\widetilde\Theta \mid X=x]=0 for every x. Is it also true that {\bf E}[\widetilde\Theta \mid \Theta =\theta ]=0 for all \theta? Equivalently, is it true that {\bf E}[\widehat\Theta \mid \Theta =\theta ]=\theta for all \theta?
b) In this part of the problem, \widehat\Theta is no longer necessarily the LMS estimator of \Theta. Is the property \textsf{Var}(\Theta )=\textsf{Var}(\widehat\Theta )+\textsf{Var}(\widetilde\Theta ) true for every estimator \widehat\Theta?