Asked by Anon123

Let $\hat{\Theta}$ be an estimator of a random variable $\Theta$, and let $\tilde{\Theta} = \hat{\Theta} - \Theta$ be the estimation error.

a) In this part of the problem, let $\hat{\Theta}$ be specifically the LMS estimator of $\Theta$. We have seen that for the LMS estimator, $E[\tilde{\Theta} \mid X = x] = 0$ for every $x$. Is it also true that $E[\tilde{\Theta} \mid \Theta = \theta] = 0$ for all $\theta$? Equivalently, is it true that $E[\hat{\Theta} \mid \Theta = \theta] = \theta$ for all $\theta$?

b) In this part of the problem, $\hat{\Theta}$ is no longer necessarily the LMS estimator of $\Theta$. Is the property $\operatorname{Var}(\Theta) = \operatorname{Var}(\hat{\Theta}) + \operatorname{Var}(\tilde{\Theta})$ true for every estimator $\hat{\Theta}$?

Answers

Answered by Anonymous
a) NO. The LMS estimator $\hat{\Theta} = E[\Theta \mid X]$ satisfies $E[\tilde{\Theta} \mid X = x] = 0$ for every $x$, but it is not unbiased conditional on $\Theta$. Counterexample: if $X$ is independent of $\Theta$ (an uninformative observation), then $\hat{\Theta} = E[\Theta]$, a constant, so $E[\hat{\Theta} \mid \Theta = \theta] = E[\Theta] \neq \theta$ whenever $\theta \neq E[\Theta]$.
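To make this concrete, here is a small Monte Carlo sketch. The Gaussian model in it ($\Theta \sim N(0,1)$, $X = \Theta + W$ with $W \sim N(0,1)$ independent, for which the LMS estimator is the posterior mean $E[\Theta \mid X] = X/2$) is an assumed illustration, not part of the problem statement.

```python
# Assumed example (not from the problem statement): Theta ~ N(0,1),
# X = Theta + W with W ~ N(0,1) independent; the LMS estimator is X/2.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# E[error | X = x] ~ 0: estimate it by slicing samples with X near x = 1.
theta = rng.standard_normal(n)
x = theta + rng.standard_normal(n)
err = x / 2 - theta                    # error of the LMS estimator X/2
near_x = np.abs(x - 1.0) < 0.05
print("E[err | X ~= 1]:", err[near_x].mean())        # ~ 0

# E[error | Theta = theta0] is NOT 0: fix Theta = theta0, average over W.
theta0 = 1.0
w = rng.standard_normal(n)
err_given_theta = (theta0 + w) / 2 - theta0
print("E[err | Theta = 1]:", err_given_theta.mean())  # ~ -theta0/2 = -0.5
```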

b) NO. Writing $\Theta = \hat{\Theta} - \tilde{\Theta}$ gives $\operatorname{Var}(\Theta) = \operatorname{Var}(\hat{\Theta}) + \operatorname{Var}(\tilde{\Theta}) - 2\operatorname{Cov}(\hat{\Theta}, \tilde{\Theta})$. For the LMS estimator the cross term is zero (the error is uncorrelated with the estimate), so the identity holds; for an arbitrary estimator it need not. Counterexample: $\hat{\Theta} = -\Theta$ gives $\tilde{\Theta} = -2\Theta$, so $\operatorname{Var}(\hat{\Theta}) + \operatorname{Var}(\tilde{\Theta}) = 5\operatorname{Var}(\Theta) \neq \operatorname{Var}(\Theta)$ whenever $\operatorname{Var}(\Theta) > 0$.
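A companion sketch in the same assumed Gaussian model: the variance identity checks out for the LMS estimator $X/2$ but fails for the non-LMS choice $\hat{\Theta} = -\Theta$.

```python
# Same assumed model as above: Theta ~ N(0,1), X = Theta + W.
# The identity Var(Theta) = Var(Theta_hat) + Var(err) holds for the LMS
# estimator (zero cross term) but fails for Theta_hat = -Theta.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
theta = rng.standard_normal(n)
x = theta + rng.standard_normal(n)

for name, theta_hat in [("LMS X/2:", x / 2), ("non-LMS -Theta:", -theta)]:
    err = theta_hat - theta
    print(name, "Var(Theta) =", round(theta.var(), 3),
          " Var(hat) + Var(err) =", round(theta_hat.var() + err.var(), 3))
# LMS: both ~1.0.  -Theta: Var(Theta) ~1.0 but the sum is ~5.0.
```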
