For the model X=\Theta +W, and under the usual independence and normality assumptions for \Theta and W, the mean squared error of the LMS estimator is

\frac{1}{(1/\sigma _0^2)+(1/\sigma _1^2)},

where \sigma _0^2 and \sigma _1^2 are the variances of \Theta and W, respectively.
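As a quick numerical sanity check, here is a minimal NumPy Monte Carlo sketch (not part of the original problem; the zero means, the random seed, and the variable names sigma0_sq and sigma1_sq are my own assumptions) comparing the empirical error of the LMS estimator E[\Theta \mid X]=\frac{\sigma _0^2}{\sigma _0^2+\sigma _1^2}X with the formula above.

```python
# Monte Carlo check of the LMS mean squared error for X = Theta + W.
# Assumes zero-mean normal Theta and W (a special case of the setup above).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
sigma0_sq, sigma1_sq = 1.0, 1.0   # Var(Theta), Var(W)

theta = rng.normal(0.0, np.sqrt(sigma0_sq), n)
w = rng.normal(0.0, np.sqrt(sigma1_sq), n)
x = theta + w

# LMS estimator E[Theta | X] in the zero-mean jointly normal case
theta_hat = sigma0_sq / (sigma0_sq + sigma1_sq) * x

empirical_mse = np.mean((theta - theta_hat) ** 2)
formula_mse = 1.0 / (1.0 / sigma0_sq + 1.0 / sigma1_sq)
print(empirical_mse, formula_mse)   # both are close to 0.5
```

With sigma0_sq = sigma1_sq = 1, both printed values should be close to 1/2, matching the formula.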

Suppose now that we change the observation model to Y=3\Theta +W. In some sense the “signal” \Theta has a stronger presence relative to the noise term W, so we should expect a smaller mean squared error. Suppose \sigma _0^2=\sigma _1^2=1. The mean squared error under the original model X=\Theta +W is then 1/2. In contrast, the mean squared error under the new model Y=3\Theta +W is


\frac{1}{(1/\sigma _0^2)+(3^2/\sigma _1^2)}=\frac{1}{1+9}=\frac{1}{10}.

To see why, note that observing Y=3\Theta +W is equivalent to observing Y/3=\Theta +W/3, and W/3 has variance \sigma _1^2/3^2; applying the original formula with \sigma _1^2 replaced by \sigma _1^2/3^2 gives the expression above. The result, 1/10, is indeed smaller than 1/2, as expected.
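The corrected value can be checked the same way. The sketch below (again assuming zero means, with my own variable names) uses the LMS estimator E[\Theta \mid Y]=\frac{3\sigma _0^2}{9\sigma _0^2+\sigma _1^2}Y and should print values close to 1/10.

```python
# Monte Carlo check of the LMS mean squared error for Y = 3*Theta + W,
# with sigma0_sq = sigma1_sq = 1 (zero-mean normal Theta and W assumed).
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
sigma0_sq, sigma1_sq = 1.0, 1.0

theta = rng.normal(0.0, np.sqrt(sigma0_sq), n)
w = rng.normal(0.0, np.sqrt(sigma1_sq), n)
y = 3.0 * theta + w

# LMS estimator E[Theta | Y] in the zero-mean jointly normal case
theta_hat = 3.0 * sigma0_sq / (9.0 * sigma0_sq + sigma1_sq) * y

empirical_mse = np.mean((theta - theta_hat) ** 2)
formula_mse = 1.0 / (1.0 / sigma0_sq + 3.0 ** 2 / sigma1_sq)
print(empirical_mse, formula_mse)   # both are close to 0.1
```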