For the model X=\Theta +W, and under the usual independence and normality assumptions for \Theta and W, the mean squared error of the LMS estimator is
\frac{1}{(1/\sigma _0^2)+(1/\sigma _1^2)},
where \sigma _0^2 and \sigma _1^2 are the variances of \Theta and W, respectively.
Suppose now that we change the observation model to Y=3\Theta +W. In some sense the “signal” \Theta has a stronger presence relative to the noise term W, so we should expect a smaller mean squared error. Suppose \sigma _0^2=\sigma _1^2=1. The mean squared error of the original model X=\Theta +W is then 1/2. In contrast, the mean squared error of the new model Y=3\Theta +W is
\frac{1}{(1/\sigma _0^2)+(3^2/\sigma _1^2)}=\frac{1}{1+9}=\frac{1}{10}.
To see this, note that dividing the observation by 3 gives the equivalent model Y/3=\Theta +W/3, whose noise variance is \sigma _1^2/9; plugging \sigma _1^2/9 into the original formula yields the expression above. The error 1/10 is indeed smaller than 1/2, as expected.
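As a sanity check, the closed-form error can be compared against a Monte Carlo estimate of the LMS estimator's squared error. This is an illustrative sketch (the function names `lms_mse` and `simulate_mse` are not from the text); it assumes zero-mean normal \Theta and W, as in the model above.

```python
import random

def lms_mse(a, var0=1.0, var1=1.0):
    """Closed-form LMS mean squared error for the model Y = a*Theta + W,
    with Theta ~ N(0, var0) and W ~ N(0, var1) independent."""
    return 1.0 / (1.0 / var0 + a**2 / var1)

def simulate_mse(a, var0=1.0, var1=1.0, n=200_000, seed=0):
    """Monte Carlo estimate of the same quantity.

    For jointly normal (Theta, Y), the LMS estimator is linear:
    E[Theta | Y] = gain * Y with gain = a*var0 / (a^2*var0 + var1).
    """
    rng = random.Random(seed)
    gain = a * var0 / (a**2 * var0 + var1)
    total = 0.0
    for _ in range(n):
        theta = rng.gauss(0.0, var0**0.5)
        y = a * theta + rng.gauss(0.0, var1**0.5)
        total += (gain * y - theta) ** 2
    return total / n

# a = 1 recovers the original model X = Theta + W (MSE 1/2);
# a = 3 gives the new model Y = 3*Theta + W (MSE 1/10).
print(lms_mse(1), lms_mse(3))
print(simulate_mse(3))
```

The simulation should land close to 0.1 for a = 3, matching the calculation above.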