Helpppp pls!!!!

The probability of Heads of a coin is y, and this bias y is itself the realization of a random variable Y which is uniformly distributed on the interval [0,1].

To estimate the bias of this coin, we flip it 6 times and define the (observed) random variable N as the number of Heads in this experiment.

Throughout this problem, you may find the following formula useful:
For all positive integers n and k,
∫_0^1 x^n (1-x)^k dx = n! k! / (n+k+1)!.
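For example, with n = k = 1 this gives ∫_0^1 x(1-x) dx = 1!·1!/3! = 1/6, which matches the direct computation 1/2 - 1/3 = 1/6.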

Given the observation N=3, calculate the posterior distribution of the bias Y. That is, find the conditional distribution of Y, given N=3.

For 0 ≤ y ≤ 1,

f_{Y|N}(y | N=3) = ?


What is the LMS estimate of Y, given N=3?

(Enter an exact expression or a decimal accurate to at least 2 decimal places.)

Ŷ_LMS = ?

What is the resulting conditional mean squared error of the LMS estimator, given N=3?

(Enter an exact expression or a decimal accurate to at least 2 decimal places.)


5 answers

This is my solution.

1) f_{Y|N}(y | N=3) = 84 y^6 (1-y)^3
2)0.0636363
3)0.0383746

Is it correct?
Just to make sure you're heading in the right direction:
Note that f_Y(y) = 1 for 0 ≤ y ≤ 1 (the density of the uniform distribution on the unit interval).
Bayes' rule: the posterior is your prior f_Y(y) times the model p_{N|Y}(n | y), divided by p_N(n).
The "trick" is to see p_N(n) as simply a normalizing constant.

Given Y = y, our model follows a binomial distribution:
p_{N|Y}(3 | y) = (6 choose 3) y^3 (1-y)^3.
Applying Bayes' rule:
f_{Y|N}(y | 3) ∝ (6 choose 3) y^3 (1-y)^3 · 1,
but note that for the conditional PDF to be valid it must integrate to 1 over [0,1].
The normalizing constant can be determined using ∫_0^1 x^n (1-x)^k dx = n! k! / (n+k+1)!; we can ignore the (6 choose 3), since it gets absorbed into this constant. The rest should be simple :) and your answer should be of the form c y^3 (1-y)^3.
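Spelling out that normalization step, using the hint formula with n = k = 3:

∫_0^1 y^3 (1-y)^3 dy = 3!·3!/7! = 36/5040 = 1/140, so c · (1/140) = 1, i.e. c = 140.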
For part b - how does the LMS estimate relate to the conditional expectation of Y given N = 3?

For part c - how does the LMS estimator's conditional mean squared error relate to the conditional variance?
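In symbols, the facts those two questions point at:

Ŷ_LMS = E[Y | N = 3] = ∫_0^1 y f_{Y|N}(y | N = 3) dy,

E[(Y - Ŷ_LMS)^2 | N = 3] = Var(Y | N = 3),

i.e. the LMS estimate is the conditional mean, and its conditional mean squared error is the conditional variance.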
1. (1/7) y^3 (1-y)^3
2. 1/2
3. 0.00223
1. 140 y^3 (1-y)^3
2. 1/2
3. 1/36
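A quick numerical sanity check of that last set of answers - a minimal sketch in Python (the choice of scipy.integrate.quad is mine, not from the thread):

from scipy import integrate

# Posterior claimed above: f(y) = 140 * y^3 * (1 - y)^3 on [0, 1]
def posterior(y):
    return 140.0 * y**3 * (1.0 - y)**3

# It should integrate to 1 if it is a valid PDF.
total, _ = integrate.quad(posterior, 0.0, 1.0)

# LMS estimate = posterior mean; conditional MSE = posterior variance.
mean, _ = integrate.quad(lambda y: y * posterior(y), 0.0, 1.0)
var, _ = integrate.quad(lambda y: (y - mean) ** 2 * posterior(y), 0.0, 1.0)

print(total)  # ~1.0
print(mean)   # ~0.5
print(var)    # ~0.0277... = 1/36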