A communication channel accepts the input X ∈ {0, 1, 2, 3} and outputs Y = X + Z, where Z is a binary random variable taking values -1 and +1 with equal probability. Assume X and Z are independent and all values of the input X have equal probability.

a) Find the entropy of Y.

b) Find the entropy of X given that Y = 1.
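A brute-force numerical check of both parts is possible by enumerating the eight equally likely (X, Z) pairs; the script below (an illustrative sketch, not part of the original problem) computes the distribution of Y and the conditional distribution of X given Y = 1, then evaluates their Shannon entropies.

```python
from fractions import Fraction
from math import log2
from collections import defaultdict

# Joint distribution of (X, Z): X uniform on {0,1,2,3},
# Z = -1 or +1 with equal probability, independent of X,
# so each of the 8 pairs has probability 1/8.
joint = {(x, z): Fraction(1, 8) for x in range(4) for z in (-1, 1)}

# Marginal distribution of Y = X + Z.
p_y = defaultdict(Fraction)
for (x, z), p in joint.items():
    p_y[x + z] += p

def entropy(dist):
    """Shannon entropy (in bits) of a dict mapping values to probabilities."""
    return -sum(float(p) * log2(float(p)) for p in dist.values() if p > 0)

H_Y = entropy(p_y)  # part (a)

# Conditional distribution of X given the event Y = 1,
# obtained by restricting the joint to pairs with x + z == 1 and renormalizing.
p_x_given_y1 = defaultdict(Fraction)
for (x, z), p in joint.items():
    if x + z == 1:
        p_x_given_y1[x] += p
total = sum(p_x_given_y1.values())
p_x_given_y1 = {x: p / total for x, p in p_x_given_y1.items()}

H_X_given_Y1 = entropy(p_x_given_y1)  # part (b)

print(H_Y)           # 2.5 bits: Y takes -1,0,3,4 w.p. 1/8 each and 1,2 w.p. 1/4 each
print(H_X_given_Y1)  # 1.0 bit: given Y = 1, X is 0 or 2 with equal probability
```

The enumeration confirms the hand calculation: H(Y) = 4·(1/8)·log 8 + 2·(1/4)·log 4 = 2.5 bits, and since Y = 1 leaves X equally likely to be 0 or 2, H(X | Y = 1) = 1 bit.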