Suppose X is a random variable and Y = aX + b is a linear function of X. Show that the correlation of X and Y is 1 if a < 0.


CORRECTION:
Suppose X is a random variable and Y = aX + b is a linear function of X. Show that the correlation of X and Y is -1 if a < 0.
cor(X,Y) = cov(X,Y) / sqrt(var(X) * var(Y))
Anyone?
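
A sketch of the derivation, picking up from that formula (assuming a ≠ 0 and var(X) > 0, since otherwise the correlation is undefined):

\[
\operatorname{cov}(X, Y) = \operatorname{cov}(X, aX + b) = a\,\operatorname{var}(X),
\qquad
\operatorname{var}(Y) = \operatorname{var}(aX + b) = a^2\,\operatorname{var}(X),
\]
so
\[
\operatorname{cor}(X, Y)
= \frac{\operatorname{cov}(X, Y)}{\sqrt{\operatorname{var}(X)\,\operatorname{var}(Y)}}
= \frac{a\,\operatorname{var}(X)}{\sqrt{\operatorname{var}(X)\cdot a^2\,\operatorname{var}(X)}}
= \frac{a\,\operatorname{var}(X)}{|a|\,\operatorname{var}(X)}
= \frac{a}{|a|},
\]
which equals -1 when a < 0 (and +1 when a > 0).

For a quick numerical sanity check, here is a minimal sketch in NumPy; the values a = -2 and b = 3 are just example choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)   # any X with positive variance will do
a, b = -2.0, 3.0               # example coefficients with a < 0
y = a * x + b

# np.corrcoef returns the 2x2 correlation matrix; entry [0, 1] is cor(X, Y)
print(np.corrcoef(x, y)[0, 1])  # prints a value numerically equal to -1
```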