Asked by Jerry

Suppose X is a random variable and Y = aX + b is a linear function of X. Show that the correlation of X and Y is 1 if a < 0.

Answers

Answered by Jerry
CORRECTION:
Suppose X is a random variable and Y = aX + b is a linear function of X. Show that the correlation of X and Y is -1 if a < 0.
Answered by Jerry
cor(X,Y) = cov(X,Y) / sqrt(var(X) * var(Y))
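The derivation can be completed from that formula using the bilinearity of covariance and var(aX + b) = a^2 var(X); a short sketch:

```latex
\begin{align*}
\operatorname{cov}(X, Y) &= \operatorname{cov}(X, aX + b) = a\,\operatorname{var}(X), \\
\operatorname{var}(Y)    &= \operatorname{var}(aX + b) = a^2\,\operatorname{var}(X), \\
\operatorname{cor}(X, Y) &= \frac{a\,\operatorname{var}(X)}{\sqrt{\operatorname{var}(X)\cdot a^2\,\operatorname{var}(X)}}
  = \frac{a}{|a|} = -1 \quad \text{when } a < 0.
\end{align*}
```

(For a > 0 the same computation gives a/|a| = +1, which is why the sign of a decides the sign of the correlation.)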
Answered by Jerry
Anyone?
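As a numerical sanity check (a sketch using NumPy; the values a = -2 and b = 3 are an arbitrary example, any a < 0 behaves the same):

```python
import numpy as np

# Draw a random sample for X and build Y = aX + b with a < 0.
rng = np.random.default_rng(0)
x = rng.normal(size=1000)
a, b = -2.0, 3.0  # example coefficients; any a < 0 works
y = a * x + b

# Sample correlation of X and Y; should be -1 up to floating-point error.
r = np.corrcoef(x, y)[0, 1]
print(r)
```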