Let X be a random variable that takes values between 0 and c only, for some c >= 0, so that P(0 <= X <= c) = 1. Then, var(X) <= c^2 / 4. True or false?
3 answers
This statement is true; it is Popoviciu's inequality on variances. The Bernoulli random variable with parameter p = 1/2 and c = 1 is not a counterexample but rather shows the bound is tight: var(X) = p(1-p) = 1/4, which equals c^2 / 4 exactly. Equality is attained, and since the statement uses <=, it is not violated.
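As a quick sanity check, here is a minimal simulation sketch (assuming Python with NumPy; the sample size and seed are illustrative choices, not part of the answer) showing that the Bernoulli(1/2) case sits exactly at the bound while a uniform variable stays strictly below it:

```python
import numpy as np

rng = np.random.default_rng(0)  # illustrative seed
c = 1.0
n = 1_000_000  # large sample so the empirical variance is close to the truth

# Bernoulli(1/2) scaled to {0, c}: variance should sit exactly at c^2/4 = 0.25
bernoulli = c * rng.integers(0, 2, size=n)
print(f"Bernoulli(1/2): var = {bernoulli.var():.4f}, bound c^2/4 = {c**2 / 4:.4f}")

# Uniform on [0, c]: variance should be c^2/12 ~ 0.0833, well under the bound
uniform = rng.uniform(0, c, size=n)
print(f"Uniform[0, c]:  var = {uniform.var():.4f}, bound c^2/4 = {c**2 / 4:.4f}")
```

The first line prints a variance of about 0.25, matching the bound; the second prints about 0.0833, comfortably below it.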
The statement is in fact universally true: for any random variable X with P(0 <= X <= c) = 1, it is a special case of Popoviciu's inequality on variances. A sharper, distribution-dependent bound is the Bhatia–Davis inequality:
var(X) <= (c - E(X)) E(X)
where E(X) is the expected value of X (automatically finite, since X is bounded). The proof is short: 0 <= X <= c implies X^2 <= cX, so E(X^2) <= cE(X), and therefore var(X) = E(X^2) - E(X)^2 <= cE(X) - E(X)^2 = (c - E(X)) E(X). Since the product (c - E(X)) E(X) is maximized when E(X) = c/2, where it equals c^2 / 4, the original bound follows.
Note that when X follows a uniform distribution on [0, c], then E(X) = c/2 and var(X) = c^2 / 12,
which is well below c^2 / 4, so neither bound is tight for the uniform distribution. Equality in var(X) <= c^2 / 4 is attained only by the two-point distribution that puts probability 1/2 on each of 0 and c. In every case, however, the statement "var(X) <= c^2 / 4" is true, regardless of the distribution of X and regardless of whether the support interval is open or closed.
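To illustrate how the Bhatia–Davis bound tightens with the mean, here is a minimal sketch (again assuming Python with NumPy; the Beta distributions and the seed are illustrative choices of laws supported on [0, c], not part of the answer) comparing the empirical variance against both bounds:

```python
import numpy as np

rng = np.random.default_rng(1)  # illustrative seed
c, n = 1.0, 1_000_000

# A few distributions supported on [0, c] with different means
samples = {
    "Beta(2, 5)": rng.beta(2, 5, size=n) * c,  # mean c * 2/7
    "Beta(5, 5)": rng.beta(5, 5, size=n) * c,  # mean c/2
    "Uniform":    rng.uniform(0, c, size=n),   # mean c/2
}

for name, x in samples.items():
    mean, var = x.mean(), x.var()
    bhatia_davis = (c - mean) * mean  # distribution-dependent bound
    popoviciu = c**2 / 4              # worst-case bound over all means
    print(f"{name:10s} var={var:.4f}  (c-E(X))E(X)={bhatia_davis:.4f}  c^2/4={popoviciu:.4f}")
```

In each row the empirical variance stays below (c - E(X)) E(X), which in turn stays below c^2 / 4, matching the chain of inequalities above.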