Asked in: Digital Communication
X and Y are jointly distributed, discrete-valued random variables. The relation between their joint entropy H(X,Y) and their individual entropies H(X) and H(Y) is:
H(X,Y) ≤ H(X) + H(Y), equality holds when X, Y are independent
H(X,Y) ≤ H(X) + H(Y), equality holds when X, Y are uncorrelated
H(X,Y) ≥ H(X) + H(Y), equality holds when X, Y are independent
H(X,Y) ≥ H(X) + H(Y), equality holds when X, Y are uncorrelated
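For reference, the standard information-theoretic result matches the first option: joint entropy is subadditive, H(X,Y) ≤ H(X) + H(Y), with equality if and only if X and Y are independent; uncorrelatedness is not enough. A minimal LaTeX sketch of the usual argument, which rests on the non-negativity of the Kullback-Leibler divergence (equivalently, of the mutual information I(X;Y)):

% H(X) + H(Y) - H(X,Y) is exactly the mutual information I(X;Y).
% Marginals are written as sums over the joint, using p(x) = \sum_y p(x,y).
\begin{align*}
H(X) + H(Y) - H(X,Y)
  &= -\sum_{x,y} p(x,y)\log p(x) \;-\; \sum_{x,y} p(x,y)\log p(y)
     \;+\; \sum_{x,y} p(x,y)\log p(x,y) \\
  &= \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}
   \;=\; D\!\left(p_{XY} \,\middle\|\, p_X\, p_Y\right)
   \;=\; I(X;Y) \;\ge\; 0.
\end{align*}
% Equality holds iff p(x,y) = p(x)p(y) for every (x,y), i.e. iff X and Y are independent.

Uncorrelated is strictly weaker than independent: for example, X uniform on {-1, 0, 1} and Y = X^2 have zero covariance, yet Y is determined by X, so H(X,Y) = H(X) < H(X) + H(Y) and equality fails.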
Answers
There are no human answers yet.