Asked in digital communication
X and Y are jointly distributed discrete-valued random variables. The relation between their joint entropy H(X,Y) and their individual entropies H(X) and H(Y) is:
H(X,Y)≤H(X)+H(Y), equality holds when X,Y are independent
H(X,Y)≤H(X)+H(Y), equality holds when X,Y are uncorrelated
H(X,Y)≥H(X)+H(Y), equality holds when X,Y are independent
H(X,Y)≥H(X)+H(Y), equality holds when X,Y are uncorrelated
Answers
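The first option is correct: joint entropy is subadditive, H(X,Y) ≤ H(X) + H(Y), with equality exactly when X and Y are independent (uncorrelatedness alone is not sufficient, since uncorrelated variables can still be dependent). A minimal numerical sketch in Python, using two illustrative 2×2 joint distributions (made up here, not from the question) with the same marginals, shows the inequality and the equality case:

```python
import math

def entropy(p):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing
    return -sum(q * math.log2(q) for q in p if q > 0)

def marginals(joint):
    # 2x2 joint distribution flattened as [p00, p01, p10, p11]
    mx = [joint[0] + joint[1], joint[2] + joint[3]]
    my = [joint[0] + joint[2], joint[1] + joint[3]]
    return mx, my

# Illustrative marginals (assumed for this sketch)
px = [0.5, 0.5]
py = [0.25, 0.75]

# Independent case: p(x,y) = p(x) * p(y)
joint_indep = [a * b for a in px for b in py]

# Dependent case with the SAME marginals px, py
joint_dep = [0.20, 0.30, 0.05, 0.45]

for label, joint in [("independent", joint_indep), ("dependent", joint_dep)]:
    mx, my = marginals(joint)
    print(f"{label}: H(X,Y) = {entropy(joint):.4f}, "
          f"H(X) + H(Y) = {entropy(mx) + entropy(my):.4f}")
```

For the independent case the two quantities coincide; for the dependent case H(X,Y) is strictly smaller, since the gap H(X) + H(Y) − H(X,Y) is the mutual information I(X;Y) ≥ 0, which vanishes only under independence.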