Joint entropy is a measure of the uncertainty associated with a set of random variables.

Definition:

$$
H(X,Y) = -\mathbb{E}_{(X,Y)\sim p(x,y)}\left[\log p(x,y)\right] = -\int_{X,Y} p(x,y)\,\log p(x,y)\,dx\,dy
$$
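As a quick illustration, here is a minimal sketch in Python of the discrete analogue of this definition, $H(X,Y) = -\sum_{x,y} p(x,y)\log p(x,y)$. The joint probability table and the `joint_entropy` helper are illustrative assumptions, not part of the original text:

```python
import numpy as np

# Hypothetical joint probability table for two binary random variables X and Y.
# Row index = value of X, column index = value of Y.
p_xy = np.array([
    [0.25, 0.25],   # P(X=0, Y=0), P(X=0, Y=1)
    [0.25, 0.25],   # P(X=1, Y=0), P(X=1, Y=1)
])

def joint_entropy(p, base=2):
    """Joint entropy of a discrete joint probability table, in units set by `base`."""
    p = p[p > 0]                       # drop zero cells, using the convention 0·log 0 = 0
    return -np.sum(p * np.log(p)) / np.log(base)

print(joint_entropy(p_xy))             # 2.0 bits: two independent fair coin flips
```

Here the result is 2 bits because the table describes two independent uniform binary variables, so the joint uncertainty is simply the sum of the individual uncertainties (1 bit each).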