Definition

In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables considered together.
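For two discrete random variables X and Y with joint probability mass function P(x, y), the standard definition of the joint entropy in bits is H(X, Y) = −Σ_x Σ_y P(x, y) log₂ P(x, y), with the convention that 0 log 0 = 0; it extends to any finite set of variables by summing over all joint outcomes. A minimal Python sketch of this formula (the function name `joint_entropy` and the dict-based representation of the distribution are illustrative choices, not from the source):

```python
import math

def joint_entropy(joint_probs):
    """Joint entropy in bits of a discrete joint distribution,
    given as a dict mapping outcome tuples (x, y, ...) to probabilities.
    Zero-probability outcomes are skipped, following the 0 log 0 = 0 convention."""
    return -sum(p * math.log2(p) for p in joint_probs.values() if p > 0)

# Two independent fair coin flips: four equally likely joint outcomes,
# so the joint entropy is log2(4) = 2 bits.
dist = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(joint_entropy(dist))  # 2.0
```

When the variables are independent, as in this example, the joint entropy equals the sum of the individual entropies (1 bit per fair coin); in general it is at most that sum.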