Definition
Conditional entropy quantifies the amount of information needed to describe the outcome of a random variable given that the value of another random variable is known.
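For concreteness, the standard discrete-case formula can be stated as follows; the symbols used here (Y for the variable being described, X for the known variable, p for the probabilities, and the base of the logarithm) are notational choices not fixed by the text above.

H(Y \mid X) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y \mid x)

Equivalently, H(Y \mid X) = H(X, Y) - H(X), i.e. the uncertainty remaining about Y once X is known; taking the logarithm in base 2 gives the result in bits.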