Definition.

Let $\Omega$ be a probability ensemble and let $X$ and $Y$ be discrete random variables on $\Omega$. Then, we define the conditional entropy of $Y$ given $X$ as:

$$H(Y \mid X) = \sum_{x} \Pr[X = x] \, H(Y \mid X = x),$$

where

$$H(Y \mid X = x) = -\sum_{y} \Pr[Y = y \mid X = x] \log \Pr[Y = y \mid X = x],$$

consistent with the definition of entropy.


Intuitively, conditional entropy quantifies the amount of “new information” we can expect to gain from $Y$ after having already observed the information in $X$.
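The definition above can be sketched in code. The following is a minimal illustration (the function name and dictionary representation of the joint distribution are our own choices, and we use base-2 logarithms so that entropy is measured in bits); it uses the identity $H(Y \mid X) = \sum_{x,y} \Pr[X = x, Y = y] \log \frac{\Pr[X = x]}{\Pr[X = x, Y = y]}$, which follows by expanding the conditional probabilities in the definition:

```python
import math

def conditional_entropy(joint):
    """Conditional entropy H(Y|X) in bits, where
    joint[(x, y)] = Pr[X = x, Y = y]."""
    # Marginal distribution Pr[X = x], obtained by summing over y.
    px = {}
    for (x, _), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            # Pr[X=x, Y=y] * (-log2 Pr[Y=y | X=x])
            h += p * math.log2(px[x] / p)
    return h

# Y fully determined by X: observing Y adds no new information.
deterministic = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(deterministic))  # → 0.0

# X and Y independent fair coins: H(Y|X) = H(Y) = 1 bit.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(independent))  # → 1.0
```

The two examples mark the extremes: when $Y$ is a function of $X$ the conditional entropy is zero, and when $Y$ is independent of $X$ it equals the full entropy $H(Y)$.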