Definition.
Let $X$ and $Y$ be random variables with a shared outcome space. The mutual information between $X$ and $Y$ is defined as follows:
$$I(X;Y) = H(X) - H(X \mid Y).$$
Consistent with the intuition behind conditional entropy, mutual information quantifies the amount of information in $X$ that is made redundant by an observation of $Y$.
We can decompose the information in $X$ as $H(X) = I(X;Y) + H(X \mid Y)$: the term $I(X;Y)$ is the information already present in $Y$, and $H(X \mid Y)$ is the new information found only in $X$.
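The decomposition above can be checked numerically. The following sketch uses a small, hypothetical joint distribution (invented for illustration, not taken from the text) and computes $H(X \mid Y)$ via the identity $H(X \mid Y) = H(X,Y) - H(Y)$:

```python
import math

def entropy(dist):
    """Shannon entropy of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution p(x, y) over a 2x2 outcome space.
joint = {
    ("a", "a"): 0.4,
    ("a", "b"): 0.1,
    ("b", "a"): 0.1,
    ("b", "b"): 0.4,
}

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px)
h_x_given_y = entropy(joint) - entropy(py)  # H(X|Y) = H(X,Y) - H(Y)
mutual_info = h_x - h_x_given_y             # I(X;Y) = H(X) - H(X|Y)

# Verify the decomposition H(X) = I(X;Y) + H(X|Y).
assert abs(h_x - (mutual_info + h_x_given_y)) < 1e-12
```

Because the two variables here tend to agree, the mutual information comes out strictly positive, while an independent joint distribution would give $I(X;Y) = 0$.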