Information Theory: Entropy, Relative Entropy, and Mutual Information (Personal Summary Notes)

Martin Wafula
Nov 8, 2020

--

Fig: Mind Map Summary of Entropy, Mutual Information, and Relative Entropy

Entropy

- Measure of the average uncertainty in the random variable

  • It equals the average number of bits required to describe the random variable.

Note: The expected length of an optimal binary code for a random variable X lies between H(X) and H(X) + 1, where H(X) is the entropy of X (a small numerical sketch follows below).
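
As a minimal sketch: for a discrete distribution p, the entropy is H(X) = -Σ p(x) log2 p(x). The coin distributions below are toy examples of my own, just to give a feel for the numbers.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                     # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

print(entropy([0.5, 0.5]))   # 1.0   (fair coin: maximum uncertainty)
print(entropy([0.9, 0.1]))   # ~0.47 (biased coin: less uncertainty)
```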

Relative Entropy

- Measure of the "distance" between two probability distributions (not a true metric: it is not symmetric and does not satisfy the triangle inequality)

  • It is a measure of the inefficiency of assuming that the distribution is q when the true distribution is p (see the sketch below).
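
As a sketch: D(p||q) = Σ p(x) log2( p(x) / q(x) ). The two toy distributions below are made-up examples; they also show that D(p||q) ≠ D(q||p) in general.

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_x p(x) log2(p(x)/q(x)), in bits."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                     # terms with p(x) = 0 contribute 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p = [0.5, 0.5]   # true distribution
q = [0.9, 0.1]   # assumed distribution
print(kl_divergence(p, q))   # ~0.74 extra bits per symbol for assuming q
print(kl_divergence(q, p))   # ~0.53 -- not symmetric, so not a true distance
```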

Mutual Information

- Measure of the amount of information that one random variable contains about another random variable

  • It is the reduction in the uncertainty of one random variable due to knowledge of the other (see the sketch below).
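
As a sketch: I(X;Y) = Σ p(x,y) log2( p(x,y) / (p(x) p(y)) ), which equals H(X) - H(X|Y). The joint distribution below is a toy example of my own.

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ), in bits."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal distribution of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal distribution of Y
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px * py)[mask]))

# Toy joint distribution of two binary variables that tend to agree.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(mutual_information(pxy))                               # ~0.28 bits

# Independent variables share no information about each other.
print(mutual_information(np.outer([0.5, 0.5], [0.5, 0.5])))  # 0.0
```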

TASK: If you are new to information theory, work through the proofs of the equations in the attached figure.

Reference: Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory”, 2nd Edition. (I recommend this book to beginners).

--

Martin Wafula

DPhil candidate in Engineering Science at the University of Oxford. My interests are in information theory, graph compression, and network topology inference.