Important Distinction

This page relates to Entropy in Information Theory, not Thermodynamics.

What?

Entropy quantifies the amount of uncertainty, unpredictability, or randomness in a system. The higher the unpredictability, the more information a message potentially carries. A maximum-entropy source is therefore the most informative, while a minimum-entropy source (a certain outcome) conveys no information at all: a fair coin flip carries one full bit, but a two-headed coin carries none.

Mathematically:

$$H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i)$$

Where:

  • $p(x_i)$ is the probability of the occurrence of the $i$-th outcome.
  • $b$ is the logarithm base, typically base 2 (for information in bits).
  • $X$ is a discrete random variable with possible values $x_1, x_2, \ldots, x_n$.
  • $-\log_b p(x_i)$ refers to the surprising-ness (surprisal) of the $i$-th outcome.
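
Below is a minimal sketch of this formula in Python; the function name `shannon_entropy` and the example distributions are illustrative choices, not part of this page.

```python
import math

def shannon_entropy(probs, base=2):
    """Compute H(X) = -sum(p(x_i) * log_b(p(x_i))) for a discrete distribution.

    Zero-probability outcomes are skipped, following the usual
    convention that 0 * log(0) contributes 0 to the sum.
    """
    return sum(-p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each flip carries less information.
print(shannon_entropy([0.9, 0.1]))   # ~0.469

# A certain outcome carries no information at all.
print(shannon_entropy([1.0, 0.0]))   # 0.0
```

Note how the fair coin maximizes entropy over two outcomes, matching the claim above that the most unpredictable source is the most informative one.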