/ˌɪnfərˈmeɪʃən ˈθiəri/

noun … “Mathematics of encoding, transmitting, and measuring information.”

Information Theory is the formal mathematical framework for quantifying information, analyzing communication systems, and determining the limits of data transmission and compression. Introduced by Claude Shannon in 1948, it underpins modern digital communications, coding theory, cryptography, and data compression. At its core, Information Theory defines how much uncertainty a message carries, how efficiently information can be transmitted over a noisy channel, and how error-correcting codes can approach the theoretical limits.

Key characteristics of Information Theory include:

  • Entropy: a measure of the average information content or uncertainty of a random variable.
  • Mutual information: quantifies the amount of information shared between two random variables, i.e., how much observing one reduces uncertainty about the other (see the first sketch after this list).
  • Channel capacity: the maximum rate at which data can be reliably transmitted over a communication channel; this bound is known as the Shannon Limit (see the workflow example below).
  • Error correction: forms the theoretical basis for LDPC, Turbo Codes, and other forward error correction (FEC) schemes (see the repetition-code sketch after this list).
  • Data compression: defines limits for lossless and lossy compression, guiding algorithms such as Huffman coding and arithmetic coding (see the Huffman sketch after this list).
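
To make mutual information concrete, the sketch below computes I(X; Y) for two binary variables from their joint distribution. The joint probabilities are illustrative assumptions, not values from any particular system.

-- Python sketch: mutual information of two binary variables
import math

# Illustrative joint distribution P(X, Y); the values are assumed for this example.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions P(X) and P(Y), obtained by summing out the other variable.
px = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
py = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

# I(X; Y) = sum over (x, y) of P(x, y) * log2(P(x, y) / (P(x) * P(y)))
mi = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in joint.items())
print(f"Mutual information: {mi:.4f} bits")
-- Output: Mutual information: 0.2781 bits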
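
LDPC and Turbo Codes are far too involved for a glossary entry, but the idea behind all FEC, adding structured redundancy so the receiver can correct errors, can be shown with the simplest possible scheme: a 3x repetition code with majority-vote decoding. This is a minimal sketch of the principle, not how modern codes work.

-- Python sketch: 3x repetition code with majority-vote decoding
def encode(bits, n=3):
    # Repeat each bit n times: the simplest forward error correction scheme.
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    # Majority vote over each block of n repeated bits.
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

message = [1, 0, 1, 1]
codeword = encode(message)
codeword[4] ^= 1  # flip one bit to simulate channel noise
print(decode(codeword) == message)  # a single error per block is corrected
-- Output: True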
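
For compression, the sketch below builds a Huffman code for the same illustrative three-symbol source used in the entropy example later in this entry, and checks its average codeword length against the entropy bound. The average length matches the entropy exactly here only because the assumed probabilities are powers of 1/2.

-- Python sketch: Huffman coding for a three-symbol source
import heapq
import math

probs = {"a": 0.5, "b": 0.25, "c": 0.25}  # illustrative source distribution

# Repeatedly merge the two least probable nodes, prefixing 0/1 onto their codes.
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)  # tie-breaker so tuples never compare their dict element
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: "0" + code for s, code in c1.items()}
    merged.update({s: "1" + code for s, code in c2.items()})
    heapq.heappush(heap, (p1 + p2, counter, merged))
    counter += 1

codes = heap[0][2]
avg_len = sum(probs[s] * len(code) for s, code in codes.items())
entropy = -sum(p * math.log2(p) for p in probs.values())
print(f"Codes: {codes}, average length: {avg_len}, entropy: {entropy} bits")
-- Output: Codes: {'a': '0', 'b': '10', 'c': '11'}, average length: 1.5, entropy: 1.5 bits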

Workflow example: In a digital communication system, Information Theory is applied to calculate the entropy of a source signal, design an efficient code to transmit the data, and select error-correcting schemes that maximize throughput while maintaining reliability. Engineers analyze the signal-to-noise ratio (SNR) and bandwidth to approach the Shannon Limit while minimizing errors.
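
As a sketch of that last step, the Shannon-Hartley theorem gives the capacity of a band-limited channel with Gaussian noise as C = B * log2(1 + SNR). The bandwidth and SNR figures below, and the helper shannon_capacity, are assumed purely for illustration.

-- Python sketch: channel capacity via the Shannon-Hartley theorem
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # C = B * log2(1 + SNR), with SNR as a linear power ratio (not dB).
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 20                        # assumed SNR of 20 dB
snr_linear = 10 ** (snr_db / 10)   # 20 dB corresponds to a power ratio of 100
capacity = shannon_capacity(1e6, snr_linear)  # assumed 1 MHz of bandwidth
print(f"Channel capacity: {capacity / 1e6:.2f} Mbit/s")
-- Output: Channel capacity: 6.66 Mbit/s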

-- Python: calculate the entropy of a discrete source
import math

# Shannon entropy: H(X) = -sum over x of p(x) * log2(p(x)), measured in bits.
probabilities = [0.5, 0.25, 0.25]
entropy = -sum(p * math.log2(p) for p in probabilities)
print(f"Entropy: {entropy} bits")
-- Output: Entropy: 1.5 bits

Conceptually, Information Theory is like designing a postal system: it determines how many distinct messages can be reliably sent over a limited channel, how to package them efficiently, and how to ensure they arrive intact even in the presence of noise or interference.

See Shannon Limit, LDPC, Turbo Codes, FEC.