🚅 Information Theory for People in a Hurry
A quick guide to Entropy, Cross-Entropy and KL Divergence. Python code provided. 🐍

Considered the Magna Carta of the Information Age, Claude Shannon’s seminal 1948 paper posed a groundbreaking question:
How can we quantify communication?
This question laid the foundation for information theory, revolutionising technology in ways still felt today. Shannon’s insights underpin how we measure, store, and transmit information, contributing to breakthroughs in signal processing, data compression (e.g., Zip files, CDs), the Internet, and artificial intelligence. Beyond technology, his work has influenced diverse fields such as neurobiology, statistical physics, and computer science (e.g., cybersecurity, cloud computing, and machine learning).
In this article, we focus on three key metrics: entropy, cross-entropy, and KL divergence, along with the notion of self-information on which they are built. These concepts bridge probability theory and real-world applications, serving as practical tools for analysis and optimisation in data science and machine learning.
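As a quick preview before we dive in, here is a minimal Python sketch of these four quantities, assuming base-2 logarithms (so everything is measured in bits) and NumPy, with two hypothetical toy distributions p and q chosen purely for illustration:

```python
import numpy as np

def self_information(p_x):
    """Surprise of a single outcome with probability p_x, in bits."""
    return -np.log2(p_x)

def entropy(p):
    """Average self-information of distribution p, in bits."""
    p = np.asarray(p)
    return -np.sum(p * np.log2(p))

def cross_entropy(p, q):
    """Average bits needed when events follow p but we encode using q."""
    p, q = np.asarray(p), np.asarray(q)
    return -np.sum(p * np.log2(q))

def kl_divergence(p, q):
    """Extra bits paid for assuming q instead of the true distribution p."""
    return cross_entropy(p, q) - entropy(p)

# Hypothetical two-outcome weather distributions (e.g., sun vs. rain)
p = [0.75, 0.25]  # "true" distribution
q = [0.5, 0.5]    # assumed/model distribution

print(entropy(p))           # ≈ 0.811 bits
print(cross_entropy(p, q))  # = 1.0 bits
print(kl_divergence(p, q))  # ≈ 0.189 bits
```

We will unpack each of these definitions, and the intuition behind them, in the sections that follow.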
I’ll introduce these metrics and then explore an interesting use case: message length optimisation, using a toy example of weather…