Pinned · Published in Towards AI
🚅 Information Theory for People in a Hurry
A quick guide to Entropy, Cross-Entropy and KL Divergence. Python code provided. 🐍
Mar 7

Pinned · Published in TDS Archive
🤷 Quantifying Uncertainty — A Data Scientist’s Intro To Information Theory — Part 2/5: Entropy
Gain intuition into Entropy and master its applications in Machine Learning and Data Analysis. Python code provided. 🐍
Feb 3

Pinned · Published in TDS Archive
🚪🚪🐐 Lessons in Decision Making from the Monty Hall Problem
A journey into three intuitions: Common, Bayesian and Causal
Oct 24, 2024

Pinned · Published in TDS Archive
➡️ Start Asking Your Data “Why?” - A Gentle Intro To Causality
Begin your causal journey with visual demonstrations and resources.
Sep 12, 2024

Pinned · Published in TDS Archive
🧠🧹 Causality - Mental Hygiene for Data Science
Harness The Power of Why with Causal Tools.
Nov 28, 2024

Published in Towards AI
🎲🎲 Quantifying Dependence — A Data Scientist’s Intro To Information Theory — Part 4/5
Gain an intuition into two-variable statistics as a prelude to understanding Mutual Information. Python code included. 🐍
Apr 28

Published in Data Science Collective
📏 Quantifying Misalignment — A Data Scientist’s Intro To Information Theory — Part 3/5
Gain intuition into Cross-Entropy, KL-Divergence and their applications in Machine Learning and Data Analysis. Python code provided. 🐍
Feb 24

Published in TDS Archive
😲 Quantifying Surprise — A Data Scientist’s Intro To Information Theory — Part 1/5: Foundations
Gain intuition into Information Theory and master its applications in Machine Learning and Data Analysis. Python code provided. 🐍
Feb 3

Anatole, thanks for the kind words and for your question! Are you referring to causality or Simpson's Paradox?
Oct 24, 2024