Nautilus

How Shannon Entropy Imposes Fundamental Limits on Communication

What’s a message, really? Claude Shannon recognized that the elemental ingredient is surprise.

If someone tells you a fact you already know, they’ve essentially told you nothing at all. Whereas if they impart a secret, it’s fair to say something has really been communicated.

This distinction is at the heart of Claude Shannon’s theory of information. Introduced in an epochal 1948 paper, “A Mathematical Theory of Communication,” it provides a rigorous mathematical framework for quantifying the amount of information needed to accurately send and receive a message, as determined by the degree of uncertainty about what the intended message might say.
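Shannon’s quantity is the entropy, H = −Σ p·log₂(p), summed over the possible messages: the more uncertain the outcome, the more bits each message carries. As an illustrative sketch (the code and function name are mine, not from the article), the behavior can be seen with coin flips:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each flip carries 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A two-headed trick coin is perfectly predictable: a flip tells you nothing.
print(shannon_entropy([1.0, 0.0]))   # 0.0

# A biased coin falls in between (about 0.469 bits).
print(shannon_entropy([0.9, 0.1]))
```

Telling someone a fact they already know corresponds to the zero-entropy case: the outcome was certain, so no information changes hands.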

Which is to say, it’s time for an example.

In one scenario, I have a trick coin—it’s heads on both sides. I’m going to flip
