
24: Language and Entropy (Information Theory in Language)

From Breaking Math Podcast

Length: 48 minutes
Released: Mar 7, 2018
Format: Podcast episode

Description

Information theory was founded in 1948 by Claude Shannon, and it describes, both qualitatively and quantitatively, the limits and processes involved in communication. Roughly speaking, when two entities communicate, they have a message, a medium, confusion (noise), encoding, and decoding, and through these they transfer information between them. The amount of information that can be transmitted may be increased or decreased by manipulating any of these variables. One of the practical, and original, applications of information theory is to models of language. So what is entropy? How can we say language has it? And what structures within language, viewed through the lens of information theory, reveal deep insights about the nature of language itself?

--- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app

Support this podcast: https://anchor.fm/breakingmathpodcast/support
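For listeners who want a concrete handle on entropy before pressing play, here is a minimal sketch (not material from the episode itself): Shannon entropy is H = -Σ p(x) log2 p(x), the average number of bits needed per symbol. The snippet below estimates the character-level entropy of a sentence from its empirical letter frequencies; the sample string and the comparison figures in the comments are illustrative assumptions, not numbers from the show.

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Estimate H = -sum(p * log2(p)) in bits per character,
    using the empirical character frequencies of `text` as p."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Hypothetical sample sentence, chosen only for illustration.
sample = "information theory was founded in 1948 by claude shannon"
print(f"{shannon_entropy(sample):.2f} bits per character")
# English prose typically lands near 4 bits/char at this single-character
# level; Shannon's own experiments with longer context put the rate closer
# to 1 bit/char, one way of quantifying how redundant language is.
```

The gap between the single-character estimate and the estimate with longer context is exactly the kind of structure in language the episode explores.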

Titles in the series (100)

Breaking Math is a podcast that aims to make math accessible to everyone, and make it enjoyable. Every other week, topics such as chaos theory, forbidden formulas, and more will be covered in detail. If you have 45 or so minutes to spare, you're almost guaranteed to learn something new!SFTM, our umbrella organization, also has another (explicit) podcast called "Nerd Forensics" all about nerd (and other) culture. Check it out wherever you get podcasts! Support this podcast: https://anchor.fm/breakingmathpodcast/support