
Ted Gibson: The Structure and Purpose of Language

From The Gradient: Perspectives on AI


Length:
133 minutes
Released:
Jan 18, 2024
Format:
Podcast episode

Description

In episode 107 of The Gradient Podcast, Daniel Bashir speaks to Professor Ted Gibson.

Ted is a Professor of Cognitive Science at MIT. He leads TedLab, which investigates why languages look the way they do; the relationship between culture and cognition, including language; and how people learn, represent, and process language.

Have suggestions for future podcast guests (or other feedback)? Let us know here or reach us at editor@thegradient.pub

Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS

Follow The Gradient on Twitter

Outline:

* (00:00) Intro
* (02:13) Prof Gibson’s background
* (05:33) The computational linguistics community and NLP, engineering focus
* (10:48) Models of brains
* (12:03) Prof Gibson’s focus on behavioral work
* (12:53) How dependency distances impact language processing
* (14:03) Dependency distances and the origin of the problem
* (18:53) Dependency locality theory
* (21:38) The structures languages tend to use
* (24:58) Sentence parsing: structural integrations and memory costs
* (36:53) Reading strategies vs. ordinary language processing
* (40:23) Legalese
* (46:18) Cross-dependencies
* (50:11) Number as a cognitive technology
* (54:48) Experiments
* (1:03:53) Why counting is useful for Western societies
* (1:05:53) The Whorf hypothesis
* (1:13:05) Language as Communication
* (1:13:28) The noisy channel perspective on language processing
* (1:27:08) Fedorenko lab experiments—language for thought vs. communication and Chomsky’s claims
* (1:43:53) Thinking without language, inner voices, language processing vs. language as an aid for other mental processing
* (1:53:01) Dependency grammars and a critique of Chomsky’s grammar proposals, LLMs
* (2:08:48) LLM behavior and internal representations
* (2:12:53) Outro

Links:

* Ted’s lab page and Twitter
* Re-imagining our theories of language
* Research — linguistic complexity and dependency locality theory
  * Linguistic complexity: locality of syntactic dependencies (1998)
  * The Dependency Locality Theory: A Distance-Based Theory of Linguistic Complexity (2000)
  * Consequences of the Serial Nature of Linguistic Input for Sentential Complexity (2005)
  * Large-scale evidence of dependency length minimization in 37 languages (2015)
  * Dependency locality as an explanatory principle for word order (2020)
  * Robust effects of working memory demand during naturalistic language comprehension in language-selective cortex (2022)
  * A resource-rational model of human processing of recursive linguistic structure (2022)
* Research — language processing / communication and cross-linguistic universals
  * Number as a cognitive technology: Evidence from Pirahã language and cognition (2008)
  * The communicative function of ambiguity in language (2012)
  * The rational integration of noisy evidence and prior semantic expectations in sentence interpretation (2013)
  * Color naming across languages reflects color use (2017)
  * How Efficiency Shapes Human Language (2019)

Get full access to The Gradient at thegradientpub.substack.com/subscribe

Titles in the series (100)

Interviews with various people who research, build, or use AI, including academics, engineers, artists, entrepreneurs, and more. thegradientpub.substack.com