
BI 144 Emily M. Bender and Ev Fedorenko: Large Language Models

From Brain Inspired

Length: 72 minutes
Released: Aug 17, 2022
Format: Podcast episode

Description

Check out my short video series about what's missing in AI and Neuroscience.

Support the show to get full episodes and join the Discord community.

Large language models, now often called "foundation models," are the models du jour in AI, built on the transformer architecture. In this episode, I bring together Evelina Fedorenko and Emily M. Bender to discuss how language models stack up to our own language processing and generation (models and brains both excel at next-word prediction), whether language evolved in humans for complex thoughts or for communication (communication, says Ev), whether language models grasp the meaning of the text they produce (Emily says no), and much more.

Evelina Fedorenko is a cognitive scientist who runs the EvLab at MIT, where she studies the neural basis of language. Her lab has amassed a large body of data suggesting that language did not evolve to help us think complex thoughts, as Noam Chomsky has argued, but rather for efficient communication. She has also recently been comparing activity in language models to activity in the brain's language network, finding commonality in their ability to predict upcoming words.

Emily M. Bender is a computational linguist at the University of Washington. Recently she has been considering whether language models understand the meaning of the language they produce (no), whether we should keep scaling language models as is the current practice (not really), how linguistics can inform language models, and more.

EvLab
Emily's website
Twitter: @ev_fedorenko; @emilymbender

Related papers:
Language and thought are not the same thing: Evidence from neuroimaging and neurological patients. (Fedorenko)
The neural architecture of language: Integrative modeling converges on predictive processing. (Fedorenko)
On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? (Bender)
Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data (Bender)

0:00 - Intro
4:35 - Language and cognition
15:38 - Grasping for meaning
21:32 - Are large language models producing language?
23:09 - Next-word prediction in brains and models
32:09 - Interface between language and thought
35:18 - Studying language in nonhuman animals
41:54 - Do we understand language enough?
45:51 - What do language models need?
51:45 - Are LLMs teaching us about language?
54:56 - Is meaning necessary, and does it matter how we learn language?
1:00:04 - Is our biology important for language?
1:04:59 - Future outlook

Neuroscience and artificial intelligence work better together. Brain Inspired is a celebration and exploration of the ideas driving our progress to understand intelligence. I interview experts about their work at the interface of neuroscience, artificial intelligence, cognitive science, philosophy, psychology, and more: the symbiosis of these overlapping fields, how they inform each other, where they differ, what the past brought us, and what the future brings. Topics include computational neuroscience, supervised machine learning, unsupervised learning, reinforcement learning, deep learning, convolutional and recurrent neural networks, decision-making science, AI agents, backpropagation, credit assignment, neuroengineering, neuromorphics, emergence, philosophy of mind, consciousness, general AI, spiking neural networks, data science, and a lot more. The podcast is not produced for a general audience. Instead, it aims to educate, challenge, inspire, and hopefully entertain those interested in learning more about neuroscience and AI.