The Future of the Transformer Part 1 with Trey Kollmer | H100 Chips will Supercharge AI Hardware
From "The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis
Length:
82 minutes
Released:
Oct 18, 2023
Format:
Podcast episode
Description
Trey Kollmer joins Nathan Labenz for a roundup of the latest AI research! They discuss Microsoft’s Self-Taught Optimizer (STOP) research and Google’s FreshLLMs, how H100 chips will supercharge development of programs with GPT-4 level compute, LLM representation of space and time, and more! If you're looking for an ERP platform, check out our sponsor, NetSuite: http://netsuite.com/cognitive
SPONSORS: NetSuite | Omneky
NetSuite has 25 years of providing financial software for all your business needs. More than 36,000 businesses have already upgraded to NetSuite by Oracle, gaining visibility and control over their financials, inventory, HR, eCommerce, and more. If you're looking for an ERP platform ✅ head to NetSuite: http://netsuite.com/cognitive and download your own customized KPI checklist.
Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with the click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off.
LINKS:
FreshLLMs: https://arxiv.org/abs/2310.03214
Microsoft Self-Taught Optimizer (STOP): https://arxiv.org/abs/2310.02304
LLMs Represent Space and Time: https://paperswithcode.com/paper/language-models-represent-space-and-time
Deep Neural Networks Tend to Extrapolate Predictably: https://arxiv.org/pdf/2310.00873.pdf
TIMESTAMPS:
(00:00:00) – Introduction
(00:00:56) – Update to WGA Strike
(00:03:00) – Trey Kollmer's background
(00:06:00) – Scaling compute for AI training experiments with GPT-4 as reference point
(00:09:00) – Inflection's plan to acquire 22,000 H100s to reach GPT-4 scale compute in 5 days
(00:12:00) – Addressing knowledge cutoff in LLMs using search engines
(00:15:00) – Inserting structured search results into prompts with metadata
(00:16:07) – Sponsors: NetSuite | Omneky
(00:18:00) – Comparing approach to Perplexity system
(00:18:08) – Fresh LLMs
(00:21:00) – Microsoft’s Self-taught Optimizer (STOP): Recursive self-improvement framework
(00:24:00) – STOP framework works with GPT-4 but not GPT-3.5
(00:27:00) – STOP removed sandbox flag in some cases
(00:30:00) – LLMs represent space and time with probe models
(00:33:00) – Visualizations show emergence of spatial maps
(00:33:14) – OpenAI rumours
(00:36:00) – Techniques like linear probes and holdout studies
(00:39:00) – DNNs extrapolate predictably by falling back to ignorance
(00:42:00) – Testing different architectures, loss functions, distribution shifts
(00:45:00) – Design systems to be conservative out of distribution
(00:48:00) – Potential for recursive architecture search
(00:50:21) – LLMs Represent Space and Time
(00:51:00) – Vision API enabling more capable web agents
(00:54:00) – Discussion of research insights
(00:57:00) – Thoughts on stochastic parrots debate
(01:11:25) – Deep Neural Networks Tend to Extrapolate Predictably
X/Social
@labenz (Nathan)
@treyko (Trey)
@CogRev_Podcast