E36: Your Model, Your Weights with MosaicML's Abhi Venigalla and Jonathan Frankle
From "The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis
Length:
62 minutes
Released:
Jun 16, 2023
Format:
Podcast episode
Description
In this episode, Nathan sits down with Jonathan Frankle, Chief Scientist, and Abhi Venigalla, Research Scientist, of MosaicML. They discuss Mosaic's custom LLMs, the customers seeking Mosaic out and what their journeys and use cases look like, and exciting developments in Mosaic's research, including their new inference platform and the MPT-7B-StoryWriter-65k+ model.
The Cognitive Revolution is a part of the Turpentine podcast network. To learn more: www.turpentine.co
TIMESTAMPS:
(00:00) Episode Preview
(06:04) Mosaic’s business model
(07:28) Who uses Mosaic’s custom LLMs? What does their data look like?
(09:55) Mosaic’s use cases for custom LLMs
(12:47) How much extraction and summarization was done by humans pre-LLMs?
(15:28) Sponsor: Omneky
(21:50) The journeys of Mosaic’s customers and would a Wendy’s LLM know about a Big Mac?
(25:46) The curriculum model and fine-tuning
(29:10) Language models in the life sciences
(33:20) How raw can data be before it becomes a problem?
(35:44) Using the output of the bulk pre-training process vs. additional post-training
(38:30) Redteaming as a service
(39:40) Mosaic’s inference platform
(41:53) Spending one cent on 20,000 tokens, how is that cent distributed?
(46:00) Selling compute on a dedicated capacity basis
(47:30) Oracle and AWS
(49:50) The StoryWriter model and its 65,000-token context window
(54:35) The transition from finite parameters to an infinite attention matrix
LINKS:
MosaicML: https://www.mosaicml.com/
MPT-7B Storywriter Model: https://huggingface.co/mosaicml/mpt-7b-storywriter
TWITTER:
@jefrankle (Jonathan)
@abhi_venigalla (Abhi)
@MosaicML (Mosaic)
@CogRev_Podcast
@labenz (Nathan)
@eriktorenberg (Erik)
SPONSOR:
Thank you Omneky for sponsoring The Cognitive Revolution. Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with a click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off.
Music Credit: MusicLM