Coding in Collaboration with AI with Sourcegraph CTO Beyang Liu
From No Priors: Artificial Intelligence | Technology | Startups
Length:
47 minutes
Released:
Jan 18, 2024
Format:
Podcast episode
Description
Coding in collaboration with AI can reduce human toil in the software development process, leading to more accurate and less tedious work for coding teams. This week on No Priors, Sarah talked with Beyang Liu, the cofounder and CTO of Sourcegraph, which builds tools that help developers innovate faster. Their most recent launch was an AI coding assistant called Cody. Beyang has spent his entire career thinking about how humans can work in conjunction with AI to write better code.
Sarah and Beyang talk about how Sourcegraph is augmenting the coding process in a way that ensures accuracy and efficiency, starting with robust, high-quality context. They also consider what the future of software development could look like in a world where AI can generate high-quality code on its own, and where that leaves humans in the coding process.
Sign up for new podcasts every week. Email feedback to show@no-priors.com
Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @beyang
Show Notes:
(0:00) Beyang Liu’s experience
(0:52) Sourcegraph premise
(2:20) AI and finding flow
(4:18) Developing LLMs in code
(6:46) Cody explanation
(7:56) Unlocking AI code generation
(11:00) Search architecture in LLMs
(16:02) Quality assurance in data sets
(18:03) Future of Cody
(22:48) Constraints in AI code generation
(30:28) Lessons from Beyang’s research days
(33:17) Benefits of small models
(35:49) Future of software development
(42:14) What skills will be valued down the line