Teaching Bots Learn by Watching Human Behavior - Ep. 67
From The AI Podcast
Length:
37 minutes
Released:
Aug 22, 2018
Format:
Podcast episode
Description
Robots following coded instructions to complete a task? Old school. Robots learning to do things by watching how humans do it? That’s the future. Earlier this year, Stanford’s Animesh Garg and Marynel Vázquez shared their research in a talk on “Generalizable Autonomy for Robotic Mobility and Manipulation” at the GPU Technology Conference last week. We caught up with them to learn more about generalizable autonomy - the idea that a robot should be able to observe human behavior, and learn to imitate it in a way that’s applicable to a variety of tasks and situations. Like learning to cook by watching YouTube videos, or figuring out how to cross a crowded room for another.
Titles in the series (100)
Ep. 21: Live at GTC - How AI and VR Intersect: Are AI and VR the peanut butter and chocolate of … by The AI Podcast