Bayesian Optimization for Hyperparameter Tuning with Scott Clark - TWiML Talk #50
From The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
Length: 47 minutes
Released: Oct 2, 2017
Format: Podcast episode
Description
As you all know, a few weeks ago I spent some time in SF at the Artificial Intelligence Conference. While I was there, I had just enough time to sneak away and catch up with Scott Clark, Co-Founder and CEO of SigOpt, a company whose software automatically tunes your model’s hyperparameters through Bayesian optimization. We dive pretty deeply into that process over the course of this discussion, while hitting on topics like exploration vs. exploitation, Bayesian regression, heterogeneous configuration models, and covariance kernels. I had a great time and learned a ton, but be forewarned, this is most definitely a Nerd Alert show! Notes for this show can be found at twimlai.com/talk/50
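For listeners who want a concrete feel for the ideas covered in the episode, here is a minimal, illustrative sketch of Bayesian optimization: a Gaussian-process surrogate with an RBF covariance kernel, plus an upper-confidence-bound acquisition rule that trades off exploration against exploitation. This is a toy example under simplifying assumptions (1-D search space, noiseless objective, fixed kernel length scale), not SigOpt's implementation.

```python
# Toy Bayesian-optimization loop (illustrative sketch, not SigOpt's method).
# Surrogate: Gaussian-process regression with an RBF covariance kernel.
# Acquisition: upper confidence bound (UCB), balancing exploration/exploitation.
import numpy as np

def rbf_kernel(a, b, length_scale=0.3):
    """Squared-exponential (RBF) covariance between two point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_obs, y_obs, x_grid, noise=1e-6):
    """Posterior mean and std. dev. of a zero-mean GP on x_grid."""
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    K_s = rbf_kernel(x_obs, x_grid)
    alpha = np.linalg.solve(K, y_obs)
    mu = K_s.T @ alpha
    v = np.linalg.solve(K, K_s)
    var = 1.0 - np.sum(K_s * v, axis=0)   # diag of RBF prior is 1
    return mu, np.sqrt(np.maximum(var, 0.0))

def objective(x):
    # Stand-in for an expensive model-training run; x is the "hyperparameter".
    return -(x - 0.6) ** 2 + 0.05 * np.sin(15 * x)

x_grid = np.linspace(0.0, 1.0, 200)
x_obs = np.array([0.1, 0.9])              # two initial evaluations
y_obs = objective(x_obs)

for _ in range(8):                        # small evaluation budget
    mu, sigma = gp_posterior(x_obs, y_obs, x_grid)
    ucb = mu + 2.0 * sigma                # large sigma -> explore; large mu -> exploit
    x_next = x_grid[np.argmax(ucb)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, objective(x_next))

best = x_obs[np.argmax(y_obs)]
print(f"best hyperparameter found: {best:.3f}")
```

The UCB weight (here 2.0) is the knob Scott discusses in spirit: a larger weight favors points where the surrogate is uncertain (exploration), a smaller one favors points predicted to score well (exploitation).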