The Complexity of Learning Neural Networks
From Data Skeptic
Length:
39 minutes
Released:
Oct 20, 2017
Format:
Podcast episode
Description
Over the past several years, we have seen many success stories in machine learning brought about by deep learning techniques. While the practical success of deep learning has been phenomenal, formal guarantees have been lacking: our theoretical understanding of the techniques central to the ongoing big-data revolution remains, at best, insufficient for rigorous analysis. In this episode of Data Skeptic, host Kyle Polich welcomes guest John Wilmes, a postdoctoral researcher in mathematics at Georgia Tech, to discuss the efficiency of neural network learning through the lens of complexity theory.
Titles in the series (100)
[MINI] Bayesian Updating: In this minisode, we discuss Bayesian updating - the process by which one calculates how likely a hypothesis is to be true given one's prior belief and any new evidence. by Data Skeptic
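As a quick illustration of the update rule the minisode describes (a sketch, not taken from the episode itself), Bayesian updating follows Bayes' theorem:

P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}

Here P(H) is the prior belief in hypothesis H, P(E \mid H) is the likelihood of the new evidence E if H were true, P(E) is the overall probability of observing E, and P(H \mid E) is the updated (posterior) belief after accounting for the evidence.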