
Neural Net Dropout

From Linear Digressions



Length:
19 minutes
Released:
Oct 2, 2017
Format:
Podcast episode

Description

Neural networks are complex models with many parameters, which makes them prone to overfitting.  There's a surprisingly simple way to guard against this: randomly drop hidden units (along with their connections) during training, a technique known as dropout.  It seems counterintuitive that undermining the structural integrity of the neural net makes it robust against overfitting, but in the world of neural nets, weirdness is just how things go sometimes.

Relevant links: https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
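The mechanic described above can be sketched in a few lines of NumPy. This is an illustrative "inverted dropout" sketch, not code from the episode or the paper; the function name, drop probability, and shapes are assumptions for the example.

```python
import numpy as np

def dropout(activations, p_drop=0.5, train=True, rng=None):
    """Inverted dropout: randomly zero hidden units during training and
    rescale the survivors so the expected activation is unchanged."""
    if not train or p_drop == 0.0:
        return activations  # at test time, the full network is used
    rng = np.random.default_rng() if rng is None else rng
    keep = 1.0 - p_drop
    mask = rng.random(activations.shape) < keep  # True = unit survives
    return activations * mask / keep

# Example: activations of one hidden layer (batch of 4, 8 units)
h = np.ones((4, 8))
h_train = dropout(h, p_drop=0.5, train=True)   # some units zeroed, rest scaled up
h_test = dropout(h, train=False)               # unchanged at test time
```

Dividing by `keep` at training time (rather than multiplying by it at test time) means inference needs no special handling, which is why many frameworks implement dropout this way.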

Titles in the series (100)

Linear Digressions is a podcast about machine learning and data science. Machine learning is being used to solve a ton of interesting problems, and to accomplish goals that were out of reach even a few short years ago.