15 min listen
[MINI] The Vanishing Gradient
From Data Skeptic
Length:
15 minutes
Released:
Jun 30, 2017
Format:
Podcast episode
Description
This episode discusses the vanishing gradient: a problem that arises when training deep neural networks, in which nearly all of the gradients are very close to zero by the time back-propagation reaches the first hidden layer. This makes learning virtually impossible without a clever trick or improved methodology that helps the earlier layers begin to learn.
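The effect described above can be demonstrated numerically. The sketch below (not from the episode; all layer sizes and weight scales are illustrative assumptions) backpropagates through a deep stack of sigmoid layers and shows the gradient norm shrinking toward the first hidden layer, since each layer multiplies the signal by the sigmoid derivative, which is at most 0.25.

```python
import numpy as np

# Illustrative sketch: gradient norms shrink as backprop moves
# from the last layer toward the first in a deep sigmoid network.
rng = np.random.default_rng(0)
n_layers, width = 20, 32

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Modest random weights, as in classic pre-ReLU-era initialization.
weights = [rng.normal(0, 0.5, (width, width)) for _ in range(n_layers)]

# Forward pass, keeping each layer's activation for backprop.
a = rng.normal(0, 1, (width,))
activations = [a]
for W in weights:
    a = sigmoid(W @ a)
    activations.append(a)

# Backward pass: at every layer the error signal is scaled by the
# sigmoid derivative a * (1 - a), which is at most 0.25, so it decays.
delta = np.ones(width)  # stand-in for dLoss/d(output)
grad_norms = []
for W, a in zip(reversed(weights), reversed(activations[1:])):
    delta = W.T @ (delta * a * (1.0 - a))
    grad_norms.append(np.linalg.norm(delta))

print(f"gradient norm at last layer:  {grad_norms[0]:.3e}")
print(f"gradient norm at first layer: {grad_norms[-1]:.3e}")
```

By the time the signal reaches the first hidden layer it is orders of magnitude smaller than at the output, which is exactly why early layers barely learn without remedies such as better initializations, ReLU-style activations, or skip connections.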
Titles in the series (100)
[MINI] Selection Bias: A discussion about conducting US presidential election polls helps frame a conversation about selection bias. by Data Skeptic