[MINI] Dropout
From Data Skeptic
Length:
16 minutes
Released:
Jan 13, 2017
Format:
Podcast episode
Description
Deep learning models are prone to overfitting a given problem. This is especially frustrating given how much time and computation are often required to converge. One technique for fighting overfitting is dropout: on each training iteration, a random subset of neurons in the network is set to zero. The core idea is that any particular input to a layer is not always available, so the network cannot rely too heavily on any single signal.
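The episode does not include code, but the technique it describes can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration of "inverted" dropout, where surviving activations are rescaled during training so that no change is needed at inference time; the function name and parameters are my own.

```python
import numpy as np

def dropout(activations, p_drop=0.5, training=True, seed=None):
    """Inverted dropout: zero each unit with probability p_drop during
    training, scaling survivors by 1/(1 - p_drop) so the expected
    activation is unchanged. At inference time, return inputs as-is."""
    if not training or p_drop == 0.0:
        return activations
    rng = np.random.default_rng(seed)
    mask = rng.random(activations.shape) >= p_drop  # True = keep this unit
    return activations * mask / (1.0 - p_drop)

# Each forward pass during training zeroes a fresh random subset of units,
# so downstream layers cannot depend on any one input always being present.
x = np.ones((4, 8))
y = dropout(x, p_drop=0.5, training=True, seed=0)
```

Because each kept unit is scaled by 1/(1 - p_drop), the layer's expected output matches the no-dropout case, which is why the network can be used unmodified at test time.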
Titles in the series (100)
Introduction: The Data Skeptic Podcast features conversations on topics related to data science, statistics, machine learning, artificial intelligence and the like, all from the perspective of applying critical thinking and the scientific method to evaluate the... by Data Skeptic