
LM101-030: How to Improve Deep Learning Performance with Artificial Brain Damage (Dropout and Model Averaging)

From Learning Machines 101

Length: 32 minutes
Released: Jun 8, 2015
Format: Podcast episode

Description

Deep learning technology has rapidly developed over the past five years due in part to a variety of factors, such as improved computing technology, convolutional network algorithms, rectified linear units, and a relatively new learning strategy called "dropout," in which hidden unit feature detectors are temporarily deleted during the learning process. This episode introduces "dropout" as a way to improve deep learning performance and connects it to the concepts of regularization and model averaging. For more details and background references, check out www.learningmachines101.com!
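As an illustration of the idea described above (not code from the episode), here is a minimal NumPy sketch of "inverted" dropout applied to a layer's hidden-unit activations: during training each unit is temporarily deleted with probability 1 - keep_prob, and the surviving activations are rescaled so that using the full network at test time approximates averaging over the many randomly thinned networks sampled during training. The function name dropout_forward and the keep_prob parameter are illustrative choices, not something defined in the episode.

```python
import numpy as np

def dropout_forward(activations, keep_prob=0.5, training=True, rng=None):
    """Apply (inverted) dropout to a layer's hidden-unit activations.

    During training, each hidden unit is temporarily deleted (zeroed)
    with probability 1 - keep_prob; surviving units are rescaled by
    1 / keep_prob so the expected activation is unchanged. At test time
    all units are kept, which approximates averaging over the ensemble
    of randomly "thinned" networks seen during training.
    """
    if not training:
        return activations  # use the full network at test time
    rng = rng or np.random.default_rng()
    mask = rng.random(activations.shape) < keep_prob  # True = keep, False = drop
    return activations * mask / keep_prob

# Example: dropout on a batch of 4 examples with 6 hidden units
hidden = np.random.default_rng(0).standard_normal((4, 6))
print(dropout_forward(hidden, keep_prob=0.5, training=True))   # some units zeroed
print(dropout_forward(hidden, training=False))                 # unchanged at test time
```

Because all the thinned networks share the same weights, keeping every unit at test time acts as an inexpensive form of model averaging, which is also why dropout behaves like a regularizer.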
 

Titles in the series (85)

Smart machines based upon the principles of artificial intelligence and machine learning are now prevalent in our everyday life. For example, artificially intelligent systems recognize our voices, sort our pictures, make purchasing suggestions, and can automatically fly planes and drive cars. In this podcast series, we examine questions such as: How do these devices work? Where do they come from? And how can we make them even smarter and more human-like? These are the questions that are addressed in the podcast series Learning Machines 101.