
Multi-Armed Bandits

From Linear Digressions

Length: 11 minutes
Released: Mar 7, 2016
Format: Podcast episode

Description

Multi-armed bandits: how to take your randomized experiment and make it harder better faster stronger. Basically, a multi-armed bandit experiment allows you to optimize for both learning and making use of your knowledge at the same time. It's what the pros (like Google Analytics) use, and it's got a great name, so... winner!

Relevant link: https://support.google.com/analytics/answer/2844870?hl=en
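The core trade-off described above, learning which option is best while still making use of what you already know, can be sketched with an epsilon-greedy strategy, one of the simplest bandit algorithms (the linked Google Analytics docs describe a more sophisticated Bayesian variant). This is a minimal illustrative simulation; the arm success rates and parameters are made up for the example.

```python
import random

def epsilon_greedy_bandit(true_rates, steps=10_000, epsilon=0.1, seed=0):
    """Run an epsilon-greedy multi-armed bandit over Bernoulli arms.

    true_rates: hypothetical per-arm success probabilities, unknown to
    the agent; it must estimate them from observed rewards.
    Returns (estimated rates, pull counts) per arm.
    """
    rng = random.Random(seed)
    n_arms = len(true_rates)
    counts = [0] * n_arms        # how many times each arm was pulled
    estimates = [0.0] * n_arms   # running mean reward per arm
    for _ in range(steps):
        # Explore a random arm with probability epsilon,
        # otherwise exploit the arm with the best current estimate.
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)
        else:
            arm = max(range(n_arms), key=lambda a: estimates[a])
        reward = 1.0 if rng.random() < true_rates[arm] else 0.0
        counts[arm] += 1
        # Incremental update of the running mean for this arm.
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts

# Three hypothetical variants with conversion rates of 5%, 10%, and 20%:
estimates, counts = epsilon_greedy_bandit([0.05, 0.10, 0.20])
```

Unlike a classic A/B test, which splits traffic evenly until the experiment ends, the bandit shifts most pulls toward the best-performing arm as evidence accumulates, so less traffic is "wasted" on losing variants during the experiment itself.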

About the series

Linear Digressions is a podcast about machine learning and data science. Machine learning is being used to solve a ton of interesting problems, and to accomplish goals that were out of reach even a few short years ago.