
#032- Simon Kornblith / GoogleAI - SimCLR and Paper Haul!

From Machine Learning Street Talk (MLST)

Length: 90 minutes
Released: Dec 6, 2020
Format: Podcast episode

Description

This week Dr. Tim Scarfe, Sayak Paul and Yannic Kilcher speak with Dr. Simon Kornblith from Google Brain (Ph.D. from MIT). Simon is trying to understand how neural nets do what they do. He was the second author on the seminal Google AI SimCLR paper. We also cover "Do Wide and Deep Networks Learn the Same Things?", "What's in a Loss Function for Image Classification?", and "Big Self-Supervised Models are Strong Semi-Supervised Learners". Simon used to be a neuroscientist and also tells us the story of his unique journey into ML.

00:00:00 Show teaser (the "short version")
00:18:34 Show intro
00:22:11 Relationship between neuroscience and machine learning
00:29:28 Similarity analysis and evolution of representations in Neural Networks
00:39:55 Expressivity of NNs
00:42:33 What's in a loss function for image classification
00:46:52 Loss function implications for transfer learning
00:50:44 SimCLR paper 
01:00:19 Contrast SimCLR to BYOL
01:01:43 Data augmentation
01:06:35 Universality of image representations
01:09:25 Universality of augmentations
01:23:04 GPT-3
01:25:09 GANs for data augmentation?
01:26:50 Julia language

@skornblith
https://www.linkedin.com/in/simon-kornblith-54b2033a/

https://arxiv.org/abs/2010.15327
Do Wide and Deep Networks Learn the Same Things? Uncovering How Neural Network Representations Vary with Width and Depth

https://arxiv.org/abs/2010.16402
What's in a Loss Function for Image Classification?

https://arxiv.org/abs/2002.05709
A Simple Framework for Contrastive Learning of Visual Representations

https://arxiv.org/abs/2006.10029
Big Self-Supervised Models are Strong Semi-Supervised Learners

About the series

This is the audio podcast for the ML Street Talk YouTube channel at https://www.youtube.com/c/MachineLearningStreetTalk. Thanks for checking us out! We think that scientists and engineers are the heroes of our generation. Each week we have a hard-hitting discussion with the leading thinkers in the AI space. Street Talk is unabashedly technical and non-commercial, so you will hear no annoying pitches. Corporate- and MBA-speak is banned on Street Talk; "data product" and "digital transformation" are banned too, we promise :) Hosted by Dr. Tim Scarfe, Dr. Yannic Kilcher and Dr. Keith Duggar.