
#036 - Max Welling: Quantum, Manifolds & Symmetries in ML

From Machine Learning Street Talk (MLST)

Length: 103 minutes
Released: Jan 3, 2021
Format: Podcast episode

Description

Today we had a fantastic conversation with Professor Max Welling, VP of Technology, Qualcomm Technologies Netherlands B.V. 

Max is a strong believer in the power of data and computation and their relevance to artificial intelligence. There is a fundamental blank-slate paradigm in machine learning: experience and data alone currently rule the roost. Max wants to build a house of domain knowledge on top of that blank slate. He argues that there are no predictions without assumptions and no generalization without inductive bias, and that the bias-variance tradeoff tells us we need to bring in additional human knowledge when data is insufficient.
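As a quick refresher (our gloss, not a quote from the episode): for squared-error regression on y = f(x) + \epsilon with noise variance \sigma^2, the expected error of a learned estimator \hat{f} decomposes as

E[(y - \hat{f}(x))^2] = (E[\hat{f}(x)] - f(x))^2 + E[(\hat{f}(x) - E[\hat{f}(x)])^2] + \sigma^2

that is, bias squared plus variance plus irreducible noise. Baking in priors typically adds bias but cuts variance, a trade that pays off precisely when data is scarce.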

Max Welling has pioneered many of the most sophisticated inductive priors in deep learning models developed in recent years, allowing us to use deep learning on non-Euclidean data, e.g. graphs and manifolds (a field now called "geometric deep learning"), and allowing network architectures to respect symmetries in the data, for example gauge or SE(3) equivariance. Max has also brought many other concepts from his physics playbook into ML, for example quantum and even Bayesian approaches.
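For context (our shorthand, not Max's wording): a map f is equivariant to a symmetry group G if transforming the input and then applying f gives the same result as applying f and then transforming the output,

f(\rho_{in}(g)\, x) = \rho_{out}(g)\, f(x) \quad \text{for all } g \in G

where \rho_{in} and \rho_{out} are representations of G on the input and output spaces. For SE(3) equivariance, G is the group of 3D rotations and translations, so rotating a molecule's coordinates rotates the network's vector outputs in lockstep; invariance is the special case where \rho_{out} is the identity.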

This is not an episode to miss; it might be our best yet!

Panel: Dr. Tim Scarfe, Yannic Kilcher, Alex Stenlake

00:00:00 Show introduction 
00:04:37 Protein folding from DeepMind -- did it use an SE(3) transformer?
00:09:58 How has machine learning progressed?
00:19:57 Quantum Deformed Neural Networks paper 
00:22:54 Probabilistic Numeric Convolutional Neural Networks paper
00:27:04 Ilia Karmanov from Qualcomm interview mini segment
00:32:04 Main Show Intro 
00:35:21 How is Max known in the community? 
00:36:35 How Max nurtures talent, freedom and relationship is key 
00:40:30 Selecting research directions and guidance 
00:43:42 Priors vs experience (bias/variance trade-off) 
00:48:47 Generative models and GPT-3 
00:51:57 Bias/variance trade off -- when do priors hurt us 
00:54:48 Capsule networks 
01:03:09 Which old ideas should we revive
01:04:36 Hardware lottery paper 
01:07:50 Greatness can't be planned (Kenneth Stanley reference) 
01:09:10 A new sort of peer review and originality 
01:11:57 Quantum Computing 
01:14:25 Quantum deformed neural networks paper 
01:21:57 Probabilistic numeric convolutional neural networks
01:26:35 Matrix exponential 
01:28:44 Other ideas from physics, e.g. chaos, holography, renormalisation
01:34:25 Reddit 
01:37:19 Open review system in ML 
01:41:43 Outro 

This is the audio podcast for the ML Street Talk YouTube channel at https://www.youtube.com/c/MachineLearningStreetTalk. Thanks for checking us out! We think that scientists and engineers are the heroes of our generation. Each week we have a hard-hitting discussion with the leading thinkers in the AI space. Street Talk is unabashedly technical and non-commercial, so you will hear no annoying pitches. Corporate- and MBA-speak is banned on Street Talk; phrases like "data product" and "digital transformation" will not be heard here, we promise :) Hosted by Dr. Tim Scarfe, Dr. Yannic Kilcher and Dr. Keith Duggar.