
Rethinking Model Size: Train Large, Then Compress with Joseph Gonzalez - #378

From The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)



Length:
52 minutes
Released:
May 25, 2020
Format:
Podcast episode

Description

Today we’re joined by Joseph Gonzalez, Assistant Professor in the EECS department at UC Berkeley. Our main focus in the conversation is Joseph’s paper “Train Large, Then Compress: Rethinking Model Size for Efficient Training and Inference of Transformers,” which explores compute-efficient training strategies based on model size. We discuss the two main problems being solved: 1) How can we rapidly iterate on variations in architecture? And 2) If we make models bigger, does that really improve efficiency? We also discuss the parallels between computer vision and NLP tasks, and how he characterizes both “larger” and “faster” in the paper. Check out the complete show notes for this episode at twimlai.com/talk/378.

Titles in the series (100)

This Week in Machine Learning & AI is the most popular podcast of its kind. TWiML & AI caters to a highly targeted audience of machine learning and AI enthusiasts: data scientists, developers, founders, CTOs, engineers, architects, IT and product leaders, as well as tech-savvy business leaders. These creators, builders, makers, and influencers value TWiML as an authentic, trusted, and insightful guide to all that’s interesting and important in the world of machine learning and AI. Technologies covered include machine learning, artificial intelligence, deep learning, natural language processing, neural networks, analytics, and more.