Multi-GPU training is hard (without PyTorch Lightning)
Length:
46 minutes
Released:
Jun 15, 2021
Format:
Podcast episode
Description
William Falcon wants AI practitioners to spend more time on model development and less time on engineering. PyTorch Lightning is a lightweight PyTorch wrapper for high-performance AI research that lets you train on multiple GPUs, TPUs, or CPUs, and even in 16-bit precision, without changing your code! In this episode, we dig deep into Lightning, how it works, and what it enables. William also discusses the Grid AI platform, built on top of PyTorch Lightning, which lets you seamlessly train hundreds of machine learning models in the cloud from your laptop.