
The terabrain is near, with Simon Thorpe

From London Futurists


Length: 32 minutes
Released: Oct 19, 2022
Format: Podcast episode

Description

Why do human brains consume much less power than artificial neural networks? Simon Thorpe, Research Director of CNRS, explains his view that the key to artificial general intelligence is a "terabrain" that copies the human brain's sparse-firing networks of spiking neurons.

00.11 Recapping "the AI paradox"
00.28 The nervousness of CTOs regarding AI
00.43 Introducing Simon
01.43 45 years since Oxford, working out how the brain does amazing things
02.45 Brain visual perception as feed-forward vs. feedback
03.40 The ideas behind the system that performed so well in the 2012 ImageNet challenge
04.20 The role of prompts to alter perception
05.30 Drawbacks of human perceptual expectations
06.05 The video of a gorilla on the basketball court
06.50 Conjuring tricks and distractions
07.10 Energy consumption: human neurons vs. artificial neurons
07.26 The standard model would need 500 petaflops
08.40 Exaflop computing has just arrived
08.50 30 MW vs. 20 W (less than a lightbulb)
09.34 Companies working on low-power computing systems
09.48 Power requirements for edge computing
10.10 The need for 86,000 neuromorphic chips?
10.25 Dense activation of neurons vs. sparse activation
10.58 Real brains are event driven
11.16 Real neurons send spikes, not floating point numbers
11.55 SpikeNET by Arnaud Delorme
12.50 Why are sparse networks studied so little?
14.40 A recent debate with Yann LeCun of Facebook and Bill Dally of Nvidia
15.40 One spike can contain many bits of information
16.24 Revisiting an experiment with eels from 1927 (Lord Edgar Adrian)
17.06 Biology just needs one spike
17.50 Chips moved from floating point to fixed point
19.25 Other mentions of sparse systems - MoE (Mixture of Experts)
19.50 Sparse systems are easier to interpret
20.30 Advocacy for "grandmother cells"
21.23 Chicks that imprinted on yellow boots
22.35 A semantic web in the 1960s
22.50 The Mozart cell
23.02 An expert system implemented in a neural network with spiking neurons
23.14 Power consumption reduced by a factor of one million
23.40 Experimental progress
23.53 Dedicated silicon: Spikenet Technology, acquired by BrainChip
24.18 The Terabrain Project, using standard off-the-shelf hardware
24.40 Impressive recent simulations on GPUs and on a MacBook Pro
26.26 A homegrown learning rule
26.44 Experiments with "frozen noise"
27.28 Anticipating emulating an entire human brain on a Mac Studio M1 Ultra
28.25 The likely impact of these ideas
29.00 This software will be given away
29.17 Anticipating "local learning" without the results being sent to Big Tech
30.40 GPT-3 could run on your phone next year
31.12 Our interview next year might be, not with Simon, but with his Terabrain
31.22 Our phones know us better than our spouses do

Simon's academic page: https://cerco.cnrs.fr/page-perso-simon-thorpe/
Simon's personal blog: https://simonthorpesideas.blogspot.com/

Audio engineering by Alexander Chace.
Music: Spike Protein, by Koi Discovery, available under CC0 1.0 Public Domain Declaration
