Fast Company

Google on the Brain

The human brain is a funny thing. Certain memories can stick with us forever: the birth of a child, a car crash, an election day. But we only store some details—the color of the hospital delivery room or the smell of the polling station—while others fade, such as the face of the nurse when that child was born, or what we were wearing during that accident. For Google CEO Sundar Pichai, the day he watched AI rise out of a lab is one he’ll remember forever.

“This was 2012, in a room with a small team, and there were just a few of us,” he tells me. An engineer named Jeff Dean, a legendary programmer at Google who helped build its search engine, had been working on a new project and wanted Pichai to have a look. “Anytime Jeff wants to update you on something, you just get excited by it,” he says.

Pichai doesn’t recall exactly which building he was in when Dean presented his work, though odd details of that day have stuck with him. He remembers standing, rather than sitting, and someone joking about an HR snafu that had designated the newly hired Geoffrey Hinton—the “Father of Deep Learning,” an AI researcher for four decades, and, later, a Turing Award winner—as an intern.

The future CEO of Google was an SVP at the time, running Chrome and Apps, and he hadn’t been thinking about AI. No one at Google was, really, not in a significant way. Yes, Google cofounders Larry Page and Sergey Brin had stated publicly 12 years prior that artificial intelligence would transform the company: “The ideal search engine is smart,” Page told Online magazine in May 2000. “It has to understand your query, and it has to understand all the documents, and that’s clearly AI.” But at Google and elsewhere, machine learning had been delivering meager results for decades, despite grand promises.

Now, though, powerful forces were stirring inside Google’s servers. For a little more than a year, Dean, Andrew Ng, and their colleagues had been building a massive network of interconnected computers, linked together in ways modeled on the human brain. The team had wired together 16,000 processors across 1,000 computers, which together could form 1 billion connections. This was unprecedented for a computer system, though still far from a human brain’s capacity of more than 100 trillion connections.

To test how this massive neural net processed data, the engineers had run a deceptively simple experiment. For three days straight, they had fed the machine a diet
