Instead of amplifying human biases, can algorithms help fix them?
Alexa. Siri. The voice that responds when you say, “OK Google.” These virtual assistants rely on artificial intelligence. They are increasingly ubiquitous, and they are female.
So far no Martin or Harry or Alexander. Ever wonder why?
Kate Devlin, a technology expert and senior lecturer at King’s College London, says it may stem from biases that lurk deep in human thought, perhaps even unnoticed. She recounts how, when she asked the developer of one digital assistant why he chose a female voice, his answer was, “I didn’t really think about it.”
Sharing that anecdote at the recent World Summit AI in Amsterdam, Dr. Devlin wasn’t alone in focusing on the link between gender bias and the fast-growing realm of artificial intelligence. In fact, one of the hottest questions surrounding the technology is whether it will amplify human biases or help fix them.