When Bias Is Coded Into Our Technology
Facial recognition systems from large tech companies often incorrectly classify black women as male — including the likes of Michelle Obama, Serena Williams and Sojourner Truth. That's according to Joy Buolamwini, whose research caught wide attention in 2018 with "AI, Ain't I a Woman?" a spoken-word piece based on her findings at MIT Media Lab.
The video, along with the accompanying research paper written with Timnit Gebru of Microsoft Research, prompted many tech companies to reassess how well their facial recognition data sets and algorithms performed on darker-skinned and female faces.
"Coded Bias," a documentary directed by Shalini Kantayya, follows Buolamwini and the Algorithmic Justice League, an advocacy organization she founded, along with other examples of facial recognition software being rolled out around the world: on the streets of London, in housing projects in Brooklyn and broadly across China.