INSIDE FACEBOOK’S AFRICAN SWEATSHOP
IN A DRAB OFFICE building near a slum on the outskirts of Nairobi, nearly 200 young men and women from countries across Africa sit glued to computer monitors, where they must watch videos of murders, rapes, suicides, and child sexual abuse.
These young Africans work for Sama, which calls itself an “ethical AI” outsourcing company and is headquartered in California.
Sama says its mission is to provide people in places like Nairobi with “dignified digital work.” Its executives can often be heard saying that the best way to help poor countries is to “give work, not aid.” Sama claims to have helped lift more than 50,000 people in the developing world out of poverty.
This benevolent public image has won Sama data-labeling contracts with some of the largest companies in the world, including Google, Microsoft, and Walmart. What the company doesn’t make public on its website is its relationship with its client Facebook.
In Nairobi, Sama employees, who speak at least 11 African languages among them, toil day and night as Facebook content moderators: the emergency first responders of social media. They perform the brutal task of viewing and removing illegal or banned content from Facebook before it is seen by the average user.
Since 2019, this Nairobi office block has been the epicenter of Facebook’s content-moderation operation for the whole of sub-Saharan Africa. Its remit includes Ethiopia, where Facebook is trying to prevent content on its platform from contributing to incitement to violence in an escalating civil war.
Despite their importance to Facebook, the workers in this Nairobi office are among the lowest-paid workers for the platform anywhere in the world, with some of them taking home as little as $1.50 per hour, a TIME investigation found. The testimonies of Sama employees reveal a workplace culture characterized by mental trauma, intimidation, and alleged suppression of the right to unionize. The revelations raise serious questions about whether Facebook—which periodically sends its own employees to Nairobi to monitor Sama’s operations—is exploiting the very people upon whom it is depending to ensure its platform is safe in Ethiopia and across the continent. And just as Facebook needs them most, content moderators at Sama are leaving the company in droves because of poor pay and working conditions, with six Ethiopian employees resigning in a single week in January.