Facebook Doesn't Understand Itself
Facebook’s 2 billion users post a steady stream of baby pictures, opinions about romantic comedies, reactions to the news—and disturbing depictions of violence, abuse, and self-harm. Over the last decade, the company has struggled to come to terms with moderating that last category. How do they parse a joke from a threat, art from pornography, a cry for help from a serious suicide attempt? And even if they can correctly categorize disturbing posts with thousands of human contractors sifting through user-flagged content, what should they do about it?
This weekend, The Guardian began publishing stories based on 100 documents leaked to them from the training process that these content moderators go through. Facebook neither confirmed nor denied the authenticity of the documents, but given The Guardian’s history of reporting from leaks, we proceed here with the assumption that the documents are real training materials used by at least one of Facebook’s content moderation contractors.