
Building a Database of CSAM for AOL, One Image at a Time

From Community Signal


Length:
58 minutes
Released:
Oct 18, 2021
Format:
Podcast episode

Description

If you work in content moderation or with a team that specializes in it, then you know that the fight against child sexual abuse material (CSAM) is a challenging one. The New York Times reported that in 2018, technology companies reported a record 45 million online photos and videos of child sexual abuse. Ralph Spencer, our guest for this episode, has spent more than 20 years working to make online spaces safer and to combat CSAM, including as a technical investigator at AOL.

Ralph describes how, when he first started at AOL in the mid-’90s, the work of finding and reviewing CSAM was largely manual: his team depended on community reports, and all of the content was reviewed by hand. Eventually, this manual review led to the creation of AOL’s Image Detection Filtering Process (IDFP), which reduced the need to manually review the actual content of CSAM. Ralph shares how, working with the National Center for Missing and Exploited Children (NCMEC), law enforcement, and a coalition of other companies, he saw his own team’s work evolve, what he considered his metrics of success for this work, and the challenges that he sees for today’s platforms.

The tools, vocabulary, and affordances for professionals working to make the internet safer have all improved greatly, but in this episode, Patrick and Ralph discuss the areas that need continued improvement, including Section 230 and what considerations should be made if it were to be amended. Ralph explains that when he worked at AOL, the service surpassed six million users. As of last year, Facebook had 2.8 billion monthly active users. With a user base that large and a monopoly on how so many people communicate, what will the future hold for keeping the children, workers, and others who use such platforms safe?

Ralph and Patrick also discuss:

- Ralph’s history fighting CSAM at AOL, both manually and with detection tools
- Apple’s announcement that it would scan iCloud photos for NCMEC database matches
- How Ralph and other professionals dealing with CSAM protect their own health and well-being
- Why Facebook is calling for new or revised internet laws to govern its own platform

Our Podcast is Made Possible By…

If you enjoy our show, please know that it’s only possible with the generous support of our sponsor: Vanilla, a one-stop shop for online community.

Big Quotes

How Ralph fell into trust and safety work (20:23): “[Living in the same apartment building as a little girl who was abused] was a motivational factor [in doing trust and safety work]. I felt it was a situation where, while I did basically all I could in that situation, I [also] didn’t do enough. When this [job] came along … I saw it as an opportunity. If I couldn’t make the situation that I was dealing with in real life correct, then maybe I can do something to make a situation for one of these kids in these [CSAM] pictures a little bit better.” –Ralph Spencer

Coping with having to routinely view CSAM (21:07): “I developed a way of dealing with [having to view CSAM]. I’d leave work and try not to think about it. When we were still doing this as a team … everybody at AOL generally got 45 minutes to an hour for lunch. We’d take two-hour lunches, go out, walk around. We did team days before people really started doing them. We went downtown in DC one day and went to the art gallery. The logic for that was like, you see ugly stuff every day, let’s go look at some stuff that has cultural value or has some beauty to it, and we’ll stop and have lunch at a nice restaurant.” –Ralph Spencer

How organizations work with NCMEC and law enforcement to report CSAM (28:32): “[When our filtering tech] catches something that it sees in the [CSAM] database, it packages a report which includes the image, the email that the image was attached to, and a very small amount of identifying information. The report is then automatically sent to [the National Center for Missing and Exploited Children]. NCMEC looks at it, …
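The reporting flow Ralph describes in that last quote can be sketched roughly in code. Below is a minimal, hypothetical Python illustration of the general pattern (match an attachment against a database of known image hashes, then package a small report), assuming invented names such as KNOWN_CSAM_HASHES, Report, and check_attachment; it is not AOL’s IDFP, NCMEC’s actual intake API, or any real detection product.

```python
# Hypothetical sketch of the hash-match-and-report pattern described above.
# Not AOL's IDFP or NCMEC's real intake API; all names here are invented.

import hashlib
from dataclasses import dataclass
from typing import Optional, Set

# Hypothetical set of hashes of previously identified images.
KNOWN_CSAM_HASHES: Set[str] = set()


@dataclass
class Report:
    image_hash: str     # stand-in for the flagged image itself
    message_id: str     # the email the image was attached to
    sender_handle: str  # a very small amount of identifying information


def check_attachment(image_bytes: bytes, message_id: str,
                     sender_handle: str) -> Optional[Report]:
    """Hash an attachment and, on a database match, package a report
    that would then be forwarded on automatically."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_CSAM_HASHES:
        return Report(digest, message_id, sender_handle)
    return None  # no match, nothing to report
```

In practice, systems of this kind typically rely on perceptual hashes (for example, Microsoft’s PhotoDNA) rather than an exact cryptographic hash like the one above, so that re-encoded or slightly altered copies of a known image still match.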

Titles in the series (100)

Community Signal is a podcast for experienced online community professionals, including those working in audience engagement, association management, developer relations, moderation, trust and safety, and more. It's released every two weeks and hosted by industry veteran Patrick O’Keefe. This is a very community-focused program. There are plenty of social media and marketing podcasts out there. That’s not what this is. Social media is a set of tools. Community is a strategy you apply to those tools. Marketing brings new customers. Community helps you keep them.