Apple’s child abuse photo scanning: What it is and why people are worried
Aug 20, 2021
4 minutes
Michael Simon reports
Apple has announced that it will begin scanning photos uploaded to iCloud Photos for known child sexual abuse material (CSAM). The plan has drawn a great deal of scrutiny and generated some outrage, so here’s what you need to know about the new technology before it rolls out, initially in the US, later this year.
WHAT ARE THE TECHNOLOGIES APPLE IS ROLLING OUT?
Apple will be rolling out new anti-CSAM features in three areas: Messages, iCloud Photos, and Siri and Search. Here’s how each of them will be implemented, according to Apple.
The Messages app will use on-device machine learning to detect sexually explicit images sent to or received by child accounts, blurring the photo and warning the child before it is viewed.
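Apple hasn’t published the code behind its iCloud Photos detection, but the general idea it described, comparing image fingerprints against a database of known CSAM and flagging an account only after a threshold of matches, can be sketched roughly as follows. The hash function, database contents, and threshold below are illustrative placeholders, not Apple’s actual NeuralHash system:

```python
import hashlib

# Placeholder database of fingerprints of known flagged images.
# Apple's real system uses NeuralHash, a perceptual hash designed to
# survive resizing and re-encoding; a plain cryptographic hash is used
# here purely for illustration.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Apple described requiring a threshold of matches before an account is
# surfaced for human review; the value here is arbitrary.
MATCH_THRESHOLD = 2

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(uploads: list[bytes]) -> int:
    """Count how many uploaded images match the known-image database."""
    return sum(fingerprint(img) in KNOWN_FINGERPRINTS for img in uploads)

def should_flag(uploads: list[bytes]) -> bool:
    """Flag an account only once matches reach the threshold."""
    return count_matches(uploads) >= MATCH_THRESHOLD

uploads = [b"known-image-1", b"vacation-photo", b"known-image-2"]
print(should_flag(uploads))  # True: two matches meet the threshold
```

The threshold step is what Apple pointed to when arguing that a single false match can’t trigger a report on its own; only repeated matches surface an account for review.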