I Shouldn’t Have to Accept Being in Deepfake Porn
Recently, a Google Alert informed me that I am the subject of deepfake pornography. I wasn’t shocked. For more than a year, I have been the target of a widespread online harassment campaign, and deepfake porn—whose creators, using artificial intelligence, generate explicit video clips that seem to show real people in sexual situations that never actually occurred—has become a prized weapon in the arsenal misogynists use to try to drive women out of public life. The only emotion I felt as I informed my lawyers about the latest violation of my privacy was a profound disappointment in the technology—and in the lawmakers and regulators who have offered no justice to people who appear in porn clips without their consent.

Many commentators have been tying themselves in knots over the threats posed