Noelle Martin was 18 years old when she first watched herself performing hard-core porn. Sitting idly in front of her laptop one night, the then law student decided to do a reverse image search of herself on Google. She was expecting to unearth an archaic MySpace profile or some old, cringe-worthy social media posts. Instead, Martin was met with hundreds of explicit images and videos of herself, including graphic scenes of her engaging in intercourse and oral sex. Except it wasn’t her.
Martin is one of the hundreds of thousands of women who have been targeted by non-consensual sexual deepfakes, meaning images of her face were stolen and digitally mapped onto someone else’s body using AI technology. “I felt sick,” recalls Martin, now an award-winning activist and a researcher at the UWA Tech and Policy Lab. “I was an absolute nobody. I didn’t know what the term was to describe what was happening to me. I’d never heard of image-based sexual abuse. My mind was racing with questions: ‘Were the videos made by someone that I knew? Were they going to come after me? How is this going to affect my job? Should I tell my family?’ It’s a shocking