How These Victims of Deepfake Pornography Found Their Harasser
From Big Take
Length:
34 minutes
Released:
Dec 4, 2023
Format:
Podcast episode
Description
This episode includes some disturbing descriptions of sexual acts and assault. If you have kids around, you might want to use headphones. And please take care when listening.
Artificial intelligence and “generative AI” tools – think ChatGPT or Stable Diffusion – have become ingrained in our daily lives as a way to make aspects of work and life easier when used for their intended purposes. But many of them are open-source and widely available, giving users free rein to alter publicly available photos of people – including images taken from social media – to depict events that never happened in real life.
These images are called “deepfakes” and increasingly, they’re being altered in sexually explicit ways and posted online without consent. While the photos are fake, the harm inflicted is real.
Bloomberg’s Olivia Carville and Margi Murphy join this episode to describe the fallout when deepfake creators use AI to alter images and videos. Despite the harm to victims, there is little legal recourse under US law.
Read more: No Laws Protect People From Deepfake Porn. These Victims Fought Back
Listen to The Big Take podcast every weekday and subscribe to our daily newsletter: https://bloom.bg/3F3EJAK Have questions or comments for Wes and the team? Reach us at bigtake@bloomberg.net. See omnystudio.com/listener for privacy information.