States are leading the way in implementing deepfake porn laws

The Defiance Act: How legislation could help victims of nonconsensual deepfake porn

Vittoria Elliott: It’s actually really piecemeal, and that’s because on a fundamental level, we don’t have national regulation on this. The Defiance Act, which was introduced by Congresswoman Alexandria Ocasio-Cortez, is meant to help victims of nonconsensual deepfake porn, as long as they can show that the images or videos were made without their consent. Cruz has a bill that would allow people to force technology platforms to take down their images and videos. A lot of attention has been paid to the issue of young people using artificial intelligence to make explicit images and videos in order to bully peers, but there hasn’t been any major movement on these bills in several months. According to the data we have, generative artificial intelligence is still being used in politics, but it is mostly used to target and harass women.

Leah Feiger: So let’s start with porn, if that’s OK. You have an article out today about states trying to address the issue of porn created with artificial intelligence. Tell us about it. What are people doing with this?

What We Should Worry About With Deepfakes and Forged Images: A WIRED Analysis of the 2024 U.S. Defiance and Nonconsensual Editing Measures

Though there’s bipartisan support for many of these measures, federal legislation can take years to make it through both houses of Congress before being signed into law. State legislatures and local politicians can move faster.

The Disrupt Explicit Forged Images and Non-Consensual Edits Act, or Defiance Act, was introduced earlier this year by Democratic congresswoman Alexandria Ocasio-Cortez, who had herself been a victim of nonconsensual deepfakes. The bill would allow victims of deepfake pornography to bring a case showing that the deepfakes had been made without their consent. Cruz introduced a bill that would force platforms to remove both revenge porn and nonconsensual deepfake porn.

A few months ago, everyone was worried about how AI would impact the 2024 election. Some of that angst has dissipated, but political deepfakes, including pornographic images and videos, are still present. Today on the show, WIRED reporters Vittoria Elliott and Will Knight talk about what has changed with AI and what we should worry about.

Leah Feiger is @LeahFeiger. Vittoria Elliott is @telliotter. Will Knight is @willknight. Write to us at [email protected]. Be sure to subscribe to the WIRED Politics Lab newsletter here.

How to Listen: Subscribe to the WIRED Politics Lab podcast on your iPhone, iPad, or iPod Touch

You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:

If you’re on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts, and search for WIRED Politics Lab. We’re on Spotify too.

Leah Feiger: Hey, Tori. And from Cambridge, Massachusetts, senior writer Will Knight. Thank you for coming on, Will. It’s your first time here.
