Senate passes a bill that would let nonconsensual deepfake victims sue


The Senate passed a bill that could give people whose likenesses have been deepfaked into sexually explicit images without their consent a new way to fight back.

The Disrupt Explicit Forged Images and Non-Consensual Edits Act (DEFIANCE Act) would let victims sue the individuals who created the images for civil damages. The bill passed with unanimous consent — meaning there was no roll-call vote, and no senator objected to its passage on the floor Tuesday. It’s meant to build on the work of the Take It Down Act, a law that criminalizes the distribution of nonconsensual intimate images (NCII) and requires social media platforms to promptly remove them.

Senate Democratic Whip Dick Durbin (D-IL), a lead sponsor of the bill, referenced Grok’s nonconsensual undressing in remarks on the Senate floor. “Even after these terrible deepfake, harming images are pointed out to Grok and to X, formerly Twitter, they do not respond. They don’t take the images off of the internet. They don’t come to the rescue of people who are victims,” Durbin said. Though the Take It Down Act, whose takedown provision goes into full force later this year, could have implications for X, the DEFIANCE Act would impact individuals, like those Grok users creating deepfaked nonconsensual intimate imagery.

Governments around the world are creating new protections against AI-generated nonconsensual images, spurred in part by the recent Grok controversy. The UK, for example, recently moved up the start date of a law that criminalizes the creation of nonconsensual intimate deepfakes.

The DEFIANCE Act similarly passed the Senate in 2024 following a different nonconsensual deepfake scandal on X. Early that year, sexually explicit AI-generated images of Taylor Swift circulated on the platform. Durbin along with Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO) introduced the bill to expand on a provision in the Violence Against Women Act Reauthorization Act of 2022, which gave people whose non-AI generated intimate images were shared without consent a right to sue. Rep. Alexandria Ocasio-Cortez (D-NY), who has found her own image digitally altered in nonconsensual intimate deepfakes, sponsored the bill in the House. The bill stalled in the House without a vote during the last Congress, requiring the Senate to take it up again this year. Now the ball is again in the House leadership’s court; if they decide to bring the bill to the floor, it will have to pass in order to reach the president’s desk.
