Grok assumes users seeking images of underage girls have “good intent”
Chatbot makers must protect kids, NCMEC says

X has been very vocal about policing its platform for CSAM since Musk took over Twitter, but under former CEO Linda Yaccarino, the company adopted a broad protective stance against all image-based sexual abuse (IBSA). In 2024, X became one of the earliest corporations to voluntarily adopt the IBSA Principles that X now seems to be violating by failing to tweak Grok.

Those principles seek to combat all kinds of IBSA, recognizing that even fake images can “cause devastating psychological, financial, and reputational harm.” When it adopted the principles, X vowed to prevent the nonconsensual distribution of intimate images by providing easy-to-use reporting tools and quickly supporting the needs of victims desperate to block “the nonconsensual creation or distribution of intimate images” on its platform.

Kate Ruane, the director of the Center for Democracy and Technology's Free Expression Project, which helped form the working group behind the IBSA Principles, told Ars that although the commitments X made were "voluntary," they signaled that X agreed the problem was a "pressing issue the company should take seriously."

“They are on record saying that they will do these things, and they are not,” Ruane said.

As the Grok controversy sparks probes in Europe, India, and Malaysia, xAI may be forced to update Grok’s safety guidelines or make other tweaks to block the worst outputs.

In the US, xAI may face civil suits under federal or state laws that restrict intimate image abuse. If Grok’s harmful outputs continue into May, X could face penalties under the Take It Down Act, which authorizes the Federal Trade Commission to intervene if platforms don’t quickly remove both real and AI-generated non-consensual intimate imagery.

But whether US authorities will intervene any time soon remains unknown, as Musk is a close ally of the Trump administration. A spokesperson for the Justice Department told CNN that the department “takes AI-generated child sex abuse material extremely seriously and will aggressively prosecute any producer or possessor of CSAM.”

“Laws are only as good as their enforcement,” Ruane told Ars. “You need law enforcement at the Federal Trade Commission or at the Department of Justice to be willing to go after these companies if they are in violation of the laws.”

Child safety advocates appear alarmed by the sluggish response. "Technology companies have a responsibility to prevent their tools from being used to sexualize or exploit children," NCMEC's spokesperson told Ars. "As AI continues to advance, protecting children must remain a clear and nonnegotiable priority."
