Grok assumes users seeking images of underage girls have “good intent”



Chatbot makers must protect kids, NCMEC says

X has been very vocal about policing its platform for CSAM since Musk took over Twitter, but under former CEO Linda Yaccarino, the company adopted a broad protective stance against all image-based sexual abuse (IBSA). In 2024, X became one of the earliest corporations to voluntarily adopt the IBSA Principles that X now seems to be violating by failing to tweak Grok.

Those principles seek to combat all kinds of IBSA, recognizing that even fake images can “cause devastating psychological, financial, and reputational harm.” When it adopted the principles, X vowed to prevent the nonconsensual distribution of intimate images by providing easy-to-use reporting tools and quickly supporting the needs of victims desperate to block “the nonconsensual creation or distribution of intimate images” on its platform.

Kate Ruane, the director of the Center for Democracy and Technology's Free Expression Project, which helped form the working group behind the IBSA Principles, told Ars that although the commitments X made were “voluntary,” they signaled that X agreed the problem was a “pressing issue the company should take seriously.”

“They are on record saying that they will do these things, and they are not,” Ruane said.

As the Grok controversy sparks probes in Europe, India, and Malaysia, xAI may be forced to update Grok’s safety guidelines or make other tweaks to block the worst outputs.

In the US, xAI may face civil suits under federal or state laws that restrict intimate image abuse. If Grok’s harmful outputs continue into May, X could face penalties under the Take It Down Act, which authorizes the Federal Trade Commission to intervene if platforms don’t quickly remove both real and AI-generated non-consensual intimate imagery.

But whether US authorities will intervene any time soon remains unknown, as Musk is a close ally of the Trump administration. A spokesperson for the Justice Department told CNN that the department “takes AI-generated child sex abuse material extremely seriously and will aggressively prosecute any producer or possessor of CSAM.”

“Laws are only as good as their enforcement,” Ruane told Ars. “You need law enforcement at the Federal Trade Commission or at the Department of Justice to be willing to go after these companies if they are in violation of the laws.”

Child safety advocates seem alarmed by the sluggish response. “Technology companies have a responsibility to prevent their tools from being used to sexualize or exploit children,” NCMEC’s spokesperson told Ars. “As AI continues to advance, protecting children must remain a clear and nonnegotiable priority.”
