Elon Musk’s Grok ‘Undressing’ Problem Isn’t Fixed


Elon Musk’s X has introduced new restrictions stopping people from editing and generating images of real people in bikinis or other “revealing clothing.” The change in policy on Wednesday night follows global outrage at Grok being used to generate thousands of harmful nonconsensual “undressing” photos of women and sexualized images of apparent minors on X.

However, while it appears that some safety measures have finally been introduced to Grok’s image generation on X, the stand-alone Grok app and website seem to still be able to generate “undress”-style images and pornographic content, according to multiple tests by researchers, WIRED, and other journalists. Other users, meanwhile, say they’re no longer able to create images and videos as they once could.

“We can still generate photorealistic nudity on Grok.com,” says Paul Bouchaud, the lead researcher at Paris-based nonprofit AI Forensics, who has been tracking the use of Grok to create sexualized images and ran multiple tests on Grok outside of X. “We can generate nudity in ways that Grok on X cannot.”

“I could upload an image on Grok Imagine and ask to put the person in a bikini, and it works,” says Bouchaud, who tested the system on an image of a person appearing to be a woman. Tests by WIRED, using free Grok accounts on its website in both the UK and US, successfully removed clothing from two images of men without any apparent restrictions. On the Grok app in the UK, when a WIRED reporter asked it to undress a man, the app prompted them to enter their year of birth before generating the image.

Meanwhile, other journalists at The Verge and investigative outlet Bellingcat also found it was possible to create sexualized images from within the UK, which is investigating Grok and X and has strongly condemned the platforms for allowing users to create the “undress” images.

Since the start of the year, Musk’s businesses—including artificial intelligence firm xAI, X, and Grok—have all come under fire for the creation of nonconsensual intimate imagery, explicit and graphic sexual videos, and sexualized imagery of apparent minors. Officials in the United States, Australia, Brazil, Canada, the European Commission, France, India, Indonesia, Ireland, Malaysia, and the UK have all condemned or launched investigations into X or Grok.

On Wednesday, X’s Safety account posted updates on how Grok can be used on the social media website. “We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis,” the account posted, adding that the rules apply to all users, including both free and paid subscribers.

In a section titled “Geoblock update,” the X account also claimed: “We now geoblock the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in those jurisdictions where it’s illegal.” The company’s update also added that it is working to add additional safeguards and that it continues to “remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity.”

Spokespeople for xAI, which creates Grok, did not immediately reply to WIRED’s request for comment. Meanwhile, an X spokesperson says they understand the geolocation block to apply to both its app and website.

The latest move follows a widely criticized shift on January 9, when X limited image generation using Grok to paid “verified” subscribers. A leading women’s group described the act as the “monetization of abuse.” Bouchaud, who says that AI Forensics has gathered around 90,000 total Grok images since the Christmas holidays, confirms that only verified accounts have been able to generate images on X—as opposed to the Grok website or app—since January 9 and that bikini images of women are rarely generated now. “We do observe that they appear to have pulled the plug on it and disabled the functionality on X,” he says.
