Elon Musk’s Grok Still Has Issues with ‘Undressing’ Feature

Elon Musk’s X has implemented new restrictions preventing users from editing or generating images of real people in bikinis or other “revealing clothing.” The policy update, announced on Wednesday night, follows widespread backlash over Grok being misused to generate harmful nonconsensual “undressing” photos of women and sexualized images of suspected minors on X.
Despite the introduction of some protective measures for Grok’s image generation on X, the separate Grok application and website still permit the creation of “undress”-style images and pornographic content, according to tests conducted by researchers, WIRED, and other media outlets. Some users, however, report they can no longer produce images and videos as freely as before.
“We can still create photorealistic nudity on Grok.com,” says Paul Bouchaud, the lead researcher at the Paris-based nonprofit AI Forensics, who monitors Grok for sexualized imagery and has run multiple tests outside of X. “We can produce nudity in ways that Grok on X cannot.”
“I could upload an image on Grok Imagine and request to put the person in a bikini, and it works,” says a researcher who tested the system using an image of a woman. WIRED’s own tests, using free Grok accounts in the UK and US, successfully removed clothing from two images of men without visible restrictions. In the UK version of the Grok app, when a WIRED reporter attempted to undress a male figure, the app asked for the user’s year of birth before generating the image.
Journalists at The Verge and the investigative outlet Bellingcat also found that sexualized images could still be created while operating from the UK, where regulators are actively investigating Grok and X and have condemned the platforms for permitting the creation of “undress” images.
Since early this year, Musk’s companies, including artificial intelligence firm xAI and X, have faced criticism over Grok’s generation of nonconsensual intimate imagery, explicit and graphic sexual videos, and sexualized images involving suspected minors. Officials from the United States, Australia, Brazil, Canada, the European Commission, France, India, Indonesia, Ireland, Malaysia, and the UK have condemned these outputs or opened investigations into X or Grok.
On Wednesday, X’s Safety account posted updates about Grok’s usage on the platform. “We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis,” the account announced, emphasizing that the rules apply to all users, whether free or paid subscribers.
In a section titled “Geoblock update,” the X account also stated: “We now geoblock the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in jurisdictions where it’s illegal.” The company’s update indicated plans to introduce additional safeguards and its ongoing efforts to “remove high-priority violative content, including Child Sexual Abuse Material (CSAM) and non-consensual nudity.”
Representatives for xAI, the creator of Grok, did not immediately respond to WIRED’s request for comment. An X spokesperson confirmed that the geolocation block applies to both its app and website.
The action follows a controversial change on January 9, when X restricted image generation with Grok to paid “verified” subscribers, a move one prominent women’s organization criticized as the “monetization of abuse.” Bouchaud, whose team at AI Forensics has collected roughly 90,000 Grok images since the Christmas holidays, confirms that since January 9 only verified accounts have been able to create images on X, unlike on the Grok website or app, and notes that images of women in bikinis are now rarely generated. “It seems they have indeed disabled that functionality on X,” he says.
