“The creation of undressed or sexualised images without consent is degrading, abusive and it is not a victimless crime,” said Bella Wallersteiner
January 7, 2026 15:06
Ofcom has launched an investigation after a descendant of Holocaust survivors was “digitally undressed” by X's Grok AI, which created an image of her in a bikini outside the Auschwitz death camp.
Bella Wallersteiner, a public affairs executive, whose ancestors survived the Shoah, is the latest user to fall victim to a trend of online trolls prompting the bot to generate sexualised images of women based on fully-clothed images they have posted online.
She said: “The creation of undressed or sexualised images without consent is degrading, abusive and it is not a victimless crime. It leaves you feeling exposed, powerless and unsafe, and the harm does not simply disappear once the images are removed.”
Confirming that Ofcom had informed her it would investigate, Wallersteiner called for reform of the rules regulating the use of AI on social media.
“Ofcom’s intervention is both necessary and long overdue. Robust, enforceable safeguards must now be put in place to prevent this kind of abuse from happening again,” she said.
“Without decisive action, there is a real risk that this technology will normalise sexual exploitation and digital abuse, shaping an online world in which girls and women are expected to tolerate harm as the price of participation.”
Discussing the case with The Times, Jessaline Caine, another victim of the trend, warned others on X of the potential dangers of Grok.
“A lot of people disagreed with me, they thought AI should not be limited whatsoever. When I responded back to an argument, someone said, ‘hey Grok, put her in a string bikini’,” she said.
“It was totally dehumanising because I’d given them an argument back and they didn’t even say anything, they just put me in a bikini to humiliate me.”
This prompted her to test whether the system had any limits by asking it in a private chat if it would create naked images of her as a child. The tool allegedly removed clothes from pictures of her taken when she was as young as three years old.
She added: “I thought ‘this is a tool that could be used to exploit children and women’, as it’s clearly doing.”
X has been contacted for comment.