Ofcom has opened an investigation into Elon Musk’s social media platform, X, over allegations that its AI tool, Grok, is being misused to generate sexualized images, including non-consensual intimate images of real people and sexualized depictions of children. The UK communications regulator cited numerous reports of the chatbot being used to create and disseminate such content.
If X is found to have broken the law, Ofcom has the authority to impose fines of up to 10% of the company’s global revenue or £18 million, whichever is greater. In previous statements, X has said that users who prompt Grok to produce illegal content will face the same repercussions as those who upload such content directly.
Elon Musk responded to the scrutiny by suggesting that the UK government is seeking “any excuse for censorship,” questioning why other AI platforms have not faced the same level of investigation. The BBC has uncovered instances of digitally altered images posted on X, showcasing various women in sexualized scenarios without their consent. One individual noted that over 100 sexualized images had been generated featuring her likeness.
If X fails to comply with the investigation’s findings, Ofcom could take further measures, including seeking a court order to block access to the platform from within the UK entirely. The issue holds significant importance for many, as voiced by Technology Secretary Liz Kendall, who has urged Ofcom to expedite its investigation, emphasizing the necessity for swift action for the sake of the victims involved.
Concerns extend beyond regulatory compliance: former Technology Secretary Peter Kyle expressed outrage over Grok’s apparent lack of rigorous testing. He recounted a troubling incident involving an image depicting a Jewish woman in a bikini against the backdrop of Auschwitz, highlighting the distress such content can cause.
Condemnation of Grok’s implications has come from various politicians, including Cara Hunter from Northern Ireland, who announced her departure from the platform in protest. In response, Downing Street assured that the government remains committed to safeguarding children’s welfare, while keeping its presence on X “under review” and indicating that “all options are on the table.”
Dr. Daisy Dixon, who has personally dealt with distressing experiences involving Grok, welcomed the regulatory examination but critiqued Musk’s portrayal of the situation as censorship. She stressed that calls for immediate compliance with regulations should be prioritized over deflections from the core issues of misogyny and exploitation.
Ofcom’s investigation will assess whether X responded adequately to reports of illegal content and took appropriate measures to shield UK users from harmful material, including non-consensual intimate images and child sexual imagery. The regulator will also determine whether X has implemented effective age verification measures to prevent minors from accessing explicit content.
This scrutiny comes in light of growing global discontent over Grok’s capabilities. Authorities in Malaysia and Indonesia have temporarily blocked access to the feature, showcasing the widespread concern regarding AI-generated content.
While an official timeframe for the investigation remains unspecified, Ofcom labeled it a “matter of the highest priority.” The agency reiterated its commitment to ensuring that platforms actively protect UK users from illegal content, particularly with regard to potential harm to children.
Experts in internet law noted the difficulty of predicting the investigation’s pace. Ofcom has discretion over how quickly the investigation proceeds, and could also seek a disruption order to restrict X’s operations in the UK more rapidly if deemed necessary. Legal scholars underscored the need for tangible action and reform to prevent Grok from generating illicit content, stressing that the focus should remain on implementing effective changes to protect the people affected by such imagery.