Irish police investigating 200 reports of Grok child abuse material
PA Media
Received: 13:54:42 on 14th January 2026

There are hundreds of open investigations into content shared on X, a senior Irish police officer has said, amid concerns over potential child sexual abuse material (CSAM) generated on the platform using the artificial intelligence tool Grok.
Irish politicians convened on Wednesday for an Oireachtas Media Committee hearing with Irish police and other experts which largely dealt with growing concerns over the proliferation of CSAM and other AI-generated sexualised material on the social network.
They were told that the Irish police service, An Garda Siochana, is currently investigating 200 reports relating to the platform involving material potentially indicative of CSAM.
Detective Chief Superintendent Barry Walsh said: “We have received reports and referrals of content on that particular platform (X) that is under investigation.
“The investigation process takes some time, the content has to be assessed to make sure it’s criminal, and thereafter the people responsible have to be identified, if that’s possible, and the investigative action stems from there.
“So what follows is the investigative process, and that may result in various different actions, such as execution of warrants, interviewing the people responsible, them being brought before the court, or seeking direction from the Director of Public Prosecutions.”
He added: “As of this morning, there are 200 reports that are being investigated involving content that is child sexual abuse material, or child sexual abuse indicative material.”
The senior officer said these all related to Grok.
Mr Walsh said gardai remain in contact with the Irish media regulator Coimisiun na Mean about AI-generated CSAM.
He said gardai believe existing legislation allows them to carry out investigations into the material, and that they had not yet encountered anything that would prevent such a probe.
Mr Walsh said he wanted to reassure the public that reports are being “treated with utmost seriousness” and thoroughly investigated.
In his written submission, he said: “I would encourage any individual who may be a victim of these crimes to make contact with your local Garda station where you will be provided with access to the wide range of specialist help and support that is available.
“Victims of intimate image abuse also have the option of reporting online via Hotline.ie.”
Mr Walsh said recent commentary had focused “on one AI model in particular” but the reality is that it was “a conceptual possibility” that other AI models could be trained to create such content.
He called for a “robust response” from AI service providers to ensure that their models cannot be manipulated to create content that is “both unlawful and hugely harmful to those individuals who are impacted”.
Mr Walsh said a minimum step is for online service providers to make sure material disseminated on their platforms is appropriate for recipient audiences and has been effectively considered for accuracy, but said it was clear this was not currently the case.
The officer, who is attached to the Garda National Cyber Crime Bureau, said there are ever-increasing levels of CSAM being produced and distributed online.
Mr Walsh said gardai do proactive work to find CSAM online but mainly deal with referrals from the US non-profit organisation the National Center for Missing & Exploited Children.
He said referrals had been increasing year-on-year, with 13,300 in 2024 and roughly 25,000 in 2025.