A Co Armagh grammar school is among the latest to be impacted by the spread of AI-generated deepfake images.
A police investigation is now underway after AI-generated sexual images were shared among pupils of The Royal School Armagh, on College Hill.
The principal told The Times that as soon as the school became aware of the issue it contacted the authorities, and that it would take all appropriate action as advised.
This comes just two weeks after a Portadown-based GAA club – Tír-na-nOg – warned parents of the dangers of falsified explicit images, after a young member of the community was “blackmailed” for money to avoid the “very realistic” images being circulated online.
The appalling incident in Portadown caught the attention of Newry and Armagh MLA Justin McNulty, who subsequently contacted the Minister for Communities, Gordon Lyons, requesting that he detail any actions his Department would take to implement safeguards to protect young people in sport from being targeted by AI “deepfakes”.
Mr Lyons explained that while the safety of young people and children within our communities is a “shared responsibility by everyone in society”, addressing “emerging risks” such as AI-generated deepfakes is a “wide-ranging challenge” that will require input and action across multiple areas of government and public services.
Sport NI, which holds responsibility for safeguarding within sport in Northern Ireland, currently has a contract with the NSPCC, said the Minister.
Through the contract, Sport NI ensures that legislation and best practice with respect to safeguarding are followed and that the sector is appropriately supported.
As part of this contract, the Minister explained that Sport NI will “discuss this matter with NSPCC”.
On the morning of January 15, Elon Musk’s platform X – which has been at the centre of growing controversy over the misuse of its AI model Grok – said it would no longer allow users to alter photos of real people to make them sexually explicit or suggestive in countries where such actions are illegal.
It has also limited access to the feature to paying members only.
In Northern Ireland, SDLP MLA for East Londonderry Cara Hunter has been largely leading the charge against the spread of explicit deepfake images, campaigning for tougher legal action against the creation and sharing of deepfakes ever since she herself fell victim to illicit image generation.
Following her lobbying, a public consultation on the matter concluded in October 2025, and its findings could now feed into the development of future legislation.
On January 13, Ms Hunter also spoke in favour of Northern Ireland following suit with the UK Government after its announcement of new laws to crack down on the creation of explicit images using AI.
Technology Secretary Liz Kendall has said that creating, or attempting to create, a non-consensual intimate image will be an offence from this week.
Apps allowing users to create these images will also be criminalised under the Crime and Policing Bill in the UK.