Non-consensual intimate images – amendment to the Crime and Policing Bill
In February 2026, the Government announced an amendment to the Crime and Policing Bill requiring tech companies to detect and remove intimate images shared without a victim’s consent within 48 hours of an image being flagged. A platform that fails to take prompt action could face a fine of up to 10% of its qualifying worldwide revenue or have its services blocked in the UK.
According to the Government’s press release, the amendment seeks to recognise the devastation these images can cause to victims, and the Government has committed to victims needing to report an image only once. Once reported, an image should be removed across multiple platforms in one go and, from then on, deleted automatically at every new upload.
The amendment reflects the Government’s commitment to protecting women and girls as part of its wider pledge to treat violence against women and girls as a national emergency. Placing greater obligations on platforms to remove content within 48 hours, rather than requiring victims to chase platforms repeatedly, forms part of the Government’s pledge to keep women and girls safe online.
In January 2026, s.138 of the Data (Use and Access) Act 2025 introduced a new offence of creating, or requesting the creation of, a purported intimate image of another person without their consent. It is therefore now a criminal offence to create or request the creation of a non-consensual intimate image, as well as to share or threaten to share such an image (s.188 of the Online Safety Act 2023, amending ss.66B and 66C of the Sexual Offences Act 2003). The amendment also extends to AI-generated or AI-altered intimate images.
It was further announced that Ofcom is considering treating non-consensual intimate images with the same severity as child sexual abuse and terrorism content under the Online Safety Act, so that such images may be digitally marked to prevent further uploads. The offence is also to be made a priority offence under the Online Safety Act.
These recent amendments reflect a Government responsive to changes in technology, and follow the public outcry and concern around the use of X’s ‘Grok’ AI tool to generate edited sexualised images. The Government has announced that it will introduce legislation to ban “nudification” apps and the use of AI to create such images. Both Ofcom and the ICO have announced investigations into X. Ofcom is, however, unable to investigate xAI, the provider of the standalone Grok service, as certain chatbots do not currently fall within its remit. The Government has announced a plan to review whether AI chatbots should fall within the scope of the OSA.
