NSPCC Calls for Real-Time Tech to Block Child Sexual Abuse Images
- More Radio Writer

The NSPCC is urging tech companies to embed technology on children’s devices that blocks nude images in real time, after new data shows a rise in child sexual abuse image crimes.
Freedom of Information data from 42 UK police forces reveals nearly 37,000 offences recorded between April 2024 and March 2025, a 9% increase on the previous year.
In London and the South East, 6,597 crimes were logged.
Of 10,811 offences where the platform was identified, 43% occurred on Snapchat, with Meta platforms accounting for almost a quarter.
The charity warns that without stronger safeguards, children remain at risk of grooming, extortion, and abuse, often with long-term impacts.
One teenager told Childline that leaked images had forced him to change schools and left him fearful for his future.
Chris Sherwood, NSPCC CEO, said:
“Children across the UK are being failed by tech companies.
“Technology exists to prevent these crimes today — if companies don’t act, the Government must step in to make it mandatory.”
The charity wants the Government to enforce safeguards that stop illegal images being created, shared, or viewed, arguing this goes further than social media bans alone in keeping children safe online.