More than 100 child sexual abuse image crimes being recorded by police every day

We’re urging the government to ensure children are better protected in private messaging environments.


  • Over 38,000[1] child sexual abuse image crimes were logged by police forces in England and Wales in the past year, according to Home Office data.
  • Data shows that private messaging platforms are being exploited by criminals to share child sexual abuse material.
  • Together with Barnardo’s, the Marie Collins Foundation and other child protection charities, we sent a letter to the Home Secretary and DSIT Secretary of State urging action to strengthen Ofcom’s current approach to private messaging.

The data in detail

Recently published Home Office data revealed police forces across England and Wales recorded 38,685 child sexual abuse image offences last year (2023/24), an average of more than 100 every day.

A separate Freedom of Information request[2] we submitted showed that, of the 7,338 offences last year where law enforcement recorded the platform used by perpetrators[3]:

  • Half (50%) took place on Snapchat.
  • A quarter took place on Meta products (11% on Instagram, 7% on Facebook and 6% on WhatsApp).

In addition, the Freedom of Information data revealed that child sexual abuse image offences have reached record levels in Northern Ireland (859 crimes) and remain consistently high in Scotland (748 crimes).

Worried about a child?

You can contact the NSPCC Helpline by calling 0808 800 5000, emailing help@NSPCC.org.uk or completing our report abuse online form.


Letter to Home Secretary and DSIT Secretary of State

Along with the Marie Collins Foundation, the Lucy Faithfull Foundation, the Centre of expertise on child sexual abuse and Barnardo’s, we sent a joint letter to Home Secretary Yvette Cooper and the Secretary of State for Science, Innovation and Technology, Peter Kyle.

The letter expresses collective concern regarding Ofcom's final Illegal Harms Code of Practice published in December 2024. The charities argue that, as it stands, children will not be protected from the worst forms of abuse on private messaging services under Ofcom’s plans, despite this being a core aim of the Online Safety Act.

Ofcom has stated that user-to-user services are only required to remove illegal content where it is ‘technically feasible’. This exception creates an unacceptable loophole, allowing some services to avoid delivering the most basic protections for children.

Data from police forces on the number of recorded offences where the platform was known indicates private messaging sites are involved in more crimes than any other type of platform. Perpetrators exploit the secrecy offered by these spaces to harm children and go undetected.

We want the UK Government to push Ofcom to review and strengthen its most recent codes of practice to tackle this threat to children’s safety online.

We're also calling for private messaging services, including those using end-to-end encryption, to put robust safeguards in place so that their platforms do not act as a ‘safe haven’ for perpetrators of child sexual abuse. End-to-end encryption is a secure form of communication in which only the people communicating can read the messages. This means service providers can be blind to child sexual abuse material being shared through their platforms.


Chris Sherwood, NSPCC Chief Executive, said:

"It is deeply alarming to see tens of thousands of child abuse image crimes continue to be recorded by police. These offences cause tremendous harm and distress to children, with much of this illegal material being repeatedly shared and viewed online. It is an outrage that in 2025 we are still seeing a blatant disregard from tech companies to prevent this illegal content from proliferating on their sites.

"Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place. This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act.”

"The government must set out how they will take a bold stand against abuse on private messaging services and hold tech companies accountable for keeping children safe, even if it requires changes to the platform’s design – there can be no excuse for inaction or delay."

Messaging apps used in abuse crimes

Insight from Childline provides further evidence of how young people are being targeted and blackmailed into sharing child abuse images through the calculated use of private messaging apps.

Last year, Childline delivered 903 counselling sessions to children and young people relating to blackmail or threats to expose or share sexual images online. This was a 7% increase compared to 2022/23.


One 13-year-old girl[4] said:

“I sent nude pics and videos to a stranger I met on Snapchat. I think he’s in his thirties. I don’t know what to do next. I told him I didn’t want to send him any more pictures and he started threatening me, telling me that he’ll post the pictures online. I’m feeling really angry with myself and lonely. I would like support from my friends, but I don’t want to talk to them about it as I’m worried about being judged.”

Support from Childline

Childline and the IWF’s Report Remove tool helps young people under 18 report sexual images or videos that have been shared online, to see if they can be taken down.

Childline is available for young people by phone on 0800 1111 and online through the 1-2-1 chat on the Childline website.


References

  1. The figure for the number of offences where the platform was recorded does not account for Facebook Messenger and may include some offences that took place on the publicly accessible parts of the platforms.

  2. The NSPCC sent Freedom of Information requests to all 43 police forces in England and Wales, and received data from the PSNI and Police Scotland relating to recorded offences of indecent images of children.

     

    Many police forces have previously refused to share information about the social media sites used in child abuse image offences on time and cost grounds. The data this year is therefore based on a keyword search provided by the NSPCC and does not capture every method used in child abuse image offences. We asked police forces to search for instances where the following were used: Snapchat, Facebook/Messenger, Instagram, WhatsApp, X/Twitter, Kik, TikTok, YouTube, Discord, Skype, FaceTime, Roblox, Oculus, VR, Metaverse, OnlyFans, Signal, iCloud, iMessage, Dropbox, Mega, Patreon.

  3. 20 forces provided usable data for 2023/24. Police disclosed the platform involved in 7,338 instances.

     

    Of these, Snapchat was flagged 3,648 times (50%), Instagram 840 (11%), Facebook 537 (7%) and WhatsApp 457 (6%). Other notable platforms included YouTube 330 (4%), Twitter/X 250 (3%), TikTok 245 (3%) and Kik 124 (2%).

  4. Snapshots are based on real Childline service users but are not necessarily direct quotes.