IWF Annual Report
Last week the IWF released their Annual Data and Insights report on CSA/CSE/CSAM. As awful as the subject matter is, this is a really important report which not only shows the scale of abuse but also highlights emerging areas, which are important for safeguarding and for education, curriculum and staff training purposes.
As always it’s a big report, but you can glean much of the information from the executive summary. I have put links to the full report and the executive summary below. Here are some highlights:
- AI CSAM continues to grow significantly – 491 reports led to the identification of 8,029 illegal images/videos. AI-generated CSAM videos increased from 13 in 2024 to 3,443 in 2025.
- Sextortion continues to be a major risk, particularly for 14-17-year-old boys (98% of all sextortion reports).
- Girls remain at the highest risk, particularly those aged 11-13. 77% of all victims identified by the IWF were girls; this rises to 97% in AI-generated imagery.
Link: IWF Annual Data Insights Report 2025
Link: IWF Annual Data Insights Report 2025 – Executive Summary
Which Platforms Report CSA the Most?
As I mentioned above, the IWF report is a really important annual insight, not only for us in education but also for law enforcement purposes: to bring perpetrators to account and safeguard victims. But that’s just one piece of the overall jigsaw puzzle.
The IWF receive reports from some companies and members of the public, but the majority come through their own proactive searching. NCMEC (National Center for Missing and Exploited Children) in the US receives automated reports from just under 2,000 tech companies, predominantly in the US (think Google, TikTok, Snapchat etc.). Essentially, NCMEC acts as a clearing house, receiving reports and passing them on to the correct country for investigation purposes.
The annual NCMEC report should be out soon, but there’s been a recent development. The US Senate Judiciary Committee has opened an inquiry into 8 tech companies for “failing to sufficiently report online CSE, frustrating law enforcement investigations into online CSA”. In 2025 these 8 companies submitted over 17 million reports to NCMEC, collectively accounting for 81% of all reports. The concern relates to the quality of the reports and data missing for investigation purposes. So the platforms reporting the most CSA are:
- Meta (Facebook, Instagram, Facebook Messenger).
- Amazon AI services.
- TikTok.
- Snapchat.
- Discord.
- X (Twitter) AI.
- Grindr.
- Roblox.
Video Game – Five Nights at Epstein’s
I haven’t heard about this game from any students personally, I have only seen a couple of articles, so I’m being careful not to make assumptions that this is widespread because I don’t think it is.
The game is a parody of Five Nights at Freddy’s. We all know who Epstein is, and it now appears there is a game where players monitor surveillance cameras in rooms modelled on his private island, with the tagline “Can you survive the night?” When Epstein approaches, players attempt to lure him away by playing sounds of a child’s voice. It’s a pretty vile game theme. I had a short play and, as a gamer myself, the actual gameplay is truly terrible; I really can’t see anyone lasting more than a couple of minutes playing it.
I’m mentioning this just in case you hear anything; if so, it may be worth a chat with parents.
Article/Resource – The Manosphere
South West Grid for Learning have published a new article/resource addressing the growing focus on, and concerns about, the manosphere. It’s a quick but very useful read for those who don’t know too much about it, including harmful behaviours to be aware of, why the content spreads so quickly, and how to teach young people to respond.
Link: The Manosphere – Understanding and Challenging Harmful Online Behaviours.
For Parents – Your Child’s Digital Footprint
What is a digital footprint? Basically it’s the trail we all leave online whenever we visit a web page, buy something, scroll social media, or do anything else on the internet. It’s a really important aspect to understand for lots of reasons, such as privacy, reputation, advertising, risks of harm and much more.
Internet Matters have a really good explainer article covering what a digital footprint is, what we mean by an active footprint, why it matters and much more.
Link: Internet Matters – What you need to know about your child’s digital footprint.
Straight to your Inbox
Get the most important updates delivered to your inbox every Wednesday morning including:
- Emerging risks & issues.
- New free curriculum resources.
- New research and insights.
- Links to information for your parents.
- No spam, your information is not shared with anyone else.