Thorn is committed to driving innovation to defend children against sexual abuse. We rely on data to guide our work, and the National Center for Missing & Exploited Children’s (NCMEC) CyberTipline Annual Report provides crucial information.
In 2022, NCMEC’s CyberTipline received over 32 million reports of suspected child sexual exploitation. This staggering number is more than just a statistic – it represents millions of child abuse victims and highlights the growing need for comprehensive measures to respond to and counter child sexual abuse.
At Thorn, we know that the true prevalence of child sexual abuse material (CSAM) and online grooming far exceeds reported cases. Before we can eradicate child sexual abuse, platforms need to step up their detection efforts, which will in turn increase the number of reports.
We have long maintained that an increase in reports of CSAM is a good thing. When platforms don’t report CSAM, it’s not because it isn’t there; it simply isn’t being detected. Likewise, a low number of reports relative to a platform’s user base may indicate that CSAM is only being discovered inadvertently and reported reactively.
However, there are nuances to the findings of the most recent CyberTipline report, and an increase in numbers is not a good thing in every case.
The numbers are increasing. Sextortion and grooming may be partly responsible.
One of the striking revelations of this year’s report was the growth in the CyberTipline’s online enticement of children for sex acts category: reports rose from 37,872 in 2020 to 44,155 in 2021, then jumped to 80,524 in 2022.
Several factors could explain this growth, but one probable driver is the rise in grooming and online sextortion occurring across all types of online platforms.
Recent research by Thorn shows that 2 in 5 children have been approached online by someone they believed was trying to “befriend and manipulate” them. And between 2019 and 2021, the number of reports to NCMEC involving the sextortion of children or adolescents more than doubled. As the prevalence of this type of online abuse grows, so does the number of people and platforms reporting it.
However, it is important to reiterate that an increase in these reports can still be a sign of progress: it likely means more platforms are becoming aware of the problem, actively engaging, and reporting suspected abuse on their sites.
Additionally, we know that children themselves are increasingly proactive when it comes to taking charge of their online safety, understanding the dangers of the digital landscape, and flagging/reporting harmful conversations on platforms. Our recent research shows that more than half of kids think online grooming is a common experience for kids their age.
As those with a stake in protecting children, alongside platforms and children themselves, continue to become more vigilant and empowered, we will begin to see positive changes.
The tech industry has the potential to do more when it comes to reporting.
Finally, it’s worth addressing a somewhat concerning figure: 4% of CyberTipline reports submitted by the tech industry in 2022 contained information so limited that NCMEC could not identify where the offense occurred or which law enforcement agency should receive the report.
That percentage may sound low, but against more than 32 million total reports it represents a very large number of tips in which CSAM was detected yet the critical user information needed to act on it was missing.
The ecosystem must work together.
As is the case each year when the CyberTipline report is released, these results are not just numbers; they are also a call to action. We must continue to evolve our strategies, advance our technology and expand our partnerships to ensure the digital world is a safe space for the children who now spend so much time in it – as well as to stop re-victimization through the viral spread of CSAM.
As Thorn strives to equip those on the front lines to protect children, we know that no single entity can change the trajectory of the child sexual abuse problem, and that the number and frequency of reports filed depend on all of us doing our unique part to advocate for children.
For Thorn, that means working to bring more tech companies to the table and equipping them with the tools to effectively detect, investigate, and report CSAM, and working directly with law enforcement to provide the tools and resources they need to identify victims as effectively as possible. It also means bringing together key partners, those with the interest and capacity to protect children, to develop the best solutions in pursuit of our goal of defending children from sexual abuse.
It’s an ongoing process – but we know that together we can make substantial progress towards a world where every child is free to just be a child.