To address the growing threat of inappropriate relationships between adults and children on digital platforms, Thorn and the Technology Coalition, an alliance of global tech companies working together to combat online child sexual exploitation and abuse, are pleased to announce a new initiative that empowers the tech industry to respond to the issue.
In partnership with several Tech Coalition members engaged in this effort, Thorn works closely with trust and safety and engineering teams to adapt, train, and evaluate technical solutions that identify and address attempts to groom and exploit young people.
Companies often develop new technology to enforce their own platform’s child safety policies and terms of service, reflecting how harm manifests specifically on their services. By developing useful and usable technical solutions for the range of platforms that offer text-based communications subject to enforcement, Thorn and the Tech Coalition are helping more companies keep young people safe.
Online grooming is extremely common. Children are increasingly vulnerable as they regularly communicate with people they know only online and do not consider strangers, even when that online friend is an adult.
In a recent report on grooming, Thorn found that:
- 1 in 3 young people (32%) say that the friends they make online are among their closest friends. Only 14% of minors classified their online-only contacts as “strangers”.
- 2 in 5 children online (40%) have been approached by someone who they believe was trying to “befriend and manipulate” them.
- 1 in 4 minors stayed in touch with online-only connections despite feeling uncomfortable.
- 40% of minors have been approached online by someone they had never interacted with before who immediately asked for nude images, including 29% of 9-12-year-olds.
Similarly, the National Center for Missing and Exploited Children (NCMEC) found an 82% increase in reports of online enticement from 2021 to 2022, a category that includes grooming scenarios and related harms such as financial sextortion.
The concept of grooming is not new, but technology and the internet have changed the way it manifests in everyday life. With online games, live streaming, metaverse platforms, social and instant messaging platforms, and more traditional photo and video sharing platforms, it has never been easier for adults seeking to abuse children to access children globally, build trust with them, and manipulate them into dangerous situations. These platforms create a complex ecosystem where harm can start on one platform and then move to the next, and where the communities that minors form among themselves can be targeted for exploitation and abuse.
Rapid advances in perpetrator tactics are forcing tech companies to innovate even faster to meet the threat. Companies face distinct challenges in identifying when this harm occurs in text exchanges. The sheer volume of text on a chat platform makes it difficult to sort through and find instances where content violates a platform’s child safety policies. Likewise, the volume of user reports can make triaging and weeding out false positives overwhelming for trust and safety teams. That’s why, over the past few years, Thorn’s team has been working on an NLP (natural language processing) classifier, a machine learning model that detects and categorizes when online content or behavior falls into defined grooming-related “classes” (such as exposure to sexual material or seeking an in-person encounter with a minor) and produces an overall score of how closely a conversation relates to grooming.
Here’s how it works:
Grooming risk: 90%
- User 2: Do you have other girls my age to chat with? (Age: 19.3%)
- User 1: A
- User 2: Yeah
- User 2: where does she live?
- User 1: she is in New York (PII: 98.9%)
- User 2: how old is she? (Age: 98.5%)
- User 1: she is 11 years old but she looks mature in her profile picture (Age: 87.6%, PII: 39%)
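To make the structure of that output concrete, here is a minimal Python sketch of how per-message class probabilities and a conversation-level risk score might be represented and rendered like the transcript above. The field names, class labels, and display threshold are illustrative assumptions, not Thorn’s actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class MessageScores:
    sender: str
    text: str
    # Per-class probabilities, e.g. {"Age": 0.985, "PII": 0.989};
    # the class names here are assumptions for illustration.
    class_probs: dict[str, float] = field(default_factory=dict)

@dataclass
class ConversationScores:
    messages: list[MessageScores]
    grooming_risk: float  # conversation-level score in [0, 1]

def render(conv: ConversationScores, display_threshold: float = 0.1) -> str:
    """Format a scored conversation like the transcript above,
    showing only class scores above a (hypothetical) threshold."""
    lines = [f"Grooming risk: {conv.grooming_risk:.0%}"]
    for m in conv.messages:
        flags = ", ".join(
            f"{cls}: {p:.1%}"
            for cls, p in m.class_probs.items()
            if p >= display_threshold
        )
        lines.append(f"- {m.sender}: {m.text}" + (f" ({flags})" if flags else ""))
    return "\n".join(lines)
```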
Thorn has taken advantage of recent advances in artificial intelligence to develop comprehensive grooming detection solutions. Rather than waiting for users to report harm, Thorn’s set of classifiers enables real-time detection and prioritization of conversations where a child may be in immediate danger, including grooming situations, pressure on children to send illicit images, and other forms of inappropriate contact between adults and minors. A company can use these classifiers in unencrypted spaces where users should expect their communications, even one-on-one with another user, to be subject to company policies and terms of service.
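As a rough illustration of that prioritization step, the sketch below ranks flagged conversations so the highest-risk cases surface first. The score_conversation callable and the review threshold are hypothetical stand-ins, not Thorn’s actual API.

```python
import heapq

REVIEW_THRESHOLD = 0.8  # illustrative cut-off for human review

def prioritize(conversations, score_conversation):
    """Rank conversations above the review threshold, highest
    grooming risk first, for trust and safety triage.

    conversations: mapping of conversation id -> list of messages
    score_conversation: callable returning a risk score in [0, 1]
    """
    heap = []
    for conv_id, messages in conversations.items():
        risk = score_conversation(messages)
        if risk >= REVIEW_THRESHOLD:
            # heapq is a min-heap, so store the negated risk to pop
            # the highest-risk conversation first.
            heapq.heappush(heap, (-risk, conv_id))
    ranked = []
    while heap:
        neg_risk, conv_id = heapq.heappop(heap)
        ranked.append((conv_id, -neg_risk))
    return ranked
```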
At the base of the set is a language model that learns grooming-specific language patterns. Additional classifiers are overlaid on the language model to predict the different grooming-related categories for each message in a conversation, as well as to produce a risk score for the conversation as a whole. Trust and safety teams can then use this information to quickly identify cases and prioritize them for further investigation by a trained team member. Problematic conversations can be surfaced to these teams quickly, with the parts of the conversation that violate platform policies highlighted for review.
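One common way to realize this kind of layered design is a shared encoder with lightweight classification heads, sketched below in PyTorch. The category names, pooling choice, and dimensions are assumptions for illustration; the source does not specify Thorn’s actual architecture.

```python
import torch
import torch.nn as nn

# Hypothetical grooming-related categories; the actual class set is
# defined by the classifier's training data and platform policies.
CATEGORIES = ["age", "pii", "sexual_content", "meetup_request"]

class GroomingHeads(nn.Module):
    """Classifier heads overlaid on a base language model.

    Expects one embedding per message, e.g. from a fine-tuned
    transformer encoder that has learned grooming-specific patterns.
    """

    def __init__(self, hidden_dim: int = 768):
        super().__init__()
        # One binary head per message-level category.
        self.heads = nn.ModuleDict(
            {c: nn.Linear(hidden_dim, 1) for c in CATEGORIES}
        )
        # Conversation-level grooming-risk head over pooled messages.
        self.risk_head = nn.Linear(hidden_dim, 1)

    def forward(self, message_embeddings: torch.Tensor):
        # message_embeddings: (num_messages, hidden_dim)
        per_message = {
            c: torch.sigmoid(head(message_embeddings)).squeeze(-1)
            for c, head in self.heads.items()
        }
        # Mean-pool message embeddings to score the whole conversation.
        risk = torch.sigmoid(self.risk_head(message_embeddings.mean(dim=0)))
        return per_message, risk
```

A review tool could then highlight any message whose per-category score crosses a threshold, as in the transcript above.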
The goal of the joint initiative between Thorn and the Tech Coalition is to create tools that enable those responsible for enforcing a platform’s child safety policies and terms of service to better identify, prevent and mitigate harm to children. With shared goals of developing cutting-edge technology, lowering barriers to adoption, and facilitating collaboration through partnerships, Thorn and the Tech Coalition are proud to use this technical solution to disrupt online grooming and prevent harm to children in digital spaces.
What you can do:
- Learn more about the Technology Coalition
- Explore membership with the Tech Coalition
- Give to help Thorn develop technology to defend children against sexual abuse
To learn more about the state of online grooming and considerations for detection, response and prevention, check out the latest research from the Tech Coalition here.
————
Originally published: June 20, 2023 on technologycoalition.org
Rob Wang, Data Scientist, Thorn
Dr. Rebecca Portnoff, Director of Data Science, Thorn
Lauren Tharp, Head of Technology Innovation, Tech Coalition