Algorithms may prevent online sexual abuse

Nov 2, 2020

Text: Anne-Lise Aakervik

Reports indicate that the number of cases involving online sexual abuse of children has increased by almost 50 percent in five years. Researchers at NTNU Gjøvik have developed algorithms that may help reveal planned online sexual abuse by analysing chats.

Photo caption: The app that ninth graders at Kopperud school in Gjøvik use to get to know pupils at other schools. It also gives the researchers relevant information about how well the algorithms work at revealing who is writing.


Every day, millions of children log on to online chat rooms to interact with other children. One of these children may very well be a man, just pretending to be a 12-year-old girl, harbouring far more sinister plans than simply discussing “My Little Pony”. Professor and inventor Patrick Bours at NTNU Gjøvik hopes to thwart exactly these types of plans. He has researched how behavioural biometrics and algorithms may reveal sexual predators in online chats with children.

Cyber grooming is a term for when adults use fake profiles to build relationships with children online. Their goal is often to get the children to switch to a private channel, to get them to send pictures of themselves, both with and without clothes, and perhaps even to organize a meeting.

Patrick Bours is a professor of information security at NTNU Gjøvik, and the idea for the tool that may reveal sexual predators online is his. “I have two young children, and I am concerned about how children are let loose online with the risk of running into these people who are prowling for children to meet up with. We can’t let that happen,” says Bours. The company AiBA will develop and market a tool that can identify online sexual abuse.

Warning lights
“Monitoring these conversations to prevent abuse is an impossible task for moderators to do manually on these chat sites. We need something more automated, something to alert us to ongoing chats,” says Bours.

He has developed a system of several algorithms that chat providers can use to reveal whether a user is a child or an adult. This is where behavioural biometrics come in. An adult man can pretend to be a 14-year-old boy online, but the way he types, quickly or slowly, and the words he chooses to use can reveal, among other things, that this is an adult man around the age of 40.

Using machine learning, the system analyses every chat, using specific criteria to build a risk assessment. The risk level may move up and down as the conversation progresses. When the risk level becomes too high, a red light is triggered. This alerts the moderator, who can enter the chat to assess the situation.
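The risk-assessment flow described here can be sketched in a few lines. Everything below is a hypothetical illustration of the idea, a running score that rises and falls with each message and triggers an alert at a threshold; it is not AiBA's actual model or scoring rules.

```python
# Hypothetical sketch of a per-message risk score with a "red light" threshold.
# The keyword list and score values are illustrative assumptions only.

RISK_THRESHOLD = 0.8  # assumed alert level, not from the source

def message_risk(text: str) -> float:
    """Toy stand-in for the machine-learning model's per-message score."""
    suspicious = ("send a picture", "our secret", "switch to video")
    return 0.4 if any(p in text.lower() for p in suspicious) else -0.1

def monitor_chat(messages):
    """Accumulate risk as the conversation progresses; alert the moderator
    as soon as the running score crosses the threshold."""
    risk = 0.0
    for msg in messages:
        risk = min(1.0, max(0.0, risk + message_risk(msg)))
        if risk >= RISK_THRESHOLD:
            return "ALERT", risk  # moderator can now enter the chat
    return "OK", risk

status, score = monitor_chat([
    "hi! you like ponies too?",
    "you should send a picture",
    "let's keep this our secret",
])
```

The key design point the article describes is that the score is updated live, during the conversation, so a moderator can intervene before any harm occurs rather than reviewing logs afterwards.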

The algorithms find the conversations that need closer monitoring while they’re happening, and not afterwards, when the damage is done and abuse has taken place. The algorithms serve as a kind of warning light.

Cold and cynical
In developing the algorithms, Patrick Bours analysed countless conversations from old logs.

“By analyzing these conversations, we can learn how men ‘groom’ their targets using compliments, gifts and flattery, coaxing them to give more and more. It’s cold, cynical and carefully orchestrated. The risk of sexual abuse is high. This is especially true if the abuser is trying to convince their target to switch platforms, to video chat, for example. In live situations, the algorithm will flag this as a conversation to monitor more closely.”

Real-time analysis
The objective is to identify the abuser as soon as possible. “If we wait until the conversation is over, it may be too late. It’s also possible to let the child in the chat know that the person they’re talking to is an adult, not a child.” Bours is working with gaming companies to install the algorithm, and a solution may be in place by the end of the year. 


Behavioural biometrics is about how we do things, e.g. how a person types on a mobile phone or computer. If researchers have access to how you use the keyboard, your typing rhythm and keystroke power, they can predict with 80 percent accuracy whether you are a man or a woman, or whether you are old or young. And if the words used in the chat itself are added to that, the algorithm can predict gender and age with an accuracy of more than 90 percent.
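The "typing rhythm" signal mentioned above can be made concrete with a small sketch: the time gaps between successive key presses form a profile that a classifier could use as input features. The feature set and timing values below are illustrative assumptions, not the features the NTNU researchers actually use.

```python
# Hedged sketch of keystroke-dynamics feature extraction.
# key_times: timestamps (in seconds) of successive key presses.
from statistics import mean, stdev

def typing_features(key_times):
    """Summarise a typing-rhythm profile from keystroke timestamps."""
    gaps = [b - a for a, b in zip(key_times, key_times[1:])]
    return {
        "mean_gap": mean(gaps),    # overall typing speed
        "gap_stdev": stdev(gaps),  # regularity of the rhythm
        "max_pause": max(gaps),    # longest hesitation between keys
    }

# Illustrative profiles: a fast, regular typist vs. slow hunt-and-peck typing.
fast = typing_features([0.0, 0.12, 0.25, 0.36, 0.50])
slow = typing_features([0.0, 0.60, 1.45, 2.10, 3.05])
```

In a real system, features like these would be fed to a trained classifier alongside the chat text itself, which is where the reported jump from roughly 80 to over 90 percent accuracy comes from.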


