Worrying stuff coming out of Helsinki

There are several organizations whose new publications I never miss because I know they are likely to be important. One of those is Suojellaan Lapsia. That is Finnish for “Protect Children”. And they do. Mightily. In several different ways.

Published with support from End Violence Against Children, “CSAM users in the dark web. Protecting children through prevention” is just such a work. A world first, it presents some very disturbing data.

The data are not disturbing in the sense that they tell us things we didn’t already know, or thought we knew. The power of the report and its unique value come from the methodical way in which the research was carried out, and from its significant sample size.

As the title suggests, and bearing in mind the study aimed at and engaged self-declared CSAM users on the dark web, here are some headlines:

  • 70% of respondents said they first saw CSAM when they were under 18 and almost 40% before they were 13
  • Over half said they first saw CSAM accidentally, meaning they were not actively searching for it
  • 6% said they viewed material of babies and toddlers in the 0-3 age range
  • 24% viewed sadistic or brutal material
  • The gendered nature of the material sought was clear: 45% preferred material showing girls in the 4-13 age range, and 18% boys in the same range
  • The material showing boys was likely to be more egregious
  • 52% felt viewing CSAM might lead them to engage in sexual acts with a child
  • 44% thought viewing CSAM made them think about making contact with a child
  • 37% had in fact sought contact with a child
  • 45% had seen livestreamed child sexual abuse

The dark web is dark in more than one way. And that direct, close link between the availability of CSAM and threats to children is once more underscored. Those debating the CSA Regulation in Brussels please note.

Governments and other bodies that finance the continued functioning of major elements of the dark web in its present form need to read the report from Helsinki and think about how they can make things better, i.e. less dangerous, for children.

Many years ago an eminent therapist who works with child sex offenders told me:

“People who collect or consume child sexual abuse images do so because they have a sexual interest in children.” He added a rider: “But some of them may not have realised it yet.”

We see echoes of this in the Finnish study.

Removing CSAM is an important prevention tactic

CSAM removal is not to be dismissed or diminished because it is “merely” an acknowledgement of a failure to prevent a child being sexually abused in the first place, although it is certainly that and no less important for being so. In addition, CSAM removal is a vital means of helping uphold a depicted victim’s legal right to human dignity and privacy as part of their pathway to recovery. But CSAM removal is also much more.

CSAM removal is linked with gathering intelligence that allows law enforcement to identify perpetrators. The more perpetrators who are apprehended and dealt with, the safer all children are likely to be. That’s a prevention dimension.

As the Finnish study amply demonstrates, the presence of CSAM on the internet helps create and sustain paedophile behaviour. Its removal does the opposite. Prevention again.

And with inexpensive international travel and the internet’s global reach, when we restrict paedophiles’ freedom to act or remove CSAM, we are not only helping safeguard the victims depicted, we are also helping prevent harm to children as yet unharmed in all parts of the world. Prevention once more.

Why am I labouring this point?

If it seems I am labouring a rather obvious point, it is because too often you hear people banging on about the importance of education and empowerment initiatives as the best or only way of preventing children from coming to harm online. Absolutely not.

Here’s where the sceptic in me takes over. Education and empowerment initiatives undertaken by an online business are, or can be made, publicly visible. Highly publicly visible. They should be. Great PR can be generated. A halo effect is created. It’s not so easy to do that with nerdy internal stuff on the network. And typically, for the business, education and empowerment initiatives will be a great deal cheaper and less complicated to implement and manage than intervening in the technical infrastructure.

I completely get that we all have a part to play in helping keep children safe, but there are singular things only a business can do. Nobody else can.

Education and empowerment must go hand-in-hand with effective removal 

Don’t get me wrong, education and empowerment initiatives are, of course, extremely important, and we can probably never get enough of them. If children can be made aware of online hazards and be taught how to avoid or deal with them, so much the better. Likewise, if parents, teachers and others can be brought up to speed and shown how to support their charges and spot signs of distress, then again that has to be all to the good. One wonders whether industry-provided education and awareness initiatives can scale to meet the challenge, but they are nevertheless welcome.

But what education and empowerment initiatives most emphatically are not is an alternative to deploying on the network effective, proactive tools that identify CSAM, ensure its swift removal and prevent it from being re-uploaded.

Education and empowerment and CSAM removal are not in competition with each other. They complement each other. Together they make a strategy. Half a strategy is not acceptable. Not any more.