Canada puts another nail in the coffin of self-regulation

The Canadian Centre for Child Protection (C3P) is justly famous for many things. One of them is the quality of their research. It is based on two vital, interdependent pillars.

First is a deep understanding of the position and needs of survivors of child sex abuse, particularly those who have had the additional misfortune of appearing in a picture or video depicting their abuse which has then been circulated online.

Second is C3P’s exceptionally strong grounding in the technologies used to make, publish and distribute child sex abuse material (csam) over all parts of the internet.

More evidence of C3P’s top-class work became available yesterday when they published their long-awaited report: “Project Arachnid: Online availability of child sex abuse material”. Its nearly 60 pages do not make easy reading (there is an Executive Summary as well) but it is essential reading for anyone engaged in trying to rid the world of the scourge of csam.

The period covered is 2018-2020. Not a vast span, but that only serves to underline the scale of what we are facing. And when you look at the report, see what C3P says about its backlog. Scary stuff.

Enormous numbers

In the two years under review C3P examined and verified 5.4 million images and issued take down notices to over 760 electronic service providers (ESPs) in all parts of the world.

What are you going to do, Mr President?

Astonishingly, C3P found that fully 48% of all the material identified was linked to a single French telecommunications company. The G7 is starting in the UK today. President Macron will be there. I wonder if any journalists will tackle him on this and, if so, what he will say. I expect he will be absolutely horrified, because there is no doubt his Administration has been making several moves in the right direction and we expect to see even more.

A dark web problem? Emphatically not

You see all kinds of people rolling their eyes and talking about the dark web, encryption and a variety of subterranean exotica as if they were already resigned to being powerless to do anything about csam. But the unvarnished truth is that 97% of the csam detected by C3P was on the clear web. So far from being intractable, online csam is highly tractable. What has been missing is the will on the part of far too many ESPs.

And a massive number of repeats

Perhaps even more shockingly, 48% of all the take down notices issued by C3P related to images which they had already flagged as illegal to the same provider. That is truly shameful, because the technology exists, and is widely available, that would allow any ESP to detect already known images and prevent them ever seeing the light of day again, at least on a publicly accessible web page. Table 7 of the report (p 38) shows “recidivism” rates going up, not down. And clock Table 7.3 for the names of the companies involved.

Why don’t more companies use the technology that would allow them to detect csam in milliseconds? Because they don’t have to. No law requires it and this, maybe more than anything else, reminds us why self-regulation – voluntarism – has had its day.
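The core of that detection technology is simple hash matching: compare every uploaded file against a list of hashes of already-identified images, supplied by organisations like C3P. Here is a minimal sketch of the idea in Python. It uses a cryptographic SHA-256 hash and a made-up hash list for illustration; real deployments use perceptual hashes (such as PhotoDNA) that also catch resized or re-encoded copies.

```python
import hashlib

# Hypothetical hash list for illustration only. In practice an ESP
# would load perceptual hashes supplied by a hotline such as C3P.
KNOWN_FLAGGED_HASHES = {
    # SHA-256 of the placeholder bytes b"hello"
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}

def is_known_image(data: bytes) -> bool:
    """Return True if the uploaded bytes match a previously flagged hash."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_FLAGGED_HASHES

# An ESP would run this check at upload time and block any match
# before the file is ever published, in milliseconds.
```

The point is not the sophistication of the code, which is trivial, but that a lookup like this takes milliseconds per file, which is why there is no technical excuse for the same image reappearing on the same service.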

Too slow, too slow

C3P says that, following the issue of a take down notice, the median removal time for the item concerned is less than 24 hours. Bravo! But in 10% of cases it took more than seven weeks for the illegal material to go.

That is utterly unacceptable and is again a product of voluntarism. And by the way, it seems the delays are longest where the images concerned involve older adolescents. This conjures up several unpleasant thoughts about non-expert sysadmins second-guessing child protection experts while leaving a child’s sexually abusive image on view until they conclude their own internal interrogation. Not on.

Change is gonna come

From page 48 onwards, C3P’s recommendations will be instantly recognisable to children’s advocates in the UK, the EU and many other parts of the world.

Big Tech obstructionism created space for too many bad guys to flourish

Any and every major thinker in the internet space has known this moment would arrive – the end of voluntarism – but too many industry leaders were determined to drag things out as long as possible to keep the mountains of cash rolling in. It was this obstructionism by Big Tech, and their deliberate delaying tactics, which created the space for myriad unsavoury characters to hide in the shadows. Until now. Thank you, C3P. Keep it up.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com