Commas, common sense and csam

Many people in Europe and elsewhere simply cannot understand the whole gun thing in the USA. However, it is a perfect example of how words from another era (1791), in this case compounded by the poor use of commas, can cause untold damage and misery if future generations regard themselves as bound by those words irrespective of how much else has changed in between. Common sense alone should point us in a different direction.

A question about the GDPR

The GDPR is often held out as the best available model for modern privacy. It is the law in the EU and the UK and has been copied elsewhere.

There is a perfectly good argument to say the GDPR presents no insurmountable difficulties in terms of allowing a wide range of actions to be taken to identify, delete and report child sexual abuse material (csam), identify likely csam and identify grooming behaviour which threatens to put children at risk of being sexually abused.

But to the extent there is any doubt or uncertainty about that proposition, perhaps because the GDPR has been caught out by the pace of technological change, the legislation currently being considered within the EU and the UK provides a perfect opportunity to resolve matters.

Here are the numbers

The initial, draft version of the GDPR was published in 2012, reflecting conversations and consultations which had taken place over an extended period before that. The relevant parts of the text did not materially change before it was finally adopted in 2016 or became operative in 2018.

In 2012 the NCMEC CyberTipline received 415,650 csam reports. In 2013 it was 505,280. In 2014 it was over 1.1 million (numbers sent to me by NCMEC because I couldn’t find the relevant annual reports online). A steep upward trend was already apparent.

In 2016, the year the GDPR was adopted, NCMEC received 8.2 million reports, and we were further advised this represented half of all the reports NCMEC had received since it was established in 1998. Yet more evidence of an accelerating trend. We didn’t know whether there was more csam out there or whether companies were just getting better at detecting it but, since detecting it was a big part of what they were trying to do, that hardly mattered.

In 2018, the year the GDPR became operative, we were at 18.4 million reports, and in 2021 it was 29.3 million (an increase of 35% on the previous year). Remember each report can reference more than one item. For example, in 2021 the 29.3 million reports received contained 85 million separate items of csam.

When the data for 2022 become available, covering the period in which we came out of pandemic-related lockdowns, they are expected to show yet more growth.

An insight from Canada

In 2017, between the GDPR being adopted and it becoming operative, those clever people at the Canadian Centre for Child Protection provided us with a world-first insight into the scale of the problem of csam. In six weeks, using a new automated tool, they examined 230 million web pages and found 5.1 million carried csam. These pages held 40,000 unique child sexual abuse images. Look at that timescale again. Six weeks.

And let’s not forget the numbers shown above include either nothing at all or very little from platforms providing encrypted messaging services. This includes the whole of Apple, who have described themselves as “the greatest platform for distributing child porn”.

Everything is moving in one direction

The numbers are going in only one direction. If the use of encrypted messaging apps grows in the years ahead, the volume of csam circulating on the internet is likely to increase commensurately, for the rather obvious reason that more malefactors will feel confident they cannot be discovered.

This explains why we need complete clarity about what obligations to protect children fall on the shoulders of the providers of all kinds of messaging services, including the providers of encrypted messaging services. Ditto in respect of the transparency regimes which will reassure us all that nobody is pulling a fast one on anybody else.