Last Wednesday the USA’s National Center for Missing and Exploited Children (NCMEC) published its numbers for 2020. The 16.9 million reports received in 2019 grew to 21.7 million in 2020, an increase of more than 28%. Messaging platforms remain the largest source.
21.4 million of the 2020 reports came directly from online businesses themselves, the balance from members of the public. The latter represents a threefold increase on 2019. Strikingly, there was a year-on-year increase of almost 100% in reports of online enticement. A consequence of large-scale lockdowns around the world? Probably.
The 21.7 million reports, among other things, contained 31,654,163 video files and 33,690,561 files containing still pictures. A single report can reference more than one item.
Thus, within the total number of reports there is an overwhelming focus on illegal images of one kind or another, but the 120,590 “other files” shown in NCMEC’s chart also represent serious threats to children.
With 2,725,518 reports, India once again heads the country list. The Philippines, Pakistan and Algeria come next, a long way behind but still all above the 1 million mark.
Good news or bad news?
People opposed to proactive scanning for child sex abuse on messaging platforms sometimes point to these numbers and say that, because they are always going up, this proves scanning is not a useful deterrent. Some even say we should call the policy “a fail”.
Because criminals steadfastly refuse to complete annual returns faithfully declaring what they did last year while outlining their plans for the coming 12 months, we have never known, and can never know, just how much csam is, has been or is likely to be out there, or how many attempts have been or will be made to engage children online in a sexually abusive way. NCMEC’s new numbers could therefore simply be telling us we are getting better at detection. What they definitely do not do is provide a mandate to abandon this area of crime-fighting, deserting the victims, declaring victory for child abusers and conceding that the online space is unmanageable.
The tools we have at our disposal today are just better than they used to be and are being more widely and energetically deployed. And of course there are more internet users this year than there were last year. There is bound to be a part of the increase which is attributable solely to this sort of organic growth. It can be expected to continue for some time as the availability of wifi and broadband expands and more and more of the world goes online.
In any and every area of crime, detecting and addressing criminal behaviour after the event is, or ought always to be, only one part of a larger strategy in which prevention through education and awareness-raising is always to be preferred. But the idea that you should refuse to try to mitigate the effects of criminal behaviour wherever and whenever you can, particularly where children are concerned, is both heartless and an insult to the victims. Actions speak louder than words and no action speaks louder still.
Meanwhile in the EU
The previous week NCMEC published statistics showing reports received from EU Member States were down by 51% since December 2020, when the European Electronic Communications Code took effect.
Set against an overall global rise, the fear must therefore be that a percentage fall in reports from EU Member States means European kids may be faring even worse than children in other parts of the world. Commissioner Johansson pointed out that, in the EU, 663 reports per day are not being made that otherwise would have been. That would be true if the level of reporting had remained constant. Evidently it has not. The real number of missing reports is probably north of 663.
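By way of a rough, back-of-envelope illustration, and it is only that: if we assume the 663 figure simply represents the 51% fall applied to a constant pre-Code daily average, and assume EU reporting would otherwise have tracked the roughly 28% global rise, the sums come out like this.

    # Back-of-envelope sketch using the figures above; the assumptions are mine,
    # not NCMEC's or the Commission's.
    daily_shortfall_quoted = 663        # Commissioner Johansson's figure
    fall_fraction = 0.51                # 51% drop since the Code took effect
    global_growth = 0.28                # approximate global rise, 2019 to 2020

    baseline = daily_shortfall_quoted / fall_fraction  # implied pre-Code level: ~1,300 reports/day
    expected = baseline * (1 + global_growth)          # what the EU might have produced in 2020
    actual = baseline * (1 - fall_fraction)            # what it is actually producing

    print(round(expected - actual))                    # ~1,027 missing reports/day, well north of 663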
And still the European Parliament paralyses the process of reform.
Facebook on manoeuvres
Let us recall that last December, when the new European Electronic Communications Code kicked in, Facebook, a notoriously litigious, combative company, decided to break ranks with industry leaders by stopping scanning for child sex abuse. Facebook could have fought the Code or, like their industry colleagues, ignored it. They did neither.
Cynics have suggested the company’s decision to roll over like an obedient puppy dog was inspired by a desire to pave the way for their long-declared ambition to introduce strong encryption to Messenger and Instagram Direct. If there is no legal way to scan messaging platforms, whether or not the platforms are encrypted almost ceases to matter.
Facebook’s December decision certainly appeared to legitimise opposition from groups who have always been against scanning for content and behaviour that threatens children.
The effrontery of the most privacy-abusing business in the history of the Planet Earth performing a complete volte-face, and doing so at the expense of children and law-abiding citizens generally, takes your breath away. No warm words can wash that away.
Hold that thought for a moment.
A matter of timing?
Facebook has recently conducted research into child sex abuse activities on their platforms. The results have just been published in a blog.
There were two separate studies. Both raise doubts about, or question, the value of proactive scanning to protect children.
This is a radical break with Facebook’s past. They used to declare, proudly and repeatedly, their commitment to proactive scanning for content and activity which threatens children. In fact, to their credit, they have continued scanning for signs of people likely to engage in self-harm and suicide, although quite how they square that with what they are doing in relation to child sex abuse momentarily eludes me.
Who could be against research? Not me. But the same cynics I referred to earlier were not slow to point out that the timing of the release of this research does make one wonder if it was done with the purest of motives (see below). Did the people who actually did the work, or who decided when to publish, pause to wonder if they were being manipulated?
A surprise
The first of the two studies found that, in October and November of 2020, 90% of all the content found on their platforms and reported to NCMEC was identical or very similar to previously reported material.
I’m guessing those of us who have worked in the field for a long time might be surprised it was as low as 90%. I had always understood the percentage of repeats would be in the very high 90s. High percentages show the proactive tools are doing their job. This is why their continued use is so important, particularly to the victims depicted in the images. The fact that an image is repeated only underlines and magnifies the harm being done to the child. Most certainly it does not diminish it.
In asserting their legal right to privacy and human dignity, victims want every instance of the image gone, no matter how many times or where it appears.
Publishing a number like “over 90%” without explaining this kind of context is likely to lead an ill-informed observer, e.g. someone in a hurry with lots of papers to read, to wonder what all the fuss is about.
Note that in NCMEC’s report they refer to having received reports of 10.4 million unique images, specifically distinguishing them from the repeats which, we are asked to believe, make up 90% of the payload in Facebook’s research.
More potentially misleading impressions
In the same blog, and referring to the same study, Facebook goes on to tell us “only six” videos were responsible for “more than half” of all the reports they made to NCMEC. Apart from being left to speculate about how many videos made up the other half, the obvious question is “And your point?”
My guess is what will stick in busy people’s minds is “six”. Six and 90%. Headline numbers. Watch out for them being repeated by, well, you know who.
The second study
Taking a different timeframe (why?), July-August 2020 and January 2021, and a different, much smaller cohort (only 150 accounts), we are told that, of the people who uploaded csam that was reported to NCMEC, 75% did so without apparent “malicious intent”. On the contrary, the research suggests the individuals committing the crime of uploading csam acted out of a “sense of outrage” or because they thought it was funny. “75%”. That’s another headline number that will stick and be repeated.
Maybe there is a paper somewhere which explains how Facebook concluded there was no “malicious intent”. I cannot find it, but it is not hard to work out the net effect of Facebook’s various self-serving, well-timed manoeuvres.
The target audience is politicians and journalists
Facebook wants people – and by that I mean principally politicians and journalists – in Europe, the USA and elsewhere to start thinking the problem of online child sex abuse is different from, and a lot smaller than, what they might previously have believed, and that it is substantially down to (excusable?) human idiocy.
Yet the unalterable truth is the images need to be gone. That’s the beginning and end of it. If we have the means to get rid of illegal images of children’s pain and humiliation, why wouldn’t we? Why would we, instead, deliberately hide them? Money is the only answer I can come up with and it is not good enough.
Poor substitutes
In the third part of the same blog Facebook tells us about other things it plans to do to address people’s apparent lack of good taste in jokes or their stupidity.
So far they have come up with two pop-ups. Bravo. Facebook should roll them out anyway, but neither gets anywhere close to compensating for their plans on encryption. In any other walk of life, if a group of people combined to hide evidence of crimes, my guess is they would be arrested and charged with conspiracy to obstruct the course of justice.
Facebook’s numbers in 2020
The results of Facebook’s research came out in the middle of the row in the EU and right up against the publication of NCMEC’s new numbers.
In 2019 NCMEC received 16,836,694 reports, of which 15,884,511 (94%) came from Facebook-owned platforms, principally Messenger and Instagram. In 2020, of the 21.7 million, 20,307,216 (93%) came from the same places.
Although I am extremely critical of Facebook, we should not forget two important qualifiers: they are by far the biggest platform in the social media space, and we know as much as we do about them only because the data are available. That is because Messenger and Instagram Direct are not (yet) encrypted.
You therefore have to wonder what is happening on other messaging platforms that are already encrypting their services and so can produce almost no data. Actually, we need not wonder all that much.
A glimpse behind an encrypted door
Last Friday The Times revealed that in 2020 UK policing received 24,000 tip-offs from Facebook, meaning mainly Messenger and Instagram, but only 308 from WhatsApp, which is already encrypted.
With 44.8 million users, the UK has the third-highest number of Facebook customers in the world, behind India and the USA. All of the 44.8 million will have Messenger because it is integrated into Facebook, and on top of that Instagram has 24 million UK users. Obviously there is likely to be a large overlap between Messenger and Instagram. WhatsApp has 27.6 million users in the UK.
It’s impossible to say what the WhatsApp number “should have been” – too many imponderables – but the ratio of 308:24,000 looks a little off. If anything you would expect the traffic in illegal images to be greater on WhatsApp precisely because it is already encrypted. Think about that.
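For a crude sense of scale only, and ignoring the Messenger/Instagram overlap noted above (a big simplification of my own), the per-user rates implied by those figures look something like this.

    # Crude reports-per-million-users comparison built only from the figures above.
    # It treats the 44.8 million Facebook/Messenger users as the unencrypted pool and
    # ignores the overlap with Instagram, so it is illustrative, not rigorous.
    facebook_tips, facebook_users = 24_000, 44_800_000
    whatsapp_tips, whatsapp_users = 308, 27_600_000

    rate_facebook = facebook_tips / facebook_users * 1_000_000  # ~536 tip-offs per million users
    rate_whatsapp = whatsapp_tips / whatsapp_users * 1_000_000  # ~11 tip-offs per million users

    print(round(rate_facebook / rate_whatsapp))                 # roughly a 48-fold difference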