There is something strange going on

I expect many of you will by now have seen the piece in the weekend edition of the New York Times. I will be writing more fully about it quite soon but, meanwhile, there was something that rather leapt off the page.

In the article we learn that in 2017 NCMEC received 18.4 million reports of child sexual abuse material. This was up from “only” 1 million in 2014.

18.4 million is one third of all the reports NCMEC has received since its CyberTipline was launched in 1998. Clearly this gargantuan and seemingly still-growing volume is attributable to the increased use of automated systems to identify and report CSAM. Incidentally, since you asked, the 18.4 million reports contained 45 million images and videos flagged as CSAM.

In the later blog post I mentioned, I will return to what these numbers tell us about the progress being made in combatting CSAM. It might not be all bad news. Key word: might.

What hit me between the eyes?

Here is what the article says:

Of the 18.4 million reports received by NCMEC, 12 million came from a single source: Facebook Messenger.

Almost two thirds (12 ÷ 18.4 ≈ 65%) from a single app.

Massively disproportionate

I know Facebook Messenger is popular, but its contribution to the total is massively disproportionate. What it suggests to me is not that there is an over-concentration of bad actors using Facebook Messenger. Instead:

Either

(a) Facebook is sweeping up and reporting to NCMEC everything identified as being contrary to its policies, so its reports include a great deal of material which may be against Facebook’s policies but is not actually illegal

Or

(b) A great many companies are not being as diligent or as serious as Facebook.

It could be a bit of both, but I suspect it is much more (b) than (a).

Not knowing the truth is unacceptable

The case for greater transparency is overwhelming. As long as online businesses believe their identities will be shielded from public view by a good guy in the middle, they will feel under no pressure, or less pressure, to up their game. We have known this for a while, but the New York Times article puts it beyond dispute.

NCMEC in the USA, the IWF in the UK and, internationally, INHOPE should voluntarily publish the fullest possible account of the sources of the illegal images they are processing.

And if they won’t do it voluntarily they should be compelled to do so by law.

There was a lot of other stuff in the Times article, but I hope you agree that will do for now.
