A further note on encryption

In my earlier blog post on encryption I referred to a statement by Mark Zuckerberg about his intention to make Messenger and Instagram encrypted services. WhatsApp already is one.

Just in case some of you missed the significance of this decision, we should not forget that last year Facebook itself acknowledged that, in a single three-month period, it removed 8.7 million “child nudity” images, and that 99% of these (8.6 million) were detected and taken down before any Facebook user had reported them. In other words, they were found because the company was using automated tools; PhotoDNA was one such tool. Few if any of the removed images will have involved WhatsApp. The vast bulk will have come from Messenger and Instagram.
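To see why encryption matters here, it helps to understand the shape of the detection Facebook currently does. PhotoDNA itself is a proprietary perceptual hash, so the sketch below is only illustrative: it uses an ordinary cryptographic hash as a stand-in, and all the names and fingerprints in it are hypothetical. The pattern, though, is the real one: the platform fingerprints each uploaded image and checks it against a list of fingerprints of previously identified illegal images, before any user report is needed. This only works because the server can see the image; with end-to-end encryption it cannot.

```python
import hashlib

# Illustrative sketch only. PhotoDNA is a proprietary perceptual hash
# (robust to resizing, recompression, etc.); a plain SHA-256 is used
# here purely to demonstrate the server-side matching pattern.

# Hypothetical fingerprints of previously identified images.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-1").hexdigest(),
    hashlib.sha256(b"known-bad-image-2").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image (SHA-256 stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """True if the upload matches a known bad image's fingerprint."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```

The key point is that `should_block` runs on the plaintext image at the server. Once messages are end-to-end encrypted, the server only ever sees ciphertext, so there is nothing to fingerprint and this check becomes impossible.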

Thus the decision the company has made about encryption means it is planning to deprive itself of the ability to carry on doing what it has done in the past to protect children. In other words, without more, most if not all of those 8.6 million “child nudity” images would likely have stayed up and continued to circulate. That is tantamount to willfully abandoning the policy.

We don’t actually know (because Facebook won’t tell us) how many of the 8.7 million images were CSAM as opposed to “simple” nude pictures, which are nevertheless banned by Facebook. However, in an interview with Reuters in October last year, Michelle DeLaune of NCMEC said they expected to receive 16 million CSAM reports from Facebook and other tech companies, up from 10 million the previous year.

Assuming those projections and past numbers are correct, no one I know is in any doubt that the lion’s share will come from Facebook’s currently unencrypted apps. The magnitude of the decision Facebook is on the edge of implementing could therefore not be clearer. In Zuckerberg’s statement the company says it will look for other ways of dealing with “bad things”, but nothing can beat what they do now, so why stop doing it?

I wonder whether this proposal has been discussed by Facebook’s Global Safety Advisory Board and, if so, what comments they made about it. I know paper tigers rarely roar, but it would be good to know if they even whimpered or raised an eyebrow.

My message to Facebook is crystal clear. I get why, after the hammering you have received over your past privacy failings, you might want to create a strong impression that you are now in a completely different place. Maybe you looked at the “bad publicity” Apple got for refusing to co-operate with the FBI in the terrorist case and thought a little of that would be fine if it helped establish a new idea of Facebook in the minds of the internet-using public.

But don’t make children pay the price. By all means proceed with encrypting Messenger and Instagram, but do not use a form of encryption which makes it impossible to detect CSAM. And while you are about it, put WhatsApp on the same footing.


About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com