A big “thank you” to Facebook

The essence of Facebook’s argument for changing Facebook Messenger and Instagram Direct Messaging from unencrypted to encrypted environments is that “all the other major messaging Apps are already encrypted or are going to be, so if we don’t do this it will hurt our business”.

In the various presentations I have heard from Facebook personnel, a more convincing or elevated motive has not yet surfaced. There was some discussion about changing patterns of messaging, a trend towards smaller groups and so on, but really it was pretty clear that while Facebook has usually been ahead of the curve, this time they were behind it. Encryption is now the fashion. They are going with it. Money beckons. It always does.

A much relished additional benefit of the company announcing its intention to encrypt is that, if Governments and various interest groups fight them over it, Facebook gets a unique opportunity to present itself as a champion of privacy. The chutzpah, the irony, takes your breath away. But let that pass. What counts is now, not then.

Actually, I think we kind of owe Facebook a big “thank you”. They revealed the scale of bad or illegal behaviour in messaging Apps. In 2018 Facebook Messenger made 12 million reports to NCMEC, the USA’s official body for receiving details of online child sex abuse material (CSAM).

In the same period how many reports were received from iMessage, the principal messaging App used on the Apple platform? 8.  That is not 8 million. That is 8 as in the single digit representing two fewer than 10. What is the difference between Facebook Messenger and iMessage? The latter is already encrypted.

Isn’t the real and obvious question therefore “If Facebook offers us a glimpse of the potential scale of offending in an unencrypted messaging environment what might be happening in the encrypted ones?”

No one knows.

I am going to write another blog on this (soon). It will be slightly more discursive (that’s code for “longer”) but in the meantime I think we need to shift the focus away from what one company (Facebook) is doing, to what encryption as a whole is doing or threatens to do to the modern world.

I mean we now know with complete certainty that techno wizards have an endless capacity for two things: making gigantic sums of money and getting things wrong.

Isn’t it time for citizens and our elected representatives to step up and say “Hold on guys. You are about to take another misstep. This time we can see it before you. We don’t want to wait for the apology or the promise to ‘try harder to get it right next time’. Let’s slow things down a little. Take a breath.”

My instinct is to say that every service deploying strong encryption must also be required to maintain a means by which, with proper authority, e.g. a court order, the contents and metadata associated with any particular message can be made available in plain text to the court or another appropriate agency. And enough of the talk of “back doors”. No one I know wants them. Proper and transparent authority is what matters.

Moreover, encryption comes in many forms and has many uses, most of them wholly benign. No way should anyone express blanket opposition to all forms of encryption everywhere and always. But in the realm of mass messaging services open to the public we need to insist that companies explore, for example, the possibility of deploying tools which can scrutinise a message or its content before it is encrypted. If alarm bells ring, appropriate action can be taken. Here I am thinking in particular about CSAM, but there could be other material of equal concern.
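For readers who want to see the shape of the idea, the kind of pre-encryption check described above can be illustrated in miniature. This is a hypothetical sketch, not any company’s actual system: the blocklist, the sample content and the function name are all invented for illustration, and real deployments (Microsoft’s PhotoDNA, for example) match perceptual hashes of images, so that trivially altered copies are still caught, rather than the exact cryptographic hash used here to keep the sketch self-contained.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal material.
# In a real system this would be a database of perceptual hashes
# supplied by a body such as NCMEC; the single made-up entry here
# exists only so the sketch runs.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-known-bad-content").hexdigest(),
}

def screen_before_encrypt(payload: bytes) -> bool:
    """Return True if the payload may be encrypted and sent onwards,
    or False if it matches the blocklist and should instead trigger
    a report -- the 'alarm bell' in the text."""
    digest = hashlib.sha256(payload).hexdigest()
    return digest not in KNOWN_BAD_HASHES
```

The point the sketch makes is one of ordering: the check runs on the plain-text payload before encryption takes place, so nothing about the strength of the encrypted channel itself needs to be weakened.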

I understand there are flavours of strong encryption, and ways of managing strong encryption, which lend themselves more easily to the possibility of “peering in” to the encrypted tunnel to detect criminal behaviour. If that is true, why would anyone want to use a flavour or a method that makes that impossible or appreciably harder?

Industry and Governments have created the climate or conditions that are fuelling the demand for encryption. We must not allow that climate to threaten the rule of law and neither should we allow it to put children in danger.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised. http://johncarrcv.blogspot.com
This entry was posted in Child abuse images, Privacy, Regulation, Self-regulation.