Good but nowhere near good enough

Last Thursday Facebook made an announcement about its plans for Messenger. In truth the substance of the announcement concerned great stuff I thought they were already doing, at least on their main Facebook platform, so discovering they were now going to do the same on Messenger was a little underwhelming.

But first, if you click on the link to the announcement you will see it is headed

“Preventing Unwanted Contacts and Scams in Messenger”. 

So we’re clear, this is not about the content of messages in Messenger, at least not insofar as it relates to known illegal images of child sexual abuse, the kind that have previously been picked up by PhotoDNA.
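For readers unfamiliar with how systems like PhotoDNA work: the platform computes a fingerprint ("hash") of each uploaded image and checks it against a database of hashes of known illegal material. The sketch below illustrates the general idea only; the hash values and function names are hypothetical, and real PhotoDNA uses a proprietary perceptual hash that survives resizing and re-encoding, whereas the cryptographic hash used here matches exact bytes only.

```python
import hashlib

# Hypothetical database of hashes of known illegal images (values are
# illustrative placeholders, not real hashes). PhotoDNA itself uses a
# perceptual hash robust to edits; SHA-256 here is a stand-in.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(is_known_image(b"known-bad-image-bytes"))  # True
print(is_known_image(b"some-other-image"))       # False
```

The key point for what follows: this kind of matching requires access to the content itself, which is exactly what end-to-end encryption removes.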

Then we see these important words

“As we move to end-to-end encryption, we are investing in privacy-preserving tools… to keep people safe without accessing message content.”

This bears out two things: it ain’t about content, illegal or otherwise, and they are going ahead with it.

Their mind is made up. They know exactly what they are doing and why they are doing it. The only Damascene moment they are likely to experience will have been the result of legislative action or the threat of it in a jurisdiction which is important to their business.

Don’t get me wrong.  As I suggested earlier, the measures they are proposing are welcome. However, even Alex Stamos, former Chief Security Officer for Facebook, could only bring himself to say it was a “good start”.  Maybe he is as underwhelmed as I am.

Analysing metadata

The software tools Facebook say they will be deploying in Messenger will analyse metadata to pick out dodgy patterns. Once detected, an alert will be triggered on the end user’s screen and maybe there will also be an intervention by the company itself.  But what these tools will not do is spot known illegal content being exchanged between users.
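Facebook has not published which metadata signals its tools will use, so the following is purely a hypothetical sketch of the kind of heuristic such a system might apply: an account that is very new, messages a large number of minors, and rarely gets replies looks like a pattern worth warning recipients about. Every field and threshold below is an assumption for illustration, not Facebook's actual method.

```python
from dataclasses import dataclass

@dataclass
class AccountMetadata:
    # Hypothetical signals; Facebook has not disclosed what it actually uses.
    account_age_days: int
    contacts_messaged_last_week: int
    replies_received_last_week: int
    contacts_under_18: int

def looks_suspicious(m: AccountMetadata) -> bool:
    """Illustrative heuristic: a new account blasting messages at many
    minors who rarely reply could trigger an on-screen warning."""
    if m.account_age_days >= 30:
        return False
    reply_rate = m.replies_received_last_week / max(m.contacts_messaged_last_week, 1)
    return m.contacts_under_18 > 10 and reply_rate < 0.1

print(looks_suspicious(AccountMetadata(5, 200, 3, 50)))   # True
print(looks_suspicious(AccountMetadata(400, 20, 15, 2)))  # False
```

Note that nothing in this sketch ever reads a message body: it operates entirely on who messaged whom, and when, which is what makes it compatible with end-to-end encryption — and also what makes it blind to known illegal images.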

Facebook must find a way to convince us

Facebook must find a way to convince independent and respected experts that its move to encrypt Messenger has not worsened the lives of children who have suffered the tragedy of being sexually abused while a camera was trained on them. If Facebook cannot do that I really don’t know where it is going to leave the company’s reputation. And I don’t mean just with people like me.

Finally, be aware, dear readers, some of the members of the Silicon Valley chapter of the Sons of Anarchy are working on ways to encrypt metadata. Can you see where this is going? Can you see the next pressure point?  When other messaging services announce they are encrypting users’ metadata, rendering it unreadable, how will Facebook react?

What this highlights is that Facebook is, in a sense, a victim of its own previous policy of transparency and of US laws on reporting. To that extent it is “unfair” to single the company out or pick on it. There is a larger and wider issue to be faced about how modern societies tackle the emergence of strong encryption. And we need to be emphatic about that: it is a societal issue, not a technological one that individuals or companies can decide for themselves.

If there has been a rise in the demand for encrypted services it has been caused by the previous bad behaviour of companies like, er, Facebook. Surveillance capitalism did not drop from the skies, but when it got here it was compounded by the bad behaviour of certain Governments, as Snowden reminds us. However, we cannot let the previous bad form of Governments and companies create an insoluble problem for the rest of us.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy, and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and the European Union Agency for Network and Information Security, and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com