On encryption and child protection

A company is normally driven by a desire, and a legal obligation, to build “shareholder value”. In the case of one company, Facebook, Mark Zuckerberg owns a majority of the voting stock, so when looking at its big decisions we are not talking about a “company” in the way it is generally understood. We are talking about decisions taken by one person.

In recently leaked transcripts of an internal staff Q&A session, Zuckerberg acknowledged that he is still in post only because he owns a majority of the voting stock: “some of the things I have done would otherwise have got me fired several times over.”

This uncomfortable fact of ownership and control matters hugely at the moment because Facebook, meaning Zuckerberg, has announced an intention to introduce end-to-end encryption (e2e) for Facebook Messenger.

12 million reports in 2018

In 2018 Facebook Messenger’s automated systems identified, deleted and reported 12 million instances of child sex abuse-related activity or material. Any images of child sex abuse thus detected were typically gone within minutes or hours. Bravo.

Yet this will end if Zuckerberg persists with his plans.

In anticipation of introducing e2e on Messenger, Facebook said:

“We are working to improve our ability to identify and stop bad actors across our apps by detecting patterns of activity or through other means, even when we can’t see the content of the messages, and we will continue to invest in this work.”

Any alternative approach which maintains or improves on the status quo in terms of protecting children will be welcomed by everybody. Anything that changes the status quo in the wrong direction will not be. If Facebook cannot actually see the content, it is difficult to imagine how, for example, they will be able to spot illegal images. They will therefore not be able, as now, to delete or report them in fast time or prevent their further distribution. This will compound and expand the harm already done to the child in the image and limit the possibility of her or him being rapidly identified and located in real life.

The alternative being offered

As a way of ameliorating the freely acknowledged adverse impact of going encrypted, I understand, inter alia, Facebook is suggesting that where they find behaviour which suggests a connection with a bad actor the individual’s account will be closed down. Let’s not dwell on the obvious implications of this. They are not the main point.

In addition, it seems Facebook will hand the person’s metadata over to law enforcement. 12 million reports? A deluge of data will be added to the pre-existing deluge.

Has Zuckerberg had an irony bypass?

The irony of Facebook seeking to position itself as a champion of privacy will not be lost on those who have documented its persistent failures in that field. But already Zuckerberg’s strategy is paying dividends. Just look at the long list of free speech and similar organizations that have signed a letter praising Facebook’s decision and urging them on.

Few people will believe Zuckerberg’s Damascene moment was prompted by anything other than a calculation about the future profitability of the good ship Facebook. Here’s my analogy. None of the porn companies and online gambling outfits active in the UK market wanted to introduce age verification until everyone did. They didn’t want less fastidious competitors to eat their lunch.

Similarly here, Zuckerberg has seen the likelihood of e2e services growing in importance, so he has to find a way to move his major messaging services (Instagram gets caught up in this as well) into that space as quickly as possible.

If there is a sustained, public fight to bring that about, so much the better. The company once seen as the enemy of privacy will be able to burnish its reputation as a champion of it. Brilliant. But wrong. Wrong in principle but also wrong because it is too short-sighted.

Zuckerberg’s potential or actual motives, in truth, are irrelevant. What matters is the idea itself. It is a bad one that will not survive although it may not disappear quickly. Why? Because Facebook’s decision will prompt the US Congress to start off on a path which ultimately will lead to new, bi-partisan Supreme Court-friendly laws limiting what US-based entities can do with encryption, at least on mass messaging services. But before getting to that point Facebook and other businesses could find their devices and services banned in many different countries. Not all of these will be totalitarian dictatorships.

“Back doors” are a bad idea

And here is the point: nobody I know wants or supports the creation of “back doors” into encrypted services. That implies the police, security services or others could covertly access a person’s account without proper authorisation, be that a warrant or a court order. Such an approach is completely beyond the pale. But right now courts are issuing orders and they have no effect. Subpoenas and warrants are ignored or are not capable of being acted upon. That is not right. It is a trend that must be halted and reversed.

It was these sorts of concerns that were behind the US Government’s decision to call a conference yesterday under the title Lawless Spaces: Warrant-Proof encryption and its impact on child exploitation cases. Senior Ministers from the UK and Australia attended. I cannot recall any event like it devoted to the protection of children online.

Cynics say the US and other governments are showing fake concern about children when what they are really about is an undeclared intention to get to a position where they can spy on any and all of us in the online world as easily as they can in the physical world. Even if that were true it would not obviate the need to address the point about harming children, unless you are willing to accept that children are collateral damage, a sacrifice to be made on the altar of a different cause.

Companies or organizations providing encrypted services must be required to maintain the means whereby, on production of a properly authorised warrant or court order, they can produce a clear version of every piece of content they helped transmit. The businesses don’t have to hand over the decryption keys to anyone. They can do it all themselves.
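To make the model being proposed concrete: what is described above is often called key escrow, where the provider itself retains the ability to decrypt, and only exercises it under a valid warrant. The toy Python sketch below (my illustration, not anything Facebook has proposed in code, and deliberately not real cryptography) shows the shape of that arrangement: users exchange encrypted messages, the provider keeps its own copy of each conversation key, and decryption is gated on a properly authorised order.

```python
# Toy sketch of a provider-held key escrow model (illustrative only; a real
# system would use vetted ciphers and key management, not XOR).
import secrets


def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR; stands in for a real cipher here."""
    return bytes(b ^ k for b, k in zip(data, key))


class EscrowProvider:
    """Provider that retains conversation keys and decrypts only on a warrant."""

    def __init__(self):
        self._escrow = {}  # conversation id -> retained key

    def new_conversation(self, conv_id: str, key_len: int) -> bytes:
        key = secrets.token_bytes(key_len)
        self._escrow[conv_id] = key  # provider keeps its own copy, hands it to no one
        return key  # the users encrypt and decrypt with this

    def decrypt_under_warrant(self, conv_id: str, ciphertext: bytes,
                              warrant_valid: bool) -> bytes:
        if not warrant_valid:
            raise PermissionError("no properly authorised warrant or court order")
        return xor_bytes(ciphertext, self._escrow[conv_id])


provider = EscrowProvider()
key = provider.new_conversation("alice-bob", 5)
ciphertext = xor_bytes(b"hello", key)  # what travels between the two users
plain = provider.decrypt_under_warrant("alice-bob", ciphertext, warrant_valid=True)
```

The point of the sketch is only that the capability and the authorisation check both live with the provider: no key ever leaves its hands, which is the “they can do it all themselves” arrangement argued for above.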

Private organizations and the public interest

It is completely unacceptable for companies, or indeed any other types of private organizations, to decide that it is ok to create spaces which are completely beyond the reach of the law. Society is entitled to take a view on the balance to be struck between the public interest and the private interest. Facebook isn’t.

We should not base every decision we make about the internet solely on whether it helps or hinders paedophiles or puts children at risk, without regard to any other factors. But equally I completely reject the idea that the protection of whistleblowers, political dissidents and the like trumps any and all other considerations. It’s back to that question of balance and who decides how and where to strike it.

The pendulum has swung too far. It is time for a correction.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is, or has been, an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised. http://johncarrcv.blogspot.com