Further thoughts on the encryption debate 1

Wherever legislation intended to make the internet a better place for children is being promoted, it will be extremely important for child protection advocates never to give the impression "we are against encryption". Why? Because we aren't, or we shouldn't be. Anyway, I'm not. Whenever anyone suggests otherwise, don't let it pass unremarked.

There is no generalised problem with encryption

Encryption is a many-splendoured thing. It is a feature of, and essential to, a wide range of applications and uses. The specific problem I have with encryption is not a generalised one. It is the way it has become a significant enabler of crimes against children through being integrated into mass messaging Apps and platforms which are being made indiscriminately available. This has only become apparent in recent years because we have finally been able to see the numbers. The evidence is indisputable. It is not a hypothetical.

In the beginning

When Phil Zimmermann first released PGP so that anyone anywhere in the world could download and use it, yes, it raised eyebrows. But for a long while strong encryption associated with messaging in the consumer space was irrelevant.

Being clunky and time-consuming to use, PGP wallowed in the margins of the nerdhood. Outside of B2B and niche environments its only users were a comparatively small community of geeks and survivalists. They didn't want to take any chances that "they"/the Man/the CIA/the FBI etc. might discover arrangements were being made for the local chapter to meet up at the Mall tomorrow at 3.00 p.m. to buy more long-life milk.

As with so many things associated with the internet, however, practices which might previously have been tolerated or manageable have been completely and utterly changed by the scale, speed and technical and jurisdictional complexity which are all part and parcel of the TCP/IP package. An unintended consequence.

Encrypted messaging Apps are putting children in danger

A senior Apple executive acknowledged that, on a substantial scale, the company's messaging capabilities were being used to exchange child sex abuse material, yet Apple found and reported almost none of it. Why? Because their messaging Apps are encrypted.

Apple – an acknowledged global leader in privacy – came up with an elegant, privacy-respecting way of identifying, deleting and reporting known child sex abuse material but then, under pressure from the disbelieving, self-interested privacy lobby, instead of sticking to their guns and defending their proposal, Apple abandoned it. This was one of the most shocking corporate decisions taken by any Big Tech company in my lifetime.

Meta say they are going to encrypt what is currently unencrypted, knowing perfectly well what the consequences of their decision will be. In some ways this is even more shameful than Apple's volte-face. Why? Because here we have the data, so there can be no dispute or room for speculation about what will happen as a result. Meta is knowingly putting children in harm's way in a manner which presently it does not. They are cocking a snook at Governments and public opinion throughout the liberal democratic world. They cannot possibly believe there is any serious support among the wider public for what they are proposing to do.

Which makes you wonder what exactly is driving them on. In the case of Meta it's usually money. They think they can cut the otherwise ever-spiralling costs of moderation and, given their reckless profligacy on the Metaverse (witness the huge staff layoffs of late), cost-cutting in Menlo Park is obviously very much in favour.

Actually, Meta has already stopped looking for CSAM

Meta has already stopped looking for child sex abuse material linked to its main messaging Apps, but only in Ukraine and Russia. Where's the logic in that? War zones and refugee trails are notorious for providing opportunities for the sexual abuse of children. This is the moment, these are the circumstances, in which Meta chooses to stop looking for evidence of child sexual abuse? Words fail me. At least printable ones do.

I will refrain from making any observations about their even-handedness in treating Russia and Ukraine identically.

But privacy-respecting child protection tools exist

To remind ourselves: tools exist which, if deployed, will ONLY do one of three things: (i) see and address already known child sex abuse material, (ii) see and address images which are likely to feature child sex abuse, (iii) see and address behaviour which is likely to indicate a child is being groomed for an illegal sexual purpose.

These tools do not collect, store, process or in any meaningful way "see" anything else. In the case of the latter two, if someone checks it out and nothing illegal appears to be going on, it was a false alarm and that's it. An action motivated by individual suspicion comes to an end. Nothing happens. No record is made, so no record is or can be kept. No prosecution or investigation ensues. Nobody's name or reputation is trashed. Nobody's time is wasted. No harm. No foul. De minimis applies.
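To make concrete just how narrow the first of these tools can be, here is a minimal sketch in Python. The hash value and list name are hypothetical, standing in for a list of hashes of known material as supplied by a hotline. Real deployments, such as systems built on Microsoft's PhotoDNA, use perceptual hashes that survive resizing and re-encoding; the plain SHA-256 used here for simplicity only catches exact copies. This illustrates the principle, not any company's actual implementation.

```python
import hashlib

# Hypothetical list of hashes of already-known illegal images.
# In a real system these would be perceptual hashes supplied by
# an authority such as a hotline, not SHA-256 digests.
KNOWN_IMAGE_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True only if the image is on the known list.

    Nothing else is collected, stored or 'seen': a non-matching
    image produces no record at all, so there is nothing to keep.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IMAGE_HASHES

if __name__ == "__main__":
    # A non-match simply returns False and the matter ends there.
    print(matches_known_material(b"holiday photo bytes"))  # False
```

The point of the sketch is the shape of the function: it answers a single yes/no question about a single item and retains nothing when the answer is no.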

And if you don’t believe any of that or you worry others might not believe that – we live in a world of zero trust- the answer is transparency and auditability, not a refusal to engage in the first place.

For all these reasons the idea of the "duty of care", as in the UK, is so important. It says to tech companies: you can go ahead and use encryption anywhere you like, including in messaging apps, but don't do it in a way which puts children in danger. If you can show us that is how you have implemented it, we will leave you alone. If you can't, we will tell you how you can and compel you to act. You had your chance to solve the problem on a voluntary basis, but you blew it. The worm has turned. That's what happens in democracies.


About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to the Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now, or has formerly been, an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised. http://johncarrcv.blogspot.com
