Companies behaving like states

Between them, tech companies and governments created the crisis of confidence that is said to be stimulating demand for greater online privacy protection. Yet they are now lining up on opposite sides of the argument about how to address that crisis.

Certain businesses seem to think the answer to their own past failings is to deploy forms of strong encryption that not only shield content from their own eyes but also make it impossible for it to be seen by others with a legitimate interest, e.g. those concerned with the administration of justice.

No human rights document or charter has ever said privacy is an absolute right, so why do some companies seem intent on making it one? It is hard to think of a clearer example of companies behaving like states. In this case, bad states.

We have to find ways of deploying encryption that allow the legal system to carry on pretty much as before. Justice delayed is said to be justice denied. In this instance justice is being adjourned sine die.

Putting children at risk

This is far from being merely a theoretical challenge. We have accepted, even rejoiced in, tools such as PhotoDNA. Released in 2009, it has allowed businesses to spot child sex abuse material, delete it from their systems and report the distributors.

PhotoDNA was the first. Along with similar tools developed by other companies, it represented a huge step forward in online child protection. However, these tools have been served with a redundancy notice by the spread of strong encryption into major messaging and cloud storage platforms, probably the two areas where they are most needed.
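PhotoDNA's exact algorithm is proprietary, but the general pattern these tools follow — compare a fingerprint of each uploaded image against a list of fingerprints of known abuse material — can be sketched. The version below uses an ordinary cryptographic hash purely for illustration; a real perceptual hash like PhotoDNA is designed to survive resizing and re-encoding, which SHA-256 is not, and all names here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; SHA-256 only matches exact copies.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_upload(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Return True if the upload matches known material and so should be
    deleted and reported."""
    return fingerprint(image_bytes) in known_hashes
```

The redundancy notice mentioned above follows directly: if a platform only ever sees ciphertext, there is nothing meaningful for `scan_upload` to fingerprint.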

Limits of Artificial Intelligence

People speak about the potential benefits AI will soon deliver in many areas, for example in detecting grooming. Those benefits will be reduced to nought within encrypted environments.

Obviously that is not a reason not to proceed with AI, because an awful lot of activity will continue to take place in environments that are not encrypted. However, we need to be clear that AI is not going to be any kind of panacea, particularly in respect of reducing the kind of serious criminal activity commonly found in encrypted spaces.

Displacement and encouragement

Actually, isn’t it likely that as unbreakable strong encryption becomes more widespread, and probably easier to use, criminal behaviour currently taking place in unencrypted environments will shift into encrypted ones? Encouraged by that impregnability, isn’t it likely to grow? Maybe the arrival of quantum computers will mean there will cease to be such a thing as an unbreakable form of encryption, but right now nobody can rely on that or predict what else might follow. Speaking of quantum computing….

Five bits

The CEO of Post-Quantum has suggested a solution to the encryption conundrum. He thinks companies and organizations that deploy strong encryption should split the relevant decryption keys into five parts. These would be distributed to five trusted bodies, one of which could be the company or organization itself.

Only on production of proper legal authority would the key holders be able to co-operate and provide law enforcement with the means to decrypt or open specific devices, messages or stored content linked to identified entities.
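The details of the proposed scheme are not public, and a real deployment would presumably use threshold cryptography, where any k of the n shares suffice. But the core idea — splitting a key so that no single holder can decrypt anything on their own — can be shown with a minimal XOR split into five shares, all of which are required to reconstruct the key. This is an illustrative sketch, not the actual Post-Quantum design.

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int = 5) -> list[bytes]:
    """Split key into n shares; every share is needed to reconstruct it."""
    # n-1 shares are pure randomness; the final share is the key XORed
    # with all of them, so the randomness cancels only when all n combine.
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    final = reduce(xor_bytes, shares, key)
    return shares + [final]

def recover_key(shares: list[bytes]) -> bytes:
    return reduce(xor_bytes, shares)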

No back doors, no mass or indiscriminate surveillance. It sounds like an avenue worth exploring, although a person who works inside a company that provides encryption services didn’t quite see what would be gained by involving so many players, i.e. five instead of one. If a court gives an order, isn’t that enough? Good question. I only mention it to illustrate that people are at least trying to find answers.

Do you trust democracy?

Ultimately, whatever technical fixes emerge, for those who are anxious about privacy it will come down to whether or not they trust the political and other institutions that make the laws, appoint the judges and supervise the enforcement of the law in their country.

When you look around the world it is not hard to work out why many people don’t have such trust and therefore why they cling on to what they think strong encryption can do for them. However, that is a problem under a different and larger heading.  It requires a political fix not a technical one. Puleeze don’t try to tell me strong encryption will help deliver that political fix. Truly oppressive states have so many other real world tools at their disposal.

But suppose you do have faith in your own judiciary, your own political institutions and your own means of providing oversight for the justice system and the security services?

Can we still do nothing until every Government in the world is committed to liberal democracy? Must I tell a mother in Portsmouth that until the guys in Pyongyang take out a subscription to The Guardian we cannot use certain tools to protect her child from a sexual predator, or even try to invent new and better tools that could? That is the worst kind of blind utopianism.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com