A problem of trust. Not tech. Part 3

In my last blog, I pointed out that companies have significant financial incentives to switch to or use E2EE systems.

I also reminded readers that, while the message itself might be strongly encrypted, the surrounding metadata will typically be in the clear.
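To make that concrete, here is a minimal sketch in Python of what an E2EE message envelope can look like from the service provider's side. It is purely illustrative: the field names, the (reserved, fictional) phone numbers and the use of the Fernet cipher are my own assumptions, not any real messenger's wire format.

```python
# A minimal sketch (not any real messenger's wire format) of why E2EE
# protects content but not metadata. Requires the third-party
# 'cryptography' package (pip install cryptography).
import json
import time
from cryptography.fernet import Fernet

# The key lives only on the two endpoints; the server never holds it.
shared_key = Fernet.generate_key()
endpoint = Fernet(shared_key)

plaintext = b"See you at the clinic at 2pm"

# What the service provider actually relays, and can log:
envelope = {
    "sender": "+44 7700 900123",      # hypothetical number
    "recipient": "+44 7700 900456",   # hypothetical number
    "timestamp": time.time(),         # when the message was sent
    "ciphertext": endpoint.encrypt(plaintext).decode(),  # opaque to the server
}

# The provider (or anyone who obtains its logs) can read who talked to whom,
# when, and how much -- everything except the content itself.
print(json.dumps({k: v for k, v in envelope.items() if k != "ciphertext"}, indent=2))
```

Nothing in that envelope needs the key: who, when, how often and how much all stay visible, which is exactly the point the two NSA veterans quoted below were making.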

If the service provider can collect, analyse and exploit the metadata to make more money out of you, other interests can do the same for other purposes. They can do so without the knowledge or permission of the service provider, or you. Obviously.

Former NSA General Counsel Stewart Baker said:

“Metadata absolutely tells you everything about somebody’s life. If you have enough metadata you don’t really need content.”

I am sure that’s right to a considerable extent, even though, in particular cases, access to the content might be essential to get a warrant or bring a prosecution. However, if you know “everything about somebody’s life”, it might not be too difficult to use such intelligence to find another, legitimate route to the same ends.

Metadata can get you killed or outed

In 2014, General Michael Hayden, former Head of the NSA, said:

“The US Government kills people based on metadata”

Obviously he was not referring to people being killed on US soil, but…

In an article published last year, the Electronic Frontier Foundation pointed out that, by analysing metadata, a hypothetical “they” could

“… know you rang a phone sex service at 2:24 am and spoke for 18 minutes. They know that you called a suicide prevention hotline from the Golden Gate Bridge.

They know you spoke with an HIV testing service, then your doctor, then your health insurance company in the same hour.

They know you called a gynaecologist, spoke for a half hour, and then called the local Planned Parenthood’s number later that day. But nobody knows what you spoke about.”

Careless words cost lives

The way E2EE in general and strong encryption in particular are presented – as an ultimate and complete guarantee of privacy – is positively misleading, and in parts of the world it could put you in extreme danger. So please, dial down the rhetoric.

And the next time someone mentions the tragedy of Ukraine, ask them to explain exactly how E2EE could have helped prevent the invasion or mitigate its worst effects.

The privacy E2EE provides is really only partial or incremental. And soon it may not even be that. See my next blog.

Inspecting content in E2EE environments

Here is an extract from WhatsApp’s FAQs:

“WhatsApp automatically performs checks to determine if a link is suspicious. To protect your privacy, these checks take place entirely on your device, and because of end-to-end encryption, WhatsApp can’t see the content of your messages.”

Please note the bit I put in bold: the checks take place entirely on your device.

This sounds remarkably similar to Apple’s solution, and I like it.
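For what it is worth, a purely on-device link check of the kind the FAQ describes can be very simple. The sketch below is my own illustration in Python, assuming a handful of local heuristics are applied to the already-decrypted text on the handset; WhatsApp has not published its actual rules, so the function name, the heuristics and the SUSPICIOUS_TLDS list are invented for the example.

```python
# A minimal sketch of an on-device "suspicious link" check. The heuristics
# and names here are my own illustration, not WhatsApp's actual logic.
import re
from urllib.parse import urlparse

SUSPICIOUS_TLDS = {".zip", ".xyz", ".top"}  # assumed example list

def looks_suspicious(url: str) -> bool:
    """Run simple local heuristics; nothing leaves the device."""
    host = urlparse(url).hostname or ""
    if re.search(r"[^\x00-\x7f]", host):    # non-ASCII / homoglyph hostnames
        return True
    if any(host.endswith(tld) for tld in SUSPICIOUS_TLDS):
        return True
    if host.count(".") > 4:                 # unusually deep subdomain chains
        return True
    return False

# The decrypted message text only ever exists on the handset, so the check
# can run there without the provider seeing the content.
for link in ["https://example.com/offer",
             "https://login.bank.secure.verify.example.xyz"]:
    print(link, "->", "warn user" if looks_suspicious(link) else "ok")
```

The design point is that everything above operates on data the device already holds in the clear, so flagging a link never requires sending the message back to the company.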

And of course, if you allow companies to contact you via WhatsApp Business, once the messages or other data are stored at their end you have no real way of knowing what becomes of the data or the messages they have harvested.

My next blog will be the last in this series. I think.
