Don’t be a child in Europe

Yesterday the European Data Protection Supervisor (EDPS) published an opinion on the European Commission’s proposal for a temporary suspension of parts of the e-Privacy Directive of 2002. It is a weak Opinion, riddled with errors. The good points the EDPS makes are completely overshadowed by the bad.

A rebuke

A major part of the Opinion is, in essence, a rebuke of the European Institutions for not doing things in precisely the right order, in exactly the right way, at the right time. The Opinion shows an abundance of bureaucratic correctness which entirely misses the human heart of the issues at stake, as well as important parts of the law.

Everywhere else, in every legal instrument I have ever read, including the GDPR, we are told children require special care and attention. Why? Because they are children. The EDPS affords them no such considerations. 

Article 24 of the Charter of Fundamental Rights

The EDPS makes no reference to the explicit language of the EU’s Charter of Fundamental Rights. Nothing. Not a word. As an aide-mémoire I repeat the key words here:

The rights of the child

  1. Children shall have the right to such protection and care as is necessary for their well-being…
  2. In all actions relating to children, whether taken by public authorities or private institutions, the child’s best interests must be a primary consideration.

The EDPS never once even mentions the rights of children. If there is a balance to be struck he shows no signs of knowing how to locate the fulcrum.

A child’s right to privacy? Not mentioned

Search the document high and low. There’s nothing there. No mention of the legal right to privacy of a child who has been raped where pictures of the rape have been distributed for the whole world and her classmates to see. Not one word.

A child’s right to human dignity? Not mentioned

Neither is there any mention of a child’s legal right to human dignity which, in this case, entails getting the images of their humiliation off the internet, away from public view, to the greatest extent possible, as fast as possible. Not one word. 

The EDPS misunderstands the technologies

The technologies being debated do not understand the content of communications. They work in an extremely narrow and specific way.

If I go to a zoo wearing spectacles that only allow me to see zebras, the giraffes, lions and penguins will be invisible to me. They may pass in front of my unseeing eyes, but they might as well not be there. All I see are zebras.

This is how PhotoDNA works. The EDPS is therefore simply, factually wrong when he suggests (page 2 and paras 9 and 52) that there is any

“monitoring and analysis of the content of communications”

PhotoDNA only sees the zebras. In this case the zebras are the already known images of a child being sexually abused. That is to say, an image that should not be there in the first place, which nobody has any right to possess, never mind publish or distribute.

And the other child protection tools work in similar ways. They do not “analyse” the content of a communication. They cannot say what the picture is about or what a conversation is about. They can only say whether the communication contains known signals of harm or known signals of an intention to harm a child.
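The match-against-a-known-list mechanic can be illustrated with a toy sketch. PhotoDNA itself is a proprietary robust perceptual hash, resilient to resizing and minor edits; a plain cryptographic hash is used here purely as a stand-in, and the hash list and function names are hypothetical. The point the sketch makes is the one above: an unlisted image, whatever it depicts, produces no match and reveals nothing about itself.

```python
import hashlib

# Hypothetical list standing in for a database of hashes of
# already-confirmed illegal images ("the zebras"). This entry is
# simply the SHA-256 digest of the bytes b"test", for demonstration.
KNOWN_ILLEGAL_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True only if the file matches an already-listed image.

    The scanner does not "analyse" or understand content: any image
    not on the list simply fails to match, and nothing about it is
    learned or retained.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES

# A new photo (a "giraffe") produces no match:
print(matches_known_image(b"holiday snapshot"))  # False
# Only a copy of a listed image (a "zebra") matches:
print(matches_known_image(b"test"))  # True
```

Note the simplification: an exact hash only catches byte-identical copies, whereas a robust hash like PhotoDNA also matches slightly altered copies of the same known image. The narrowness of the mechanism, however, is the same in both cases.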

Do we really want companies to be indifferent and inert?

Does the EDPS want companies wilfully and knowingly to blind themselves to heinous crimes against children? Is he suggesting they should be indifferent to and inert towards what they are facilitating on their platforms?

A resolution of the European Parliament says otherwise

Law enforcement agencies have repeatedly stated that it is completely beyond them to address these issues alone. They rely on tech companies doing their bit, a fact recognised by the European Parliament less than a year ago. On 29th November 2019, in a resolution, at para 16, we see the following:

“Acknowledges that law enforcement authorities are confronted with an unprecedented spike in reports of child sexual abuse material (CSAM) online and face enormous challenges when it comes to managing their workload as they focus their efforts on imagery depicting the youngest, most vulnerable victims; stresses the need for more investment, in particular from industry and the private sector, in research and development and new technologies designed to detect CSAM online and expedite takedown and removal procedures;”

How do scanning tools work?

The EDPS makes no reference to other types of scanning taking place on an extremely large scale, such as for cyber security purposes. At a webinar organized by the Child Rights Intergroup on 15th October, Professor Hany Farid made the following observations (at 24:28):

“If you don’t think that PhotoDNA and anti-grooming have a place on technology platforms then I ask you to do the following: turn off your spam filter, turn off your cybersecurity that protects from viruses, malware and ransomware because that is the same technology. And if you believe that we should use a spam filter and if you believe that you should protect your computer from viruses and malware, which I think you do, and if you believe that that technology has a role to protect this computer right here, then why shouldn’t these technologies protect children around the world? At the end of the day it is exactly the same technology, simply tackling a different problem.”

No mention of Microsoft’s Affidavit

On 14th October Microsoft published a sworn Affidavit in which the following words appear at para 8:

“PhotoDNA robust hash-matching was developed for the sole and exclusive purpose of detecting duplicates of known, illegal imagery of child sexual exploitation and abuse, and it is used at Microsoft only for these purposes.”  

At a LIBE Committee meeting it was suggested that companies were scanning content, ostensibly looking for illegal content, then processing the data they collect for commercial purposes. Leaving aside the fact that this would be illegal anyway, the Microsoft Affidavit, under acknowledged pain of perjury, expressly states that this is not happening.

Microsoft also published the terms of its licence which gives other companies and organizations permission to use PhotoDNA.

The EDPS makes no reference to the Affidavit. If it would help preserve the use of online child protection tools, surely other companies would be willing to swear similar Affidavits? Such Affidavits could remain in force at least until this matter is resolved, and even beyond if necessary.

The EDPS says he is worried about precedents

The EDPS says (para 53):

“The issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes.” (emphasis added).

Here the EDPS abandons lawyer’s clothes and dons those of a (not very skilful) politician or campaigner.

This is the notorious “slippery slope” argument. It is morally and intellectually bankrupt.  A demagogue’s trick. A sleight of hand.

The unnamed terror

What is the unnamed terror the EDPS is worrying about?  We are not told. Isn’t the position clear? The proposed suspension is entirely and only about the protection of children. Nothing else. Nothing that isn’t written in the document.

It is quite wrong, and legally completely incorrect, to plead a concern for something that is not on the table, not in anyone’s line of sight.

If something comes up in the future, deal with it on its merits. If you agree with it, say “yes”. If you don’t, say “no”. Lawyers are meant to be able to distinguish between cases based on the facts.

Punishing children for other people’s mistakes

I have no brief to defend the Commission, much less the history of events leading up to their proposal. But whatever the history, it is completely unacceptable to allow the tools to become illegal on 20th December only because nobody managed to sort this out to the satisfaction of the EDPS before now.

That amounts to intentionally putting children in danger, punishing them for the past failures of others, adults who should have known better and acted differently sooner. Shame, shame.

Don’t be a child in Europe

Next week at the LIBE Committee meeting, if Members of the European Parliament are persuaded by the EDPS report, if it is ultimately reflected in the decision of the upcoming Trilogue and the tools are outlawed, my advice is clear: “don’t be a child in Europe.”

Be a child somewhere else.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com