Read this and weep

Hot news: on Thursday evening in Brussels officials working on the draft ePrivacy Regulation agreed they were at an impasse. They cannot reach a consensus on the text so the matter has been referred “upstairs” to see if the outstanding issues can be resolved at political level, i.e. by national governments.

One of these outstanding issues concerns the use of PhotoDNA and similar tools, which can detect child sex abuse material moving across a network or being stored on devices connected to a network. PhotoDNA is currently being used extensively for those purposes and it works spectacularly well. It represents the single most important advance in eliminating child sex abuse material since the internet began. The draft makes it illegal.
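For readers unfamiliar with how such tools operate: PhotoDNA computes a "hash" (a compact fingerprint) of an image and compares it against a database of fingerprints of already-confirmed abuse images. The tool never needs to "look at" anyone's private photos in any meaningful sense; it only answers the question "is this a known illegal image?" The sketch below illustrates the matching step only, using an ordinary cryptographic hash as a stand-in. This is a deliberate simplification: the real PhotoDNA algorithm is proprietary and uses perceptual hashes that still match after resizing or recompression, which a cryptographic hash cannot do. The image bytes shown are placeholders.

```python
import hashlib

# Database of fingerprints of images already confirmed as illegal.
# (Illustrative values only; real systems hold PhotoDNA perceptual
# hashes supplied by hotlines such as NCMEC, not SHA-256 digests.)
known_hashes = {
    hashlib.sha256(b"known-illegal-image-bytes").hexdigest(),
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if the image's fingerprint is in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# An exact copy of a known image is flagged; anything else is not.
print(matches_known_material(b"known-illegal-image-bytes"))
print(matches_known_material(b"ordinary-holiday-photo-bytes"))
```

The design point worth noting is that a scan of this kind reveals nothing about images that do not match: the only information extracted from an innocent photo is "not in the database".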

The only way I can explain the otherwise inexplicable is that privacy people are taking an exceptionally narrow, blinkered view of what privacy means. This is limiting their ability to see the absurd results of their decisions, in this case profoundly damaging consequences for children’s safety and well-being across the 28 Member States.

Children’s advocates across the European Union need to get busy. We must lobby national Governments to include in an Article (not simply a Recital) a specific exemption to allow businesses to deploy or continue deploying tools which are designed to protect children.

At the moment only officials from Ireland, the UK, Portugal, the Czech Republic and Belgium are speaking out directly in a way which is favourable to children’s interests. The rest appear to be opposed or are silent, which amounts to the same thing. The Austrian Presidency is being singularly unhelpful. This matters precisely because they hold the Presidency. Member States look to them for a lead. If you know any Austrian Members of Parliament or Ministers, get on the phone straight away.

The current draft Regulation allows Member States to “derogate from” (opt out of) the ban on PhotoDNA and similar tools. In other words, unless an individual country “derogated”, the use of PhotoDNA would be illegal in that jurisdiction.

This is wholly inadequate as it would inevitably lead to a patchwork of laws. That could, effectively, cripple operations. And, to repeat, what the Regulation is proposing is to disrupt systems that are already in place and have been working brilliantly for several years.

What is the EU saying? Children in Britain, Ireland, Portugal, the Czech Republic, Belgium and other smart countries that choose to opt out can be protected, but for kids in other jurisdictions we just have to hope their national legislative bodies get their act together, preferably sooner rather than later? How does this square with other lofty statements about the importance the EU attaches to protecting children? Answer: it doesn’t.

How does it square with the 2011 Directive which requires every Member State (no discretion or variation allowed) to have machinery to remove discovered child sex abuse material? How does it square with the UNCRC and the Charter of Fundamental Rights? States have obligations to protect children. Surely states do not have discretion to prevent companies from protecting children?

Meanwhile, in other hot news UK MEP Mary Honeyball is tabling a question in the European Parliament asking officials to document how and when experts on online child safety and officials within the Commission with a brief to look after children’s interests were consulted on the content of the draft Regulation. I think I already know the answer. They weren’t.

Is this how it was for Woodward and Bernstein?

Background

Today practically every law enforcement agency in the world is saying they cannot cope with the volumes of child sex abuse materials circulating on the internet. They desperately want tech companies to help stem the tide. If we hear criticisms these are generally grumbles from governments, the police or civil society about the private sector not doing enough. It is therefore strange beyond words to hear objections to companies doing too much to try to make the internet safer for children. Yet that is what it looks like here.

“Multistakeholderism”

Almost universally the task of making the internet a better place is seen as a joint endeavour. We even have a word for it: “multistakeholderism”.

The European Union in particular has been an extremely vocal and energetic proponent of multistakeholderism but it is becoming apparent that the message has not been filtering down or taking hold evenly in all parts of the Commission and its associated agencies.

Who gave you permission to do that?

The argument focuses on PhotoDNA but the logic extends way beyond that single product. No one is suggesting Microsoft did anything illegal when, unilaterally, it developed PhotoDNA and decided to give it away to companies or agencies with a legitimate interest in locating, deleting and reporting child sex abuse material. The fact I am even writing a sentence like that tells you something about how ridiculous this whole thing has become.

While nobody is accusing Microsoft of breaking the law, in effect what is being said is that, because there is no EU or national law which specifically mandates or specifically allows PhotoDNA to be deployed, it really has to stop.

In some countries citizens may only do what the law expressly permits. In others, citizens may do anything the law does not prohibit. I know which approach I prefer. Up to now, deploying tools to protect children has never been illegal. Let’s keep it that way.

If an individual company can be shown to be using tools ostensibly designed to protect children in order to gain access to metadata, or anything else it would not otherwise be able to access, and then exploiting the information thus obtained for commercial purposes, it should be brought before a court and severely punished. But absent that we should leave things as they are.

Bungling again

Signor Buttarelli has made clear he is worried that if an agreed text for the Regulation is not finalised swiftly then all will be lost when the current Parliament dissolves ahead of the next European elections. Having to start again with a new Parliament and perhaps a new Commission clearly holds little appeal.

Remember what happened when the GDPR was coming through? On the little matter of the age at which children can give consent without online businesses having to obtain verifiable parental consent, the privacy priesthood inside and outside the Commission united behind a single proposition. They said there should be one age, 13, which should be applied without exception in every Member State.

They presented no evidence in support of it, did no research to justify it, simply believing they could steamroller it through, hoping that, after four years of discussions, politicians would say “enough already” because they wanted to move on to something new.

At the very last minute politicians bit back. We ended up with a patchwork of ages. And now, guess what? We are hearing across the EU that data protection authorities are astonished to learn how many companies are abandoning the idea of obtaining consent altogether, including for children, instead using legitimate interests or contracts as the basis of their relationships with their customers.

While constantly talking about the need to engage with parents, in effect, businesses have intentionally cut parents out of the loop. By a back door a minimum age of 13 has been created EU-wide. Politicians have been thwarted by a sleight of hand. Article 8 is becoming a dead letter.

Privacy leaders need to start engaging with the children’s agenda

What this story illustrates is that, with notable and honourable exceptions, thinking through how to address children’s interests is not high on the list of priorities of the leadership of the privacy world.

That has to change.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is, or has been, an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised. http://johncarrcv.blogspot.com