The shame continues

Last Thursday afternoon the LIBE Committee of the European Parliament met. The proceedings are viewable.

The meeting was scheduled to take two hours. It had several items to discuss, and the earlier ones overran. When the question of the derogation and children being sexually abused on the internet was reached, the Chair of the Committee opened with the following words:

“Let’s see if we can manage in 10/15 minutes the three points left on our agenda”.

The discussion of the derogation began at 15.35 and ended at 15.51, so it got more than its fair share of what was supposed to be the remaining time. Even so, many will feel that, after all the energy that has gone into this issue, to get 16 minutes was, well, disappointing. But perhaps no more disappointing than the reasons why the discussion was necessary in the first place.

How did we get here?

Back in 2018 the EU adopted the European Electronic Communications Code (the Code). The law was due to take effect on 20th December, 2020.

In the early part of 2020, Commission officials realised some might interpret its provisions to mean it was no longer lawful for companies to continue voluntarily scanning messaging services looking for child sex abuse content or activity. 

Why? Because it became apparent the Code would be subject to the overarching provisions of the GDPR. This raised the possibility that end-user consent was required before the relevant data processing could take place. Alternatively, as a condition precedent, a much more detailed and broader legal framework needed to be put in place to allow the child protection tools to continue being used as before.

Both these points are contested, but the key question here is this: had anyone in the Parliament, Council of Ministers or Commission spotted and considered any of this when the 2018 measure was being drafted, debated and finally adopted? No.

Children were forgotten or overlooked. Out of sight. Out of mind. Simple as that. And not for the first time.

Did the European Data Protection Supervisor or his predecessor step in at any point during the co-legislative process or very soon after it concluded specifically to draw attention to these matters? No.

If anything, when the Supervisor did make an appearance, he contrived to make things worse by issuing an Opinion in which he omitted to mention a child’s right to privacy or Article 24 of the EU’s Charter of Fundamental Rights which, just to remind you, reads as follows:

  1. Children shall have the right to such protection and care as is necessary for their well-being…
  2. In all actions relating to children, whether taken by public authorities or private institutions, the child’s best interests must be a primary consideration.  

There are several EU-specific and other legal instruments which likewise make clear that children are in a separate and special class, and that they require extra care and attention which would not apply to other classes or groups. This fact appears to have escaped the Supervisor who, far from considering the particular position of children, felt free to say:

“The EDPS wishes to underline that the issues at stake are not specific to the fight against child abuse but to any initiative aiming at collaboration of the private sector for law enforcement purposes. If adopted, the Proposal, will inevitably serve as a precedent for future legislation in this field.”

Actually the issues at stake are specific to the fight against child abuse. Children should not be lost in a muddled conflation.

In other words, at the relevant time, all the European Institutions, including the political parties within the Parliament, failed to consider the Code’s impact on children’s rights and welfare. They also failed to consider the express legal burden placed upon them to take account of the unique position of children. If any “precedent” was to be set it would be one about children. Nothing else. The scrutiny processes which are said to be fundamental to the way new laws take shape in a democracy let children down. Badly.

Protecting children through scanning was not being done covertly

Please bear in mind none of the companies scanning messaging services to keep children safe made any secret of it. On the contrary, they spoke about it openly and often. They were proud of it and wanted people to know. They thought it reflected well on them. They were right. It did.

The companies were frequently praised for their efforts by the most senior figures in our European Institutions, including in the Parliament.

In some jurisdictions these facts alone would entitle the companies to invoke the doctrine of estoppel. And even if they did not do that, they would almost certainly provide a strong defence against many forms of hostile legal action.

Unintended, unforeseen and unmentioned

There is no getting away from the fact that, for the reasons just given, this is all a terrible mess. A legislative accident. As far as I am aware, absolutely nobody wanted us to be in the position we are now in. When accidents happen people generally pull together, at least to get things back to where they thought they were before.

Not here.

Let’s not forget, when the Commission proposed a solution, the temporary derogation, they were not seeking to close down discussion or debate. They were not seeking to make a final, irrevocable decision. They only wanted to create a breathing space so as not to interrupt or give anyone an excuse to interrupt child protection work which had in some cases been going on since 2009.

But the LIBE Committee would not co-operate. They would not help to put things back where everyone thought they were before. And they did it in the name of defending the rights of political parties to comment and scrutinise. See above for how well that worked last time out.

The result?

A 46% drop in reports of child sex abuse followed this robust assertion of the rights of political parties.

Two wrongs do not make a right

True enough, two wrongs do not make a right. Just because EU institutions botched the process in the run-up to the 2018 decision, and since, that is no reason to allow them to botch it again, if that is what you think happened. But that would only be true if LIBE was being asked to endorse a permanent or longer-term position. Which they weren’t. They were only being asked to agree to a stopgap.

It is clear why, under existing arrangements, LIBE took the leading role in shaping the Parliament’s decision. Nevertheless, it is not the most obvious place one would look to first when a matter so intimately connected with children’s rights is at issue.

Perhaps the Parliamentary authorities could reflect on that. Could the Intergroup on Children’s Rights be converted into a full Committee of the Parliament in an effort to ensure nothing like this can happen again? This might help guarantee every legislative measure is considered expressly in terms of its impact on children. 

Threat to money: we fight. Threat to kids: we don’t fight 

At the LIBE meeting Sophie in ’t Veld made one of the most telling – I would say crushing – points when she criticised the transparent hypocrisy of Facebook.

On 21st December Facebook immediately stopped scanning. In ’t Veld pointed out that on a great many previous occasions, with its enormous war chest and great phalanx of lawyers, Facebook had gone to court at the drop of a hat to contest a point if it looked like it might interfere with their ability to make money.

When it came to defending the good work they had been doing for years to protect children they rolled over without a whimper on Day 1.

Preparing the way for end-to-end encryption

A great many people saw Facebook’s decision to stop scanning as being linked to their larger ambition to introduce end-to-end encryption. If there are no tools which can be legally used to scan, it renders otiose the discussion on encryption.  Despite their protestations to the contrary, by saying they agreed the Code made the use of the tools illegal, Facebook were providing legitimacy, succour and comfort to forces who want to see the tools banned forever. 

Simply to believe Facebook could have made a calculation of that kind tells you a lot about how low people’s opinion of them has sunk. In that sense whether it is actually true or not barely matters. And we still do not know when, or indeed even if the decision to stop scanning was discussed internally at the so-called “Safety Advisory Board”. Maybe they should rename it the “Danger Enabling Board”. Just a thought.

In ’t Veld contrasted Facebook’s decision with those of Microsoft, Google, Roblox, LinkedIn and Yubo. These companies decided the legal risk was minimal to non-existent, so they carried on scanning as before.

Complying with the industry standards we like. Disregarding the others

Appearing recently before British Members of Parliament, Facebook told the MPs one of the reasons they wanted to proceed with end-to-end encryption is because it is now an “industry standard”. Really? Well, what about the industry standard established by Microsoft, Google, Roblox, LinkedIn and Yubo?

I would dearly like to know who in Facebook made the decision to stop scanning. One has to imagine it was Zuckerberg personally, and if it was, then I would say the prosecution can rest its case. In his Harvard dorm he had a great idea, and manifestly Zuckerberg is talented at making decisions which generate tons of cash. But he is plainly not fit to run one of the world’s largest and most important companies as a personal project, empowered as he is by his majority shareholding.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com