Child protection delayed is child protection denied

In December 2020, one second after midnight when the new provisions of the European Electronic Communications Code kicked in, Facebook stopped looking for child sex abuse material (csam) across its platforms in all 27 EU Member States. 

Historically, a huge proportion of the reports of csam found in the EU originated on one or other of Facebook’s properties. The company’s decision therefore had immediate, catastrophic and entirely predictable consequences. 

The above graphic was prepared by NCMEC, the principal global centre for receiving reports of csam. 

It shows that in the first seven months of this year over half a million reports that could have been made weren't made: a 76% decrease. In fact, the real impact of the decision was almost certainly greater because, as we know from law enforcement agencies around the world, Covid-related lockdowns had led to general increases in illegal activity of this sort during that period.

ECPAT International first broke this story via Politico Pro (paywall). In her excellent blog, Acting-ED Dr. Dorothea Czarnecki commented:

Every unreported image represents a child potentially in imminent danger of being sexually abused again and often it will be the image of a child who needs to be found and helped now. 

Facebook’s decision was baloney

Let's not forget that lots of other companies, e.g. Google and Microsoft, concluded there was no need to stop what they were doing, and they were doing exactly the same as Facebook. If Google and Microsoft could get a legal opinion to support them carrying on, so could Facebook. Self-evidently they had no desire so to do.

One is therefore bound to wonder what exactly motivated Facebook to take such an extraordinary step. Any idea of sticking with "industry standards" – a line the company frequently pushes – had obviously been thrown out of the window. Something else was going on. But what? My sense of puzzlement was not diminished when….

The story kept changing

After Facebook stopped scanning for csam, I spoke to various people in the company. They were adamant. As soon as "the situation" was clarified scanning would start again. Arguably "the situation" was cleared up on 29th April 2021 when a political agreement was reached. An "interim derogation" would be introduced to restore much or all of what had been understood to be the status quo ante.

Facebook was never in any material legal jeopardy but surely once the political horizon was clear they could resume? They could but they didn’t.

The message coming out of Facebook changed.

They said they now had to wait until the new law was not only politically finalised; they also needed to see the legal text. This meant waiting until the last dots and commas had been inserted and it had been published in the Official Journal of the EU.

The first happened on 14th July, the second on 30th July (ibid). Yet here we are.

Then the line switched once more

Now I was told it was not, after all, just a matter of seeing the final version of the legal text as published in the Official Journal because

“It’s not as easy as flipping a switch. It’s with engineering.  They are sorting it out. But we will be going again soon.”

A minor digression

In Frances Haugen's testimony to the US Congress last week we learned that, when the 2020 US Presidential elections were over, Facebook did just "flip a switch" to restore the status quo ante.

This is what Haugen said:

“… as soon as the election was over, they turned them [the safety systems] back off or they changed the settings back to what they were before, to prioritise growth over safety. And that really feels like a betrayal of democracy to me.”

In other words, Facebook intentionally allowed the craziness to begin again. What does craziness mean on Facebook? It means eyeballs. And what do eyeballs mean? Money.

The line changes once more

Back to the main theme of this blog. When ECPAT went public the journalist on Politico Pro contacted Facebook for a comment. This was when we discovered the company was “consulting” the Irish Data Protection Commission.

So it wasn’t just an engineering question, if it ever was, and it was never only about seeing the legal text or the difficulties of switch flipping.

Given the alacrity and chronological precision with which Facebook stopped scanning, their present tardiness is utterly shameful. It is more evidence of the company's detachment from reality, of their arrogance. Or is Facebook on manoeuvres?

Rumours, rumours and speculation

The rumour mill is alive with speculation. How can we explain what otherwise seems to be another wholly avoidable, gratuitous self-inflicted wound?

Everyone I know assumes Facebook's actions must be linked to their plans to introduce end-to-end encryption. If that happens, the amount of csam finding its way to NCMEC from Facebook will plummet to zero or very close to it. The number of children being abused via the platform won't change. If anything, it will likely increase because perpetrators will be emboldened to do more.

Nevertheless, what would make the transition to end-to-end encryption so much easier for Facebook is for the Irish Data Protection Commission to find in favour of Patrick Breyer’s complaint, originally lodged on 28th October, 2020 with the data privacy authority in Schleswig-Holstein and later referred to Dublin.

If all or the bulk of Breyer's petition is upheld, resuming scanning on Facebook might never happen, or it could restart in a much reduced form. The decision to introduce end-to-end encryption, at least from a child protection point of view, would be a big non-event and Facebook would be able to say "don't blame us".

And if such is the ruling that comes out of Dublin, everyone else will have to stop or change the way they try to protect children. It can't come to that. Can it?

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.