A very bad day for children in Europe

If you live in an EU Member State and you have used Facebook Messenger or Instagram Direct today, you probably saw this message: “Some features are not available. This is to respect new rules for messaging services in Europe. We’re working to bring them back.”

This cryptic statement refers, among other things, to the fact that Facebook has turned off its proactive child protection tools in every EU Member State. This is because today the provisions of the European Electronic Communications Code (EECC) kick in.

But Microsoft, Google, LinkedIn, Roblox and Yubo somehow managed to find a way to carry on with the tools. Well done them.

Given that Facebook is the world’s largest source of csam and other child sexual exploitation material reported to NCMEC and law enforcement, this is unbelievably disappointing.

This should never have happened in the first place BUT

OK, we should never have got into this position, but where there is a will there is a way. Obviously, with the five companies I just named, there was a will to carry on protecting children by continuing to use the tools. They found enough room for doubt to justify a continuation. Facebook didn’t.

Facebook is usually not slow to act when an important commercial interest is threatened. Not here. Facebook rolled over.

Facebook is trying to reshape its image

Facebook is determined to appease and reach out to the privacy lobby. That is plainly a corporate objective that trumps all others. Given the company’s previous lack of care and respect for their users’ privacy, it is not hard to work out why they want to reposition themselves in this way.

But children are paying the price for their inglorious corporate history.

Until this is put right – as it surely will be – how many pictures of children being sexually abused will continue to circulate on the internet? How many paedophiles will manage to connect with children? How many existing victims will be harmed further, and how many new victims will there be? We will never know, but it is unlikely to be zero.

Does Facebook really still have a Safety Advisory Board? Were they consulted about this? If so, when, and what did they say?

The anti-suicide and self-harm tools?

What about the tools which try to detect a child contemplating suicide or self-harm? Have they also been suspended? Maybe they haven’t, but essentially they work in the same way as the anti-grooming tools and the classifiers used to detect possible csam. Facebook should put out a statement specifically addressing that point.

Concrete results

Last month NCMEC published a letter to MEPs in which they gave some hard numbers.

In 2019 NCMEC received 16.9 million reports referencing 69 million items of csam or child sexual exploitation. Of these, “over 3 million” originated in the EU. That is around 4% of the total, or about 250,000 per month. 95% of these reports came from services affected by the EECC. From these reports, 200 children living in Germany were identified, as were 70 children living in Holland. The same letter shows the 2020 numbers are going to be higher.

Knowing that Facebook accounts for the great majority of the reports to which the NCMEC letter refers, we can see the likely dimensions of what Facebook has done.

Shame on Facebook. Let’s hope they succeed in “bringing them back” as soon as possible. Then they can announce they are dropping or modifying their plans to encrypt the very same services.

UK exempt?

Why do the tools continue in use in the UK? It seems to be because we adopted laws at national level which provide a good enough legal basis. Can it really be the case that no other Member State did the same? And if one or more did, how can Facebook justify cutting them off?

This has been a bad day for children in Europe.

We are heading for a strange world

Privacy laws were never intended to make it easier for paedophiles to connect with children. They were never intended to make it easier for pictures of children being raped to be stored or circulated online. And it would be a strange world indeed if that is where we are heading.

If there truly is a legal problem here it cannot be one of substance. It can only have arisen because various bureaucrats and lawyers did not get all their ducks in a row and take all the right steps at the right time.

Instead of taking a brave stance in defence of children, Facebook has buckled in the face of the remediable incompetence of others.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to the Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is, or has been, an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised. http://johncarrcv.blogspot.com