Meta says money matters more than children

Late yesterday UK time, Meta took a massive backwards step. A strong policy which protected children is being abandoned. Meta is doing this in the name of privacy. However, their new approach tramples on the right to privacy of some extremely vulnerable people – victims of child sexual abuse – and ignores the victims’ right to human dignity while also introducing new levels of danger for a great many other children. The company will face a hail of fiery and well-deserved criticism, not least because there were alternatives which would have allowed them to achieve their stated privacy objectives without putting or leaving any children in harm’s way.

What have they done?

In 2018, using clever tools such as PhotoDNA, later supplemented by programmes they developed in-house, Facebook established some new systems. These allowed the company to find, delete and report to the authorities images of children being sexually abused. These were all illegal still pictures or videos which someone had published or distributed, or was trying to, either via Facebook’s main platform or via their two main messaging apps, Messenger and Instagram Direct.
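For readers who want a concrete picture of how this kind of matching works, here is a deliberately simplified sketch in Python. It is not Meta’s actual pipeline: real tools such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the hash lists are maintained by bodies such as NCMEC. The set name, the placeholder hash and the helper functions below are purely illustrative assumptions.

```python
import hashlib

# Deliberately simplified illustration of hash-list matching.
# Real tools such as PhotoDNA use perceptual hashes that survive resizing and
# re-encoding; an exact SHA-256 lookup is used here only to show the flow.

# Hypothetical hash list. In practice it would be supplied and maintained by
# recognised bodies such as NCMEC; the value below is a made-up placeholder.
KNOWN_ILLEGAL_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def hash_image(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()


def handle_upload(image_bytes: bytes) -> str:
    """Block, delete and report known material; allow everything else."""
    if hash_image(image_bytes) in KNOWN_ILLEGAL_HASHES:
        return "block, delete and report to the authorities"
    return "allow"
```

The crucial point is that only content matching a hash already classified as illegal triggers any further action; everything else passes through untouched.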

Meta’s systems worked like lightning. In practice some of the material was never seen by anyone but the person trying to post or distribute it, and the cops. People posting, trying to post or exchanging the images were apprehended. Child rapists were put in prison. Children were relocated to a place of safety and provided with support to recover as best they could from the harm done to them. The sooner such support could be brought to a child, the greater their chances of making a good recovery. Time is of the essence in matters of this kind.

A victim’s right to privacy

Among other things, by deploying these systems Meta was recognising the right to privacy and human dignity of the victims, the children depicted in the images. No child could consent to being sexually abused, much less could they consent to unlawful images of their pain and humiliation being published for the whole world to see.

Every victims’ group expresses the same opinion. They want the images found and gone in the shortest possible time. They want their privacy restored as best it can be, as fast as it can be. Most definitely they do not want the number of people in possession of the images to grow.

The continued circulation or publication of such images not only puts the victims depicted at risk of further harm, it also helps sustain or encourage paedophile behaviour and, as such, represents a threat to children as yet unharmed in all parts of the world where the internet is available, which pretty much means everywhere. Children in countries with the least well-developed legal, educational and law enforcement structures are probably disproportionately at risk.

Not just images were being taken out by Meta

With similar tools Meta was also able to identify potential grooming behaviour and conduct which suggested a child might be contemplating suicide or serious self-harm. Moderators were able to intervene or get others to intervene. This too could happen very quickly.

A leader no more

Through efforts of the kind described, Meta established itself as an acknowledged leader in online child protection. However, on 6th December 2023, Meta started to introduce end-to-end encryption (E2EE) to Messenger, and it has made clear it intends to follow suit with Instagram Direct. It is thereby wilfully blinding itself to all the possibilities that were there before. It is a truly shocking decision which I for one never believed they would actually implement. I thought they would back off. More fool me.

Massive numbers

We know something of the scale of the problem in respect of still pictures and videos of child sexual abuse. In the four-year period 2019-2022 inclusive, the global reporting hub in this space, NCMEC, received a total of approximately 100 million reports of child sexual abuse content which had been found on the internet. Roughly 80 million of these reports came from Meta. In excess of 80% were linked to Messenger and Instagram Direct. The clever tools appeared to be doing their job extremely well.

Of the 100 million, the number of reports which resolved to the UK escalated as follows: 2019: 74,330; 2020: 75,578; 2021: 97,727; 2022: 316,900. Over 550,000 in total.

To get some sense of what these figures represented, in 2022 NCMEC received reports of 49.4 million still images of child sexual abuse, of which 18.8 million (38%) were unique, and 37.7 million videos, of which 8.3 million (22%) were unique. “Unique” means there were no duplicates among the larger number. 49% of all these reports were “actionable” by law enforcement agencies while NCMEC classified the remainder as “informational”.
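For what it is worth, the uniqueness percentages follow directly from the raw figures; a quick check, using the numbers exactly as reported above:

```python
# Quick check of the de-duplication percentages quoted above (figures as reported).
still_total, still_unique = 49.4e6, 18.8e6
video_total, video_unique = 37.7e6, 8.3e6

print(f"unique stills: {still_unique / still_total:.0%}")  # -> 38%
print(f"unique videos: {video_unique / video_total:.0%}")  # -> 22%
```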

Each report could be about one child or multiple children. By introducing E2EE Meta is therefore condemning an unknowable number of children to prolonged pain and misery, possibly even death. The police cannot arrest a child rapist or otherwise act to safeguard or support a child if the initial reports are just not there. Meta cannot intervene to deflect a paedophile or channel a troubled child away from suicide towards sources of help if the company can no longer pick up the prompts or signals.

Meta opts to cover it up

Meta say they will pursue “other measures” to protect children, and that they will carry on scanning the main Facebook platform as before, but they could have done those things anyway. However they try to minimise or apologise for what they have done, or in the case of Instagram Direct are about to do, there is just no avoiding one simple fact: Meta is ensuring they will no longer be able to see what is happening in Messenger or Instagram Direct, which is where the vast majority of the abuse happens. The abuse won’t stop. Only the company’s ability to do anything about it will. Taking 2022 as the baseline year, had E2EE already been in place Meta would likely not have made over 20 million of its reports. If anything, when bad guys learn their chances of getting caught have been reduced because Meta can no longer sniff them out, one would expect there to be an uptick in abusive behaviour.

Apple dodged the same bullet. Unfairly

I can see why Meta might feel a little aggrieved that they received so much bad publicity about the amount of child sexual abuse associated with their platforms. It only happened because years ago they decided to do the right thing, even though they knew the results would become public.

Contrast the position of Meta with that of Apple. Apple acknowledged they were likely to have a major problem with the same issues but, because they implemented E2EE a lot earlier and had never introduced systems to detect this kind of content and behaviour, the spotlight never shone on them.

Of the 100 million reports received by NCMEC over the four years mentioned above, only 864 came from Apple. No commas or zeros have been missed off by mistake. If Apple had been doing what Meta was doing it is possible the volume of reports reaching NCMEC over the same period would have been nearer the 200 million mark.

What is doubly galling about Apple in this context is that, after acknowledging they had a problem, they came up with a great solution but then, under pressure from a techno-elite within the privacy lobby, abandoned it. The solution is what is known in Geeksville as “client-side scanning”. Essentially what happens is that content is examined on the user’s device before it enters the encrypted tunnel. As a result, nobody needs to fiddle with or try to break the encryption itself. Nobody needs to read or see any words. The tools can only see and comprehend images, and don’t forget these are either images which have already been classified as illegal, or images which are very likely to be illegal and can be checked before any further action is taken.
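To make the idea concrete, here is a minimal sketch of client-side scanning, assuming a local list of hashes of known illegal images. It uses the same simple hash-lookup idea sketched earlier; the only real difference is where the check runs. It is not Apple’s actual design (their abandoned proposal used perceptual hashing with a threshold scheme before anything was reported), and the function names, the toy encryption stand-in and the placeholder hash are all illustrative assumptions.

```python
import hashlib

# Hypothetical list of hashes of known illegal images, assumed to be supplied
# by a recognised authority. The value below is a placeholder.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def encrypt(data: bytes, key: bytes) -> bytes:
    """Toy stand-in for the messenger's E2EE layer (not real cryptography)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


def send_image(image_bytes: bytes, key: bytes) -> bytes:
    # 1. Scan on the user's device, before anything is encrypted.
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        # 2. Only a match against already-classified material triggers any
        #    further action; no words are read and the encryption itself is
        #    never touched or weakened.
        raise ValueError("matched known illegal image; withhold and flag for review")
    # 3. Everything else enters the encrypted tunnel exactly as before.
    return encrypt(image_bytes, key)
```

The point is in the ordering: the check happens before encryption, so nothing about the encryption has to be broken or weakened.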

Yet that aside, there is not now, nor has there ever been, any evidence of any of the tools used hitherto getting it wrong. There are no known instances of any innocent person being wrongly named or shamed, much less arrested or prosecuted. Nobody’s privacy has been invaded in any way that led to any kind of unwarranted detriment to them.

Perhaps anyway the key point is two wrongs do not make a right. The fact that Apple is delinquent does not give Meta permission to behave just as badly or worse. I say “worse” because in the case of Meta there is no ambiguity about the consequences. We can only hope, in the UK at least, the recently adopted Online Safety Act forces Meta back part of the way if not all of it. And with any luck it will also require Apple to follow suit.

The money angle

Of all the major players on the internet Meta probably has the worst reputation for abusing its users’ privacy. Think Cambridge Analytica, to name but one rather large example of “surveillance capitalism” red in tooth and claw.

Mark Zuckerberg came to the conclusion that, because of egregious behaviour of this kind by his own and other internet businesses, in future privacy was going to matter more and more to end users (us). He decided Meta needed to “pivot to privacy.” This is what led to yesterday’s announcement. If it didn’t make the “pivot”, Zuckerberg feared Meta would lose out to other businesses, in particular Apple, which had made a big thing of user privacy, even to the point of refusing to help the FBI unlock the iPhone of the terrorist shot dead in San Bernardino in 2015.

The bad publicity Meta is about to get over yesterday’s appalling announcement has doubtless been factored in and could be exactly what they want. In 2015 Apple was a very public badass over privacy. In 2023 Meta will overshadow them. They want to become King Privacy Badass.

Aside from helping Meta become more Apple than Apple, going over to E2EE has immediate financial benefits for Meta’s bottom line. In 2021 Meta said it employed 40,000 people to work on “safety and security”. A large proportion of those people are moderators: people who look at potentially illegal or unacceptable content or behaviour. Some estimates suggested these operations were costing Meta around US$5 billion a year, and the pressure was going only one way: up. Encryption stops that and puts it into reverse. You cannot moderate what you cannot see. You can dispense with the services of a great many moderators. This is exactly what Zuckerberg wants.

On top of the money Meta will save, it will be less likely to be exposed to bad publicity over how much criminal content or behaviour it has facilitated. Potentially the company’s exposure to a range of legal risks will also reduce. You cannot be liable for stuff you cannot see. Can you?

The only losers are children

It goes without saying nobody at Meta or Apple wants children to be sexually abused, to have images of the abuse distributed, or for any child to be groomed via their services. However, they are unwilling to act within the sphere of their own competence to try to keep such things to a minimum. They rate other things higher. Shame on them.

Every organization which has Meta and, for that matter, Apple in membership should therefore consider whether their continued presence is appropriate. They have shown their ethics are wanting, and not in a small or insignificant way.

Obviously, it will still be necessary for those concerned with children’s interests to continue to engage with both companies, but this should be in the same way Friends of the Earth engages with Shell and BP, the anti-smoking lobby engages with Philip Morris or Ralph Nader did with the car industry. At arm’s length. Not by awarding them halos for their “great work” protecting children, not by implying we are all good chaps working towards common goals.

From now on it is simply impossible for Meta or Apple to claim children are front and centre of their concerns as a company. Plainly they are not.

In future, if asked, Meta’s representatives should say “we take children’s safety and welfare very seriously but definitely not as seriously as we take beating or preserving our position vis-a-vis our commercial rivals while accumulating or hanging on to as much money as possible”. Apple’s employees can riff off that.

I could probably make that a bit snappier, but you’ll catch my drift.

In the long run I believe Meta and Silicon Valley as a whole will come to regret what happened yesterday. Why? Because it will greatly strengthen the resolve of Governments everywhere to bring Big Tech to heel. If a company like Meta is willing to act in such a callous and calculated way in respect of children, it is anybody’s guess what else they and businesses like them will be willing to do in other areas. Maybe ChatGPT can tell us.