Years ago I was a member of a research group which was part of the then UK Council for Child Internet Safety, the British Government’s main national agency for co-ordinating online child safety activity. The group’s work focused on gathering data on children’s experience of making reports to different platforms. Aside from me, the members included Disney, O2, Facebook (as it then was; Meta as it now is) and several academics.
Having initially been keen to be part of the research, promising to supply all kinds of numbers, the original, enthusiastic Meta employee was suddenly replaced. Not sacked. Indeed he later rose to great heights within the company. Another guy came along in his place. He was unambiguous. The only information Meta would give the British Government on children’s use of its services was information “it was legally obliged to publish anyway”. Meta was not then legally obliged to publish anything (the Online Safety Act 2023 has changed that).
This made something of a mockery of the idea of “partnership” and in effect brought the project to an end. But it got worse.
Implied confidentiality
In his very next breath the enforcer argued for an entirely novel concept. He said a doctrine of “implied confidentiality” applied to the meeting he was attending to deliver the bad news.
In other words, not only was Facebook entitled to behave badly, everyone in the room was obliged to become complicit by not telling anyone. We had to keep up the pretence. Not a ripple on the calm surface of the pond, we were all still co-operating to achieve agreed common ends, seekers-after-truth, motivated only and urgently by the best interests of children.
Plus ça change. We now know from Frances Haugen and Arturo Bejar, to name but two, the encounter I have just outlined was not wholly out of character. Meta can put a great deal of effort into denying or distorting information about themselves. Last week they were at it again. However, first a little background.
The silence that came before
6th March 2019 was the fateful day Meta first announced its intention to do what it finally confirmed on 6th December 2023.
Meta has now abandoned some of its world-leading child protection systems and introduced E2EE to Facebook Messenger. It will soon be turned on in Instagram Direct, by default in both cases. These two Apps are major areas where child exploitation occurs on the Meta platform. E2EE prevents the child protection systems from working.
In the 2019 statement, the only mention Mark Zuckerberg made concerning children was as follows:
“When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation…”
He had the good grace to accept that the new regime he had in mind would mean Meta
“will never find all of the potential harm we do today”
Up front, Zuckerberg acknowledged his idea was going to be bad news for children. Nowhere did he point to, or even suggest, any compensating good news for children arising from his decision.
Fast forward to today
In seeking to explain and justify their action on E2EE, last week Meta made a big deal of how they had consulted experts. Given a major focus of the story in much of the media was on the consequences of the decision for children, there was at least the implication that among these experts were disinterested parties with a professional background in children’s health, welfare and rights. By “disinterested” I mean experts or groups who were not on Meta’s payroll or materially dependent on or tied to Meta in any other way.
I wanted to know who these disinterested parties were. When, and about what, were they consulted? What did they say? I ask because Zuckerberg appears not to have moved an inch since March 2019.
In particular I wanted to know if, before 6th March 2019 or any time before last week, any child rights or child welfare person or organization of standing told Zuckerberg his thinking was sound. Did they make any suggestions he accepted or any he ignored?
Every children’s organization of standing that has spoken on the subject has condemned Meta’s action so who and where are these experts? Do they exist at all?
I am aware that since 2019 all kinds of discussions have taken place with statutory or other agencies with an interest, but to the best of my knowledge none of these agencies agreed with or supported Meta’s decision. The discussions were solely about how to mitigate its impact, in terms of harms to children, and about how the procedures would work going forward.
I have had zero success in obtaining answers to my questions and neither has anyone else as far as I can tell. Meta has clammed up.
Media blackout
Following last Monday’s excellent front-page lead by The Times and my piece in The Economist towards the end of the previous week, I did several live media interviews on TV and radio about the Meta decision. Meta declined to field anyone to defend their position, and not just against me; they never appeared live anywhere. They put out a written statement. The journalists read out bits of it, but you cannot cross-examine an empty chair. This is what cowardly politicians do when they know they cannot win an argument. They run for cover. On an issue of such momentous importance, might we have hoped Meta would have the courage of its convictions and field someone in the flesh? Yes. I think we might.
Trying to be as bad as the next guy
Meta’s written statement ended with a claim that they will continue to provide reports on child sexual abuse material, and that these will be in line with the numbers of reports made by similar companies. That isn’t saying much compared with what they used to report. It’s not a massively ambitious target, is it? To be about as bad as the next guy. However, it does provide an insight into a significant part of Meta’s motivation. They hated those big numbers of CSAM reports constantly being associated with their brand. They wanted to construct at least the appearance that they are only in the same ballpark as everybody else. Context is irrelevant. Only the headlines matter.
On such slim whimsies have children’s rights to privacy and human dignity been sacrificed.
But for Meta that wasn’t the only driver, or rather it was inextricably tied in with another.
This was a decision about money
We need to be crystal clear. Meta’s move was about money. I cannot over-emphasise this. Would Meta have done it if it was likely to jeopardise or reduce future revenues?
Meta believes the move will help improve its longer-term competitive position vis-à-vis other messaging Apps, or at least stop any further deterioration. It will do this, the company thinks, by reshaping its image, making it the “privacy dudes à la Apple” and ridding it of its historic position as the “privacy-abusing dudes à la surveillance capitalism”.
The chutzpah almost takes your breath away. One of the companies which has done most to stoke or create anxieties about privacy now seeks to exploit those anxieties for its own ends.
There is also a secondary, but not inconsiderable and more immediate, financial driver: Meta can now sack tens of thousands of moderators. This will yield big cost savings and is consistent with the policy of huge layoffs elsewhere in the company.
It won’t work.
In any foreseeable scenario, in a random word-association test, if your interlocutor says “online privacy”, nobody’s next word is ever going to be “Meta”.
The opposite of what everyone should be doing
“Safety by design” has long been a buzz phrase. Everybody nods their head when it is mentioned. Yet here we see the exact opposite. Whatever else Meta may say they will do to keep children safe, they have intentionally designed the messaging parts of some of their systems to be more dangerous today than they were yesterday. That wasn’t part of the script.
Others will now follow their lead.
In the UK, when it gets going, the Online Safety Act 2023 should force Meta some way back down the track. I hope that when this succeeds it will be a catalyst for a coalition of willing Governments around the world to come together and propose similar measures in their own territories.
Given the current and likely near-future state of geo-politics we cannot wait for the preferred route of international treaties. The only interests who benefit from that are the online businesses. For them delay is the same as profit. For children it is loss.