A PS about scale, proactivity and not listening to the world

I reproduce below an extract from Facebook’s latest “Community Standards Enforcement Report”. It is for the quarter ended June 2021. Very up to date. Published on 18th August, it hardly needs saying there was no appreciable time lag between collecting the data and publishing them. This speaks of highly automated processes. Bravo.

I looked at the list of distinguished academics who, in 2019, scrutinised the methods used by Facebook to generate these reports. The legitimacy of doing things this way is a discussion for another day. Suffice it to say I doubt an arrangement like that will be acceptable in a more regulated future.

What is astonishing though not surprising (is it OK to say that?) about Facebook’s document is the picture it paints, by which I mean the diversity and sheer volume of activity described.

Proactive rate

Moreover, the company makes much of its “proactive rate” which, it explains, refers to actions taken by them to remove material “before a user reported it to us”.

In other words, while Facebook has systems to allow individuals to report matters of concern, the report shows they don’t rely on them to find most of what they target.
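For readers who want to see exactly what the metric measures, the definition boils down to a simple ratio. Here is a minimal sketch in Python; the split between proactively found and user-reported items is a hypothetical illustration consistent with the report’s “over 99%” figure, not a breakdown Facebook publishes in those terms.

```python
def proactive_rate(found_proactively: int, total_actioned: int) -> float:
    """Percentage of actioned items the company found itself,
    before any user reported them."""
    return 100 * found_proactively / total_actioned

# Illustrative figures: 25.7 million items actioned in the quarter
# (the report's child sexual exploitation number for Facebook), with
# a hypothetical proactive/user-reported split producing a >99% rate.
total_actioned = 25_700_000
found_proactively = 25_450_000  # assumed split, for illustration only

rate = proactive_rate(found_proactively, total_actioned)
print(f"Proactive rate: {rate:.1f}%")
print(f"Items surfaced by user reports: {total_actioned - found_proactively:,}")
```

The point the example makes concrete: even at these volumes, a rate above 99% means user reports account for well under one item in a hundred of what gets actioned.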

Prohibited child related content

The figures on Covid misinformation, hate speech and self-harm are staggering but no less so than those for action taken against content prohibited on child safety grounds.

  • Child nudity and (child) physical abuse content…
    • On Facebook: 2.3 million with a proactive rate over 97%
    • On Instagram: 458,000 with a proactive rate of over 95%
  • Child sexual exploitation content…: 
    • On Facebook: 25.7 million with a proactive rate of over 99% 
    • On Instagram: 1.4 million with a proactive rate of over 96%

Bear two things in mind. These numbers refer to a single quarter; they are not a cumulative total for the year so far. And, rather obviously, they address only items or activities which are currently visible to the company.

Inside Mark Zuckerberg’s head

For more than one reason you have to wonder who posts material of this kind to spaces which they know, or ought to know, can be seen either by everyone or at the very least by Facebook itself.

Zuckerberg’s declared intention to introduce encryption to major parts of the company’s services is likely to push the amount of illegal or prohibited activity taking place on his watch up, not down (and not just in respect of child related matters).

That being the case why does Zuckerberg think such a course of action is acceptable? 

Even if one believes people want more privacy, that is no reason to give it to them if you know, as night follows day, more children and others will be hurt if you do. People want all kinds of things that public policy forbids or limits because of their harmful effects.

On the other hand, the decision makes sense if you are convinced such a move is essential in order to preserve your business as the dominant brand. That means the decision is about ego or money, probably both. It most definitely is not a decision made in pursuit of improving life on Earth. Zuckerberg personally will not be hurt by it, but he is willing to allow others to be. That is just not right.

Some cynics suggest the move to encryption is all about reducing the costs of, and potential liabilities in relation to, Facebook’s activities as a moderator. Not me. Oh no. I would never suggest such a thing. I’m not a cynic.

Enough already with the ad hominem attacks?

And just in case anybody was minded to accuse me of ad hominem attacks on the person of Mark Zuckerberg, please do not forget he holds a majority of the voting stock in the company. Every key decision Facebook makes is made by him. That’s what personalises it.

There is only one toga that counts at the Court of the Young Emperor.

Facebook can talk until the cows come home about the “preventative strategies” they intend to introduce to accompany the shift to encryption, about how they intend to identify bad actors before they do bad acts (aren’t they already doing that anyway?), but the inescapable fact is, as things stand, once material goes into an encrypted tunnel it becomes invisible to them.

My hunch

My hunch is, some time ago, probably in the immediate aftermath of yet another privacy scandal involving the company, Zuckerberg decided the clamour for more privacy was likely to grow; therefore, to preserve or increase his company’s revenues, Facebook needed to embark on the famous/infamous “pivot to privacy”. As others have observed, after 15 years of prioritising growth over privacy Facebook decided to switch tracks.

It’s a big gamble. A big bet. But I think it will be shown to be a wrong one, either at large, as the mass of people cease to believe meaningful privacy on the internet is possible anyway, or in particular as it concerns Facebook.

Give a dog a bad name and it’s very hard to shake off. This dog not only has a bad name when it comes to privacy; it has pretty much become a synonym for the lack of it.

Closing your ears

Which brings me to a story in today’s “Sunday Times”. It has a very familiar ring to it. Seemingly, after “years of apologising”, Zuckerberg has decided to stop. Apologising.

Here is what it says in the article:

“The company is now so used to getting bad press that they’re starting to not care….  They don’t like it, but they now think the whole world is against them and they’re retreating to their bunker.”

“The New York Times reported last week …. Zuckerberg has agreed to take a step back from the stream of controversies, leaving human shields such as Sir Nick Clegg and Sheryl Sandberg to take the arrows.”

Bunkers? Let me think about that. 

Shields? That won’t work. It will only emphasise and underline that there is an organ grinder somewhere who is too scared or too arrogant to face his critics.

The company is now more valuable than it has ever been, so if you believe the only thing that matters is the evidence of how popular your products are, then you can see how this “up yours” strategy could emerge. This is another error of judgement, and it has more than a hint of Clegg about it, a man who is no stranger to major errors of judgement.

The only way to have a chance of getting actual or potential regulators off your back is to stop doing the stuff that attracts their attention. 

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised. http://johncarrcv.blogspot.com