Facebook moving in the right direction

That is not a headline I thought I would write any time soon. However, I have to say I applaud Facebook’s announcement earlier this week of the company’s plans to address the standards it will adopt in relation to content published on its platforms.

Facebook have been saying for a while that they alone should not have to adjudicate on a range of “significant” and “difficult” questions concerning the type of content they should permit or forbid.

One person’s free speech can be seen as a threat or as a huge insult by another. How and where do you strike the balance particularly when you, as a business, have a financial interest in the outcome? Facebook thinks it has found a way. I think they may be right.

An independent Oversight Board

Contained in a charter are the details of how Facebook is going to establish an “Oversight Board”. Here is how their principal spokesperson put it:

“The content policies we write and the decisions we make every day matter to people. That’s why we always have to strive to keep getting better. The Oversight Board will make Facebook more accountable and improve our decision-making. This charter is a critical step towards what we hope will become a model for our industry.”

If you read the detailed proposals, you will see how the first Board is to be chosen. There will be high levels of transparency around how it operates and, once fully established, the Board will have the power to set its own rules and appoint additional members.

An intermediary body, a Trust, will handle all the money aspects so Board members will be isolated as well as they can be from any sense of depending on keeping Facebook executives happy in order to retain their position on the Board. Facebook the company, Facebook users, and the Board itself can refer matters for consideration.

As ever, the proof of the pudding will be in the eating, but I am taking Facebook at face value and in that context I cannot fault their approach. Fingers crossed it will work. If it does, it will inevitably set the standard others will have to follow. Several phoney Advisory Boards will cease to exist. I’m not sure Facebook will be thanked for that, but from Facebook’s point of view this is smart.

Artificial Intelligence is unlikely ever to be good enough on its own

A lot of people have invested heavily in the idea that AI would solve everything. All we need to do is set down clear rules, the algorithms will be instructed accordingly and we can all sit back and relax. Moderators will be spared having to look at terrible stuff or take decisions. Mathematics will set us free. This is only partially true.

If only life were so simple. Nuance can be important. Contemporary mores change over time and with geography. Context matters. If a particular group has been under attack, one might interpret certain postings close to the time of the attack in a different way compared with six months later, assuming things have calmed down.

We are a very, very long way from being able to entrust machines to make good decisions about matters of this kind.

A question of scale and speed

The obvious challenges are going to be around scale and speed. Of the two, scale is going to be the easier one to solve. I say that because it is plainly going to be impossible to look at every complaint or takedown request that comes in, or to initiate major enquiries into everything of interest, so some rational system is going to have to be devised to determine the workflow.

While the Board is bound to look at individual cases, particularly edge cases, what I imagine they will want to do, at any rate early on, is consider those which raise issues of wider significance so that their decisions can guide the company’s moderation/content policies. It might be a while before they look at some topics but you have to start somewhere.

In the beginning there will therefore be areas of uncertainty, but that is unavoidable when building any new system. We will face exactly the same challenges in the UK when our new Regulator starts developing codes of practice. However, as a body of decisions starts to evolve, a form of jurisprudence will also evolve, and in time that will provide greater certainty and predictability.

You cannot deliver a fully finished system with all the bells and whistles on Day 1. As long as there is scope for external judicial review, we need to give Facebook some space.

As for the speed at which issues are identified and decisions are taken, that is going to be altogether more challenging.

It will be important for the Board to have sensitive and smart “point people” linked to systems to spot things bubbling up in a particular jurisdiction. Then there will need to be Board members available to give a view within a reasonable, i.e. rapid, timescale.

Any repeat of the delay in deciding the fate of the picture of the Vietnamese girl, or of the postings on the Rohingya crisis, could fatally undermine the Board’s credibility.

Mission creep?

It will be interesting to see if the Board reach a view on how Facebook determines content questions, as it were, separately from wider operational principles, e.g. would they ever say “This content would be OK if we could be sure only adults were viewing it”?

Nothing ventured, nothing gained. I wish this project every possible success. I don’t imagine I will like every decision they are going to make, but that is not the point. If the process is seen to be fair, independent and in the hands of trusted and respected individuals, I can learn to live with it and so will the vast majority of people.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com