The price other people pay

I have just finished reading The Secret Rules of the Internet, an extremely interesting and insightful account of the realities of how moderation works in practice within large social media companies. It is rather long, but this appears to reflect the considerable amount of research that went into the writing. I feel embarrassed I hadn’t seen it before.

The sub-heading speaks of the murky history of moderation and how it’s shaping the future of free speech. However, it is by no means a dreary or predictable rant in favour of zero controls or maximum latitude to hurt, insult or offend.

I will not try to summarise the entirety of Secrets. It deserves to be read in full by everyone with a serious interest in internet policy. Look in particular at what it says, almost as an aside, about s.230 of the Communications Decency Act 1996. While there is a bit of good news, one aspect nevertheless conjured disturbing reflections.

Near the beginning of the text we find an account of a moderator who had to look at a video someone had posted of something horrible being done to a child in a hotel room. Ten years later the image still haunts her. The issue of the welfare of the people who do the actual moderating is a major theme.

We are reminded that, ostensibly in the name of free speech, there is a human price which has to be paid in order to provide platforms for what are, in reality, some really sick individuals. But it’s not a price routinely paid by the golden, wealthy elites who own or have senior positions in the companies giving this stuff an airing. Neither is it a human price normally paid by those who campaign so ferociously to defend the status quo. On the contrary. Moderation remains a relatively low-wage, low-status sector, often managed and staffed by women.

And guess what? A lot of it takes place offshore in developing world countries where the scourge of poverty drives people to do what few of us would ever do.

Secrets cites Digital Refuse by Sarah Roberts, in which the author shows us that the same places that receive the unwanted physical waste of the more affluent world are now also being sent “our” virtual toxins.

child abuse and pornography, crush porn, animal cruelty, acts of terror, and executions — images so extreme those paid to view them won’t even describe them in words to their loved ones…

…there they sit in crowded rooms at call centers, or alone, working off-site behind their screens and facing cyber-reality, as it is being created. Meanwhile, each new startup begins the process, essentially, all over again.

There is a reference to a Wired story from 2014 in which Adrian Chen documented the work of front-line moderators operating in modern-day sweatshops. In Manila, Chen witnessed a secret army of workers employed to soak up the worst of humanity.

However, in offices located elsewhere, we are told

To safeguard other employees from seeing the… images… (the moderators were) sequestered in corner offices; their rooms were kept dark and their computers were equipped with the largest screen protectors on the market.

Even so….

Members of the team quickly showed signs of stress — anxiety, drinking, trouble sleeping — and eventually managers brought in a therapist. As moderators described the images they saw each day, the therapist fell silent. The therapist… was “quite literally scared.”

I wonder how many therapists there are in Manila?

It is not all doom and gloom. Reassuringly, we are told

Some large established companies like YouTube, Pinterest, Emoderation, Facebook, and Twitter are beginning to make headway in improving moderation practices, using both tech and human solutions.

Let’s hope that is correct, but again we have to take it all on trust because everything is surrounded by secrecy.

Poor people in poor countries, desperate for work, are not often described as being the foot soldiers, the poor bloody infantry, of free speech. Yet that is the reality.

In another part of the forest and under a variety of guises, there has been an entirely proper focus on the supply chain companies use to manufacture or deliver their products or services. Typically these initiatives have been designed to eliminate child labour, slavery or environmental harms. Isn’t it time Silicon Valley was pressed to do something about the huddled masses who daily have to face the unfaceable?


About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.