The events leading up to the 2017 death by suicide of Molly Russell are being forensically examined in the North London Coroner’s Court. Meta and Pinterest are in the dock. The reports make difficult reading, not just because of the concentrated focus they bring to a known and identified child, but also because of what they say about children not represented in the courtroom who shared similar fates. The true number of victims will likely never be known.
First, I have to say my heart goes out to the Russell family. I cannot imagine what it must be like to sit in that Inquest listening to the detailed evidence about everything their daughter witnessed and was likely to have experienced up to the moment she took her own life. Even a professional child psychiatrist acknowledged that, after viewing some of the content to which Molly had been exposed, they had difficulty sleeping for a few weeks.
Second, my admiration for the family knows no bounds. They didn’t let it go. Overwhelmed by grief, many of us would have shrunk to nothing, turned inwards and crawled away. Not so the Russells. Their grief has transformed itself into a righteous mission. Established in Molly’s memory, the Molly Rose Foundation provides targeted support to under-25s who are contemplating taking their own lives.
Then there’s the Inquest itself. It is expected to last two weeks, and the extremely high level of attention it has drawn from the world’s media is further testament to the serious and detailed way Molly’s people have been working over several years. They have engaged an incredibly strong legal team to make sure no stone is left unturned.
Set against Meta’s Goliath, the Russells are certainly David. I doubt Meta will come tumbling down but, with regulating the internet on the agendas of so many governments and legislatures around the world, the echoes of what is said will be greatly magnified.
The third and fourth thoughts I have sort of merge in the middle and become hard to separate. There’s the cold-hearted effrontery of Meta, who, even in the face of tragedy, sought to defend their actions, suggesting they too had a righteous mission. That mission was “to enable users to cry for help”.
Meta’s representative was asked if the company had undertaken any research into the impact of self-harm content on users. Seemingly she was not aware of any, adding that such research would have been difficult to conduct. “The impact of certain material can affect people in different ways at different times . . . It’s really complicated”.
Self-evidently not complicated enough to give Meta pause for thought. Not complicated enough to invoke the precautionary principle.
We’ve been here before. I remember years ago Facebook deliberately left up a video showing an obviously mentally ill mother beating a ten-month-old child while her four- or five-year-old son looked on.
The company defended their decision, saying the video became a focus for public outrage and for campaigns for better mental health services. Not one children’s organization that I knew of anywhere in the world thought leaving the video up was in the public interest or in children’s interests. But we all knew that video was making money for Facebook. Horror sells. Weird sells. Shocking sells. That’s no place for vulnerable children, and if you know, or ought to know, that vulnerable children are among your users, you must act accordingly. Plainly Facebook didn’t.
When it was put to Facebook’s field staff that, within the company, the commercial arguments for leaving the video up might have counted for more than any other considerations, we were met with hurt looks and expressions of injured feelings. How could anyone imagine that they, Mother Teresa and Albert Schweitzer reincarnate and made flesh, would even think about money when issues of that kind were being discussed?
Instagram is not a psychiatrist’s couch or consulting room. It is not a counselling service. No service provided by Meta or its predecessor is or was. Meta is an advertising platform which makes money by attracting people and keeping them there. I repeat. Horror sells. Weird sells. Shocking sells. That’s why the company “likes” it.
The s230 harm
Which brings me to my final, if rather obvious, point. Why did Meta think it could behave in the way it did? Because s230 gave it legal cover. If any civil or criminal liability could have been traced back to Meta, does anyone seriously think they would have allowed any of this to happen? Wouldn’t they have invoked a precautionary principle of their own?
Opponents of reforming s230 speak about the “chilling effect” it might have, meaning the way it might make platform owners more cautious about what they allow and prohibit on their sites and services.
Would that be such a bad thing?
So policies were changed. Eventually.
After Molly’s death, Instagram changed its policies because “experts advised it that graphic self-harm imagery could encourage users to hurt themselves.”
Words do not fail me, but I am not going to use the ones prominent in my mind right now.
s230 is a licence for recklessness. It is a shame it took the death of a child to force the company to listen.