A couple of weeks ago I had the honour to be in Winnipeg to attend an extended seminar organized by the Canadian Centre for Child Protection (C3P).
In the room or connecting remotely were some of the world’s leading thinkers and practitioners engaged in protecting children from the worst forms of internet-based sexual crime. The event was held under the Chatham House Rule, so I cannot tell you who said what.
There were lots of important, detailed take-aways from the seminar which will resonate and ripple outwards globally for some time to come. More of which anon, but for me three headlines stood out.
The first was in the keynote opening address from C3P’s charismatic and energetic CEO (the Chatham House Rule does not apply to the person who organized the event!).
The CEO’s point was brutally simple:
No excuses
“There is no reason why any already known image of child sex abuse should ever be re-uploaded to or exchanged by any internet service anywhere in the world.”
Why? Because the technology which makes it possible to prevent such re-uploads or exchanges already exists. It is well-established, with no known instances of error resulting in anyone suffering any kind of legal or reputational harm.
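The technology referred to here is, in essence, hash matching: a platform computes a fingerprint of every uploaded file and refuses any file whose fingerprint appears on a list of already-confirmed abuse images. Below is a minimal sketch in Python, with hypothetical names and values throughout. Real deployments use robust perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding, where the plain SHA-256 used here for illustration only catches exact byte-for-byte copies.

```python
import hashlib

# Hypothetical blocklist: fingerprints of already-confirmed abuse images, of
# the kind hotlines and clearing houses supply to platforms. (Illustrative
# value only.)
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(data: bytes) -> str:
    # Real systems use robust perceptual hashes (e.g. PhotoDNA) that survive
    # re-encoding; SHA-256 here only matches exact byte copies.
    return hashlib.sha256(data).hexdigest()

def allow_upload(data: bytes) -> bool:
    # Check the fingerprint before the file is ever stored or shared.
    return image_fingerprint(data) not in KNOWN_IMAGE_HASHES

# Usage: called in the upload path, before anything goes live.
if not allow_upload(b"incoming file bytes"):
    print("Blocked: matches a known image; it is never re-uploaded.")
```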
That being the case, at least part (only part) of the notice-and-takedown processes practised by or through hotlines and law enforcement agencies since the 1990s ought to be redundant: mothballed, or at any rate used a lot less. But that is not the situation in which we find ourselves.
The bottom line comes top
Too many platforms are still not deploying the tech at all, or not well enough, or not consistently. Why? Because they don’t have to. They are therefore free to spend their cash and management time on other stuff, and they do. It’s usually stuff which makes money rather than costs it. Doing your best to protect children is not an unavoidable requirement of doing business online. So they don’t.
Older images matter
Among law enforcement agencies and child protection experts there is a very understandable emphasis on identifying previously unseen images or abusive behaviour. “New” suggests there is a child out there who is currently being abused, has been abused recently, is in imminent danger or is in some other kind of on-going difficulty (for example caused by the distribution of a seemingly self-generated image).
Finding a new image and victim may lead to a previously unknown or recidivist perpetrator who, in turn, may lead investigators to other hitherto undiscovered victims. It may therefore prevent fresh abuse being planned against these other children, or against the very same child.
In addition, and crucially, if a new child can be rapidly identified and located there is at least the possibility the abuse can be brought to an end or prevented and the child helped or supported in a way which maximises the chances of achieving a good recovery or minimises the damage already done.
I am not saying that law enforcement and child protection agencies are only interested in new images or behaviour, but this emphasis feeds into another set of genuine concerns.
Scale should threaten nobody
Older images could be helping to create or sustain new offenders, thus putting children as yet unharmed at risk. But the children depicted in the older images are themselves, as often as not, no longer children.
They are adults, maybe in their 30s or 40s, and as such they fall outside the immediate purview and urgent concerns of traditional children’s organizations. For law enforcement agencies they represent a volume of potential cases which can consume huge amounts of time and resources to process, when the “only” or principal likely outcome is an arrest or a caution for downloading.
That is not an unimportant outcome but, in a world of scarce resources, police officers will weigh it a lot less heavily than trying to locate and protect children currently in danger or recently abused.
With older images the child will often already have been identified and located or they never will be. The perpetrators will already have been identified or located, or never will be. That’s how and why older known images fall down the list.
Yet what if you are the person in the older image?
You are that child who, 15 or 20 years ago, was sexually abused while an image was made of the abuse. You know or believe it is still being viewed every minute of the day somewhere or other on the internet, maybe even by people living on the same street as you, or by people you could easily bump into in the shopping mall, or who might interview you when you are seeking employment or a place at college. That hurts. A lot.
I have met young adults whose continuing hurt sits alongside justifiable anger because they know things do not have to be that way. These young adults are not victims of an unknowing or a technologically incapable or innocent yesteryear. They are being allowed to be revictimized by contemporary indifference. That hurts even more.
For the pain of these young women and men to be cauterized, and to have a chance of being driven out or at least reduced, they have to know that they or their legal representatives will soon stop receiving notices from law enforcement telling them images of their humiliation are still being found. That has to be our collective goal, and it is within our grasp because, to repeat the point for the last time in this blog, the tools exist to make it possible.
So no more protests about whack-a-mole, please
The second Winnipeg headline tells me the police and the tech companies need to stop complaining about the number of reports of CSAM. I have a strong feeling both would like to limit, control, or at any rate substantially reduce the volumes, if for different reasons.
In the case of the police it is to avoid the embarrassment of being thought impotent in the face of such horrible crimes. In the case of the companies just to avoid the embarrassment. Period, as they say on the other side of the pond.
Such attitudes confuse two entirely different concerns which can and should stand separate from each other.
We all get the difficulties law enforcement agencies face because of the number of CSAM reports coming in from hotlines or companies. Better ways of managing the processes may well be advantageous to everybody. I doubt all of the systems or definitions being used are perfect. Things rarely are.
Moreover, just as there is no reason why an existing known image need be re-uploaded or exchanged, so there is no reason in principle why a police officer anywhere in the world need be detained by, or have to process, an image from a case which has already been closed (bar identifying whoever on their patch recently downloaded it again).
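The same hash lists would make that sorting mechanical. Here is a minimal sketch, assuming reports arrive tagged with the image’s hash and a register of closed-case hashes is available (the names and fields are hypothetical): matches go to a lightweight repeat-download queue instead of an investigator’s desk.

```python
# Hypothetical triage step, with illustrative field names: reports whose image
# hash belongs to an already-closed case skip the investigation queue.
CLOSED_CASE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def triage(reports: list[dict]) -> tuple[list[dict], list[dict]]:
    investigate, repeat_downloads = [], []
    for report in reports:
        if report["image_hash"] in CLOSED_CASE_HASHES:
            # Victim and perpetrator already dealt with; all that may remain
            # is following up whoever on the local patch downloaded it again.
            repeat_downloads.append(report)
        else:
            investigate.append(report)
    return investigate, repeat_downloads
```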
But one thing we must never do is artificially limit or manipulate the system, or suppress or withhold information about the volumes, just to save face on the part of either the cops or the companies. Managing workloads or PR is not my problem. Protecting children is.
Victims, like the rest of us, would certainly like the police to have more resources so that the need for triage could be reduced or eliminated entirely, but there remains a paramount concern: getting the images of abused children off the internet rapidly. The faster the better. The longer the delay, the greater the harm done. And there is no need for any delay if the images were never uploaded or, having slipped through the net, are found.
Arrests, prosecutions and admin processing can catch up. Young people’s lives can’t.
The companies can look after themselves.
Legal but disgusting
The third Winnipeg headline was linked to some of the images C3P showed us. These were images which they had repeatedly reported to online platforms and repeatedly been told were not illegal, did not violate their terms of service and therefore were staying up. “Awful but lawful” was the phrase used to describe them. “Legal but harmful” in UK and EU parlance.
They were utterly disgusting even if not instantly, overtly sexual (which would have taken them across the legal line). The faces of the children were plainly visible. The what-comes-next-or-soon scenarios were not hard to guess, but anyway the humiliation of the children was as plain as plain could be, standing as a stern rebuke to the companies which sought to defend the images’ presence on their sites or services on the grounds of free speech or artistic expression. A principle without a heart is no principle at all. It most certainly is not a sustainable principle.
How any decent human being could look at images of that kind and still say “no, we will leave them up” beggars belief. I suggest any such platform in future amends its Ts&Cs to include the following paragraph, which it should be obliged to put on its home page and repeat at least ten times during the onboarding process, and periodically thereafter in case anyone forgets.
An invitation and a warning
“Because we value free speech and artistic expression so highly, and to the exclusion of all other considerations, we want you to know that if you join or remain part of our service you may well be confronted by or receive images of children in various states of undress or humiliation which, while in our view they stay just on the right side of the law, most people will find utterly repellent, disgusting and horrifying.
We may not like them either, but we are quite happy and willing to help you publish any you might have, or help you find similar ones to add to your collection. We disclaim all responsibility for any of the likely or predictable consequences of our value-free commitment to publishing. This is what pays the rent, and to us that is much more important than not causing pain to someone else, even a child. And btw we can only do this because self-regulation allows it. Long may that continue.”
And the good news is
The UK’s Online Safety Bill will address all of these matters and put an end to them, at least in the UK bit of the internet. Watch out for my next blog. It will appear soon.