Companies that deploy tools such as PhotoDNA to detect, delete and report child sexual abuse material deserve a round of applause, though I am not sure how intense or prolonged the applause should be. We live in a strange world if we congratulate people simply for not aiding and abetting a crime, or if we praise them for not compounding or expanding the humiliation and degradation a child feels knowing that images of their own rape are being published and distributed, rapidly and at scale, across the whole world.
I get that, in many jurisdictions, laws on platform immunity mean companies might not be under a legal obligation to deploy tools like PhotoDNA. That is a point, but it is not the point. If you can do the right thing to protect children, why wouldn’t you?
That said, we need to go further. If we are committed to upholding children’s rights across the full spectrum, we all have to do better. If we intend to honour the letter and the spirit of the UNCRC, companies need to say: “Not only will we aggressively deal with child sexual abuse images; we also want to be clear that we will not help with the publication or distribution of any pictures or videos which infringe a child’s right to human dignity or privacy. And that applies whosoever may have posted the picture or video.”
Whether or not this involves expanding the role of hotlines or other reporting facilities is an extremely important question. But it is a different one.
How do we judge when an ostensibly legal image nevertheless infringes a child’s dignity and right to privacy? Yep, I can imagine that might pose tricky questions in particular circumstances or edge cases, but the first step towards addressing them is agreeing it is important to try. The whole tech world should say it wants to try.
We used to refer to these sorts of images as coming from a “grey area”. Not any more.