Let there be light

In September the New York Times published the first in a series of articles focusing on the internet industry’s response to the explosive growth in the detection of online child sex abuse material (csam).

They started with statistics supplied by the National Center for Missing and Exploited Children (NCMEC). In 1998 NCMEC received 3,000 reports of csam. In 2018 the number was 18.4 million, referencing 45 million still pictures and videos of csam.

A later article informed us that in 2013 fewer than 50,000 csam videos had been reported, whereas in 2018 the number was 22 million. Video has been the major area of growth. The Canadian Centre for Child Protection and the UK’s Internet Watch Foundation have witnessed similar increases.

Shocking though these numbers are, what they probably demonstrate is simply the increased proactive deployment and effectiveness of the tools used to detect csam by a comparatively small number of internet companies.

However, what the New York Times articles principally showed was the inadequacy of the wider internet industry’s response and indeed the inadequacy of the response of some of the industry’s leading actors. We have been led up the garden path.

If child safety and security were really embedded in a company’s culture, stories of the kind published by the New York Times would simply not be possible. Yet they have been appearing for years, if never before with such forensic detail.

The Technology Coalition

In 2006 the Technology Coalition was established. Here is its stated mission:

Our vision is to eradicate online child sexual exploitation. We have invested in collaborating and sharing expertise with one another, because we recognize that we have the same goals and face many of the same challenges.

This is the standard rubric. You hear it all the time. From everybody. It isn’t true.

Child Abusers Run Rampant as Tech Companies Look the Other Way

That was the headline for the second article in the New York Times series. It completely blows away the facade of an energetic, purposeful collective industry drive to get rid of csam from the internet.

Here are some extracts from the piece:

The companies have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of the material. Yet the industry does not take full advantage of the tools.

Specifically we were told

The largest social network in the world, Facebook, thoroughly scans its platforms, accounting for over 90 percent of the imagery flagged by tech companies last year, but the company is not using all available databases to detect the material… (emphasis added).

Apple does not scan its cloud storage… and encrypts its messaging app, making detection virtually impossible. Dropbox, Google and Microsoft’s consumer products scan for illegal images, but only when someone shares them, not when they are uploaded.

…other companies, including… Yahoo (owned by Verizon), look for photos but not videos, even though illicit video content has been exploding for years.

According to the Times:

There is no single list of hashes of images and videos all relevant companies can use.

Google and Facebook developed tools for detecting csam videos. They are incompatible. A plan to create a process for sharing video “fingerprints” (hashes that speed up detection) has seemingly “gone nowhere”.
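The matching principle behind these “fingerprints” is simple to illustrate. The sketch below is a hypothetical minimal example, not any company’s actual system: real tools such as Microsoft’s PhotoDNA use perceptual hashes that survive re-encoding and resizing, whereas this uses an exact cryptographic hash (SHA-256) purely to show how a shared database of hashes lets a platform recognise known material without storing the material itself.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes.

    Illustrative only: a cryptographic hash matches identical files,
    while production systems use perceptual hashes that tolerate
    re-compression and minor edits.
    """
    return hashlib.sha256(data).hexdigest()

# A shared industry database would, in essence, be a set of known hashes.
known_hashes = {fingerprint(b"previously-reported-image-bytes")}

def is_known_material(upload: bytes) -> bool:
    """Check an uploaded file's fingerprint against the shared database."""
    return fingerprint(upload) in known_hashes
```

The value of a single shared list, which the Times says does not exist, is that every participating company would be checking uploads against the same `known_hashes` set rather than its own incompatible one.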

There’s more:

Tech companies are far more likely to review photos and videos and other files on their platforms for… malware detection and copyright enforcement. But some businesses say looking for abuse content is different because it can raise significant privacy concerns.

Amazon, admittedly not a member of the Technology Coalition but the world’s largest provider of cloud services, scans for nothing.

A spokesman for Amazon… said that the “privacy of customer data is critical to earning our customers’ trust”… Microsoft Azure also said it did not scan for the material, citing similar reasons.

At some point it will be interesting to deconstruct what “customers’ trust” really means.

And we know all this because…

How did we learn of all this? Did it emerge as the result of open declarations by tech companies? Obviously not. Following careful analysis by a dedicated team of academics? No. Has the truth been exposed by a law enforcement body, NGO or governmental agency that finally decided omertà was not in the public interest? No.

We have gained these insights because the management of the New York Times decided to give two journalists, Michael Keller and Gabriel Dance, the space and the resources to pursue a self-evidently important story.

I met these guys for the first time when I visited the New York Times offices last Monday. However, I had first spoken to them in June. They had been investigating csam since February, flying around (literally), talking with a multitude of people, piecing things together from on the record and off the record sources.

It was a gigantic effort that made a commensurate splash on the paper’s front page. It seems to be having the desired effect.

A letter from five Senators

One immediate consequence of the New York Times articles emerged last week when five US Senators (two Democrats, three Republicans) wrote an impressively detailed letter to thirty-six tech companies. The recipients include all members of the Technology Coalition and plenty more besides. The Senators want answers by 4th December.

Let’s see how the companies respond. The letter contains all the right questions. They are precisely the kind tech businesses should be legally required to answer. Once the UK election is over let’s hope we can move swiftly to establish a strong regulator who can ask them, confident of receiving truthful replies. Any hesitation or refusal by the US companies to respond to the Senators’ letter will only add to the sense of urgency here.

The New York Times has helped children the world over

Children the world over owe Keller and Dance and their bosses a lot but it is little short of scandalous that it took a newspaper to let in the light. Where is the public interest body that has the resources and the ability to track and report consistently over time on matters of this kind? It does not exist. It should.

I have been arguing for ages that we need a Global Observatory, among other things to do routinely what the New York Times just did as a one-off. Somewhere there needs to be a properly resourced independent agency that has children’s interests at its heart and high tech industries in its sights. But such a body needs to be sustainable over time. That is a big and expensive thing to do. I’m going to have another go at doing it.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International, Technical Adviser to the European NGO Alliance for Child Safety Online, which is administered by Save the Children Italy and an Advisory Council Member of Beyond Borders (Canada). Amongst other things John is or has been an Adviser to the United Nations, ITU, the European Union, the Council of Europe and European Union Agency for Network and Information Security and is a former Board Member of the UK Council for Child Internet Safety. He is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John has advised many of the world's largest internet companies on online child safety. In June, 2012, John was appointed a Visiting Senior Fellow at the London School of Economics and Political Science. More: http://johncarrcv.blogspot.com