Last month the OECD published an interesting document entitled “Transparency reporting on child sexual exploitation and abuse”.
It’s an instructive read and the OECD should be congratulated for doing it.
They looked at the practices of the world’s top 50 content-sharing services. We’re talking about the likes of Dropbox, Snapchat, Discord, Twitch, Google, Meta, TikTok, Reddit and so on. All 50 are listed.
Here are some headline points from the report.
“…*only* 10 of the 50 services define CSEA with sufficient detail to understand what is prohibited on their services, and *only* 20 of the services issue a transparency report on CSEA.” (emphasis added)
Not a great start.
However, among those services that do reference CSEA there are
“significant variations in what behaviour is captured in their definitions, and the metrics, methodology and frequency of transparency reports differ across platforms.”
The OECD notes that
“While good practices exist, the report reveals a fragmented response to this complex and evolving problem, which limits comparability and makes it challenging to conduct a thorough assessment of the overall impact of platforms’ efforts to combat CSEA.”
Independent verification?
What I didn’t see discussed was whether any of the published transparency reports (TRs) had been, or were routinely, verified or vouched for by an independent, trustworthy external agency of some kind. Or were the companies marking their own homework?
That said, unless we discover there is something positively deceptive or manipulative going on among the twenty apparent good guys, publishing a transparency report is likely a lot better than publishing nothing at all. The thirty who do nothing have more to answer for than the twenty who do something.
Some other key extracts
- Fifteen of the 20 TRs provide data on the proactive rate of detection, but just three of these give detail on how CSEA violations are further classified or categorised; the rest rely on aggregate statistics rather than breakdowns by specific services, features or offences.
- Twenty-nine of the top 50 online content-sharing services state that they deploy a combination of staff, automated tools and community user reporting to detect CSEA content on their platforms. The remaining 21 services provide limited or no information about their approach to monitoring compliance on their platforms.
- Concerning CSEA in particular, just 16 of the top 50 services provide detailed information about their detection methods. Thirteen of the top 50 provide less detailed information, either in their policy and governance documents or in their respective TRs.
Which 20 issue reports?
See table 2-2 on page 17. They include most of the big ones you would expect.
Which 30 do not issue reports?
Same table, same page. I guess the big one that doesn’t issue a report is Apple, but since they encrypt all their traffic, how could they anyway? Many of the services listed among the 30 non-publishers are not all that widely used in the UK and Europe, but Flickr, Baidu, Tumblr and Vimeo are in there, and they are.
Transparency is vital
Without transparency we are truly lost. And on matters of this kind, after all we have been through, it is unreasonable to expect anyone to take transparency statements wholly on trust.