States, csam and grooming. A proactive future?

The Council of Europe has brought out an extremely useful and timely report. Quoting directly from multiple sources, it draws together many of the principal legal instruments which bear on the obligations of states actively to protect children from sexual exploitation and abuse online.

The report does so within a very particular frame of reference. It examines the role automated technologies and tools can play in helping states discharge their obligations to children. Moreover, while the legal instruments cited apply directly to, and are generally enforceable against, states, it is clear that by extension the legal principles enunciated also apply to private entities, e.g. companies which operate within or from signatory states.

The report was prompted by the debacle in the EU over the e-Privacy Directive. Readers will recall the central issue there was the legality of actions being taken by businesses which, on a wholly voluntary basis, had been deploying automated tools to protect children from the distribution of child sexual abuse material (csam) and from grooming.

The Council’s report (p11) describes the various types of technologies currently available to deliver either or both of those objectives.

With already known csam, online businesses claim to have been using one or more automated tools since at least 2009 (when Microsoft released PhotoDNA). Recently, we are told, automated tools have also been used to detect suspected grooming behaviour and to pick up images which are likely to be csam but have not yet been confirmed as being such by human moderators.

But were these automated tools and technologies legal to begin with? Would they still be legal after the European Electronic Communications Code (Code) came into effect? Would any potential legal difficulties be resolved by the interim derogation from the Code proposed by the Commission? Those were the questions which sparked controversy.

The States Parties to the Lanzarote Convention asked the Council of Europe to prepare a report which looked at the challenges raised in the debate.

As you would expect, and as you can see from the title shown above, the Council of Europe was concerned to ensure that any and all actions taken in pursuit of the self-evidently desirable objective of protecting children nonetheless conformed with the Rule of Law, in this case human rights law.

The Council’s starting point is clear

On page 5 the following appears: “States have a positive obligation to protect children from sexual abuse and exploitation”.

For the avoidance of doubt, the report then provides the following non-exhaustive list of legal instruments and references in support of its unambiguous, unequivocal statement:

  • “the UN Convention on the Rights of the Child and its Optional Protocol on the Sale of Children, Child Prostitution and Child Pornography; 
  • the Council of Europe Convention on Human Rights, the European Social Charter and the Conventions on the protection of children against sexual exploitation and abuse, on Cybercrime and on Data protection (also known as Convention 108+);
  • the EU Directive 2002/58/EC of the European Parliament and of the Council (e-Privacy Directive) and the European Electronic Communications Code.

The relevance of the European jurisprudence is also highlighted through the analysis of the case law of the European Court of Human Rights and the European Union Court of Justice.”

I am not quite sure how this happened; it could simply be a matter of timing. But, referring back to the point about businesses, what was not cited in the report was paragraph 35 of General Comment No. 25, the relevant part of which reads: “Businesses should respect children’s rights and prevent and remedy abuse of their rights in relation to the digital environment. States parties have the obligation to ensure businesses meet those responsibilities.”

It is put equally crisply in the Council of Europe’s own Handbook for Policy Makers on page 69, where we find this:

“Member States have the obligation to secure to everyone within their jurisdiction, including children, the rights and freedoms that are laid down in international and European conventions. At the same time, business enterprises have a responsibility to respect these rights and freedoms. Together, States and business enterprises should aim at achieving the right balance between protecting children and ensuring equal access and opportunities of all children in the digital world.”

The report’s nine recommendations

On page 54 the Report’s recommendations begin:

Recommendation 1: Successful prevention and combating of the current forms of online child sexual exploitation and abuse (OCSEA) requires State actors to stay up to date and react to constant technological developments in this area, facilitated especially by the prevalent use of continuously evolving ICTs. The use of automated technology in the fight against OCSEA is, in this regard, essential.

Recommendation 2: To ensure a proper balance between privacy and protection of children against sexual exploitation and abuse, fostering a dialogue between private sector companies and policymakers/regulators is of the utmost importance. Such dialogue should primarily aim at securing adequate transparency on the choice of the technology used and the processes around its use.

Recommendation 3: Initiatives aiming at improving coordination in this area should be identified and supported as they are vital to the reliability of the reference databases. In this regard, it is also necessary to secure more clarity on how the accountability mechanisms are managed, including the recruitment and training of individuals employed by private sector companies who are responsible for the assessment of illegal content, such as CSAM.

Recommendation 4: To better maintain a balance between privacy and protection of children against sexual exploitation and abuse, defining the proper level of safeguards should take place as early as possible in the process of development of technology. Policymakers and regulators should place particular focus on the dataset used by that technology to train complex combinations of algorithms.

Recommendation 5: In order to enhance privacy while prioritising protection of children against sexual exploitation and abuse, it is necessary to promote technological solutions that are the most efficient for the purpose considered.

Recommendation 6: Initiatives oriented towards cross-sectoral dialogue should be identified and supported.

Recommendation 7: The weight that is accorded to positive obligations against OCSEA under international and European human rights law, bearing in mind the best interest of the child, needs adequate appreciation in the legislative debate going forward.

Recommendation 8: Acknowledging the current legal lacunae, consideration should be given by CoE Member States to the need for a harmonised and sustainable legal framework which can provide legal certainty to SPs [service providers] and address future technological developments.

Recommendation 9: The CoE Member States are strongly encouraged, in line with their positive obligations to protect children against OCSEA, to establish a public interest-based framework grounded in the Lanzarote Convention, enabling SPs to automatically detect, remove, report and transfer OCSEA-related information under data protection and privacy conditions and safeguards…

Fighting yesterday’s battle?

That sub-heading is not intended as a criticism because I recognise, without reservation, the great scholarship and thought that has gone into preparing the report. With its abundant references it will be an immensely useful tool for many child rights advocates in the months and years ahead. But let me quote from the report.

On page 5 this appears:

“While recognising the benefits a mandatory regime could bring, this report focuses on the practice of voluntary detection and voluntary reporting of OCSEA by service providers… As a consequence of this approach, the choice of technological solutions analysed in this document was limited to this context”. (emphasis added)

And on page 8:

“…recognising the benefits a mandatory regime could bring in terms of legal clarity and certainty… this report focuses on the practice of voluntary detection and voluntary reporting…” (emphasis added).

In other words, the report looked at the status quo and what was going on yesterday.

Things have moved on. We need to know more about the options for tomorrow, their strengths and limitations, both legal and technical. As the report itself hints in the two quotes I have just given (and in many others I have not), voluntarism poses many problems and a mandatory approach is likely to be better.

Villains and angels alike

If we have learned anything from the 25-year-plus failed experiment with internet self-regulation, it is that unless something is mandatory too many companies will not do it at all or, even allowing for the overriding principle of proportionality, their levels of commitment will vary enormously. There will be inconsistencies which owe nothing to alleged complexities or differences between platforms but owe everything to differences in the attitudes and priorities of senior managements in different companies.

And if there is scope to shelter behind any degree of legal immunity or uncertainty it will provide a safe haven for villains and angels alike.

Moreover, absent an acceptable transparency regime (a point referred to many times in the Council of Europe report), how can we be confident anyway that we know what companies are doing or have done, even those with self-affixed halos permanently hovering above their heads?

Meanwhile back at the ranch

Going full circle to the reason for the report, the European Commission has since published its proposal for a Digital Services Act (DSA). In Article 6 the Commission makes clear that relevant service providers will not lose any legal immunities “solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to illegal content…”.

Grooming not covered?

Article 6 thus clarifies and establishes a legal basis for what many had assumed was the status quo, which is good, but by referring only to “content” it appears not to include attempts to detect illegal behaviour such as grooming. Which is not so good.

Article 7 is the big problem

Article 7 expressly says:

“No general monitoring or active fact-finding obligations

No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.”

Why is the Commission suggesting platforms be allowed to be indolent or indifferent to any law-breaking which they are facilitating? Why is it saying this when it knows it is online intermediaries which are the largest facilitators of online crimes against children?

Elsewhere in the DSA there are nudges which will encourage companies, especially the larger ones, to look into illegal content and behaviour which threatens children, particularly where (as would almost always be the case) this breaches their own terms and conditions of service. And yet, and yet.

Article 7 – a magnet and a multiplier for the feckless and reckless

Surely where the technical means exist to detect illegal behaviour towards children, their use should be mandatory, not optional?

I say that while remaining mindful of the overarching rule of proportionality and insisting always that all privacy rules are properly respected.

If or when stronger rules governing safety by design kick in, one would expect the need to mandate protective measures of this kind to diminish, because the risks will have been designed out of the devices or apps. However, we are a long way from reaching that happy state.

It’s important to get things in perspective

When I walk into a public building I normally have to go through a metal detector. If it spots any metal I have to stand ready to show or explain what it is. If I have no metal, I have to show or explain nothing and nothing is recorded. No data about me is processed. I remain data intacta (to coin a phrase).

It’s the same at airports with my suitcases and backpack. Sniffer dogs and scanners are looking for suspected contraband or prohibited items. Are my rights to privacy being violated? I don’t think so, or if they are it is minimally and for a wider social good. If airport officials or the police zero in on me, they do so because the scanning or sniffing provided reasonable grounds to suggest I might be a law-breaker. Absent such indications I go on my way, again data intacta.

If we can reproduce systems such as these online, why wouldn’t we? If it is only because of a lack of trust in the supervisory arrangements which would need to guarantee the integrity of the processes involved, we should address that, not use it as an alibi for inaction.

Deal with it later this year? Hmm

We are all being told that “later this year” the Commission will publish a proposal to establish a longer-term policy on the points raised around the interim derogation and, more widely, on the strategy to combat child sexual abuse, online and off. All the points I have just mentioned can be dealt with then. Seemingly. But really?

A few things to say about that:

  1. “Later this year” is a moveable feast.
  2. “Later this year” MEPs and others could be sick to death of anything and everything digital. Their attention might slip as other pressing matters compete for their time.
  3. “Later this year” will happen against a background where we know the Commission started infringement proceedings against 23 of the then 28 EU Member States for not properly implementing the 2011 Directive.
  4. Are we seriously being asked to believe that only months after the European institutions decided not to impose any obligations to “actively seek facts or circumstances indicating illegal activity” they will reverse themselves and allow a cross-cutting and overarching exemption for children? I have my doubts, so let’s have it out now, or at least put down major markers for when the EU institutions do come to consider a way forward. Assuming they actually do.
  5. In other words “later this year” could mean the current, highly unsatisfactory status quo might continue for a very long time. We do not want that to happen.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is, or has been, an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised. http://johncarrcv.blogspot.com