The importance of Clause 36(3), money and general monitoring

OK. I am going to shout it out loud, or rather I am going to put it in writing, in public, which is sort of the same thing.

There is a great deal in the UK’s Online Safety Bill (OSB) I like. A lot. Stuff we have been campaigning for over many years. However, it is also clear “le diable sera dans le détail” or, as in this case, “les codes de pratique et règlements” (the devil will be in the detail or, here, the codes of practice and regulations).

If you don’t mind being accused of being, er, a poseur, if you are going to say something utterly banal it probably helps to say it in a foreign language. It suggests this is no ordinary, banal banality.

In other words, on top of what appears on the face of the Bill, the success of the OSB is, in no small measure, going to be determined by a whole series of codes of practice and regulations which Ofcom and the Secretary of State will draw up. Remember “whole series”. I will return to it. But first:

Clause 36(3)

Clause 36(3) of the OSB tells us why, in particular, the codes of practice matter:

“A provider… is to be treated as complying with [the] safety duties for services likely to be accessed by children…if the provider takes the steps described in a code of practice…”

The OSB says similar things in respect of other codes that will be published on reporting, record-keeping and transparency duties, terrorist content, legal but harmful content, and the like. Codes of practice and regulations are going to carry a heavy burden. For now I will focus on children-related dimensions.

Thus, in terms of legal compliance and liability it seems if platforms do what the codes prescribe they will retain the same broad legal immunity which up to now has protected all intermediaries, irrespective of their size. The OSB does not expressly say that but broad immunity is an established part of the background radiation (the eCommerce Directive?) so at least one eminent lawyer believes that to be the case.

I have no quarrel with that. In my view, if a platform meets the terms of the OSB, the codes and regulations, they are entitled to retain broad immunity in relation to items posted by third parties where, prior to notification or discovery, they had no knowledge.

After all, the codes will be detailed and will decisively shape the behaviour of intermediaries. Turning to child sexual abuse material, for example, there is no doubt or ambiguity in relation to precisely what is expected of platforms (see below).

The logic of the codes of practice

And if an intermediary does not follow the codes, regulations or the terms expressly stated in the OSB? What then?

There will be a system of fines and other penalties. These are set out in the OSB or will be in what follows. However, the likely effectiveness of these fines and penalties is being argued about, not least because of doubts about Ofcom’s ability or inclination to mount and sustain an enforcement regime on the scale required.

The risk is obvious. If platforms conclude Ofcom is a paper tiger, or is so overstretched that they have little to fear any time soon, we will have failed.

Platforms must believe there is a serious risk they could be turned over, held accountable, and not in the far distant future.

Ofcom needs an ally. Children need an insurance policy. I have one.

No compliance? Lose the relevant immunity.

Thus, for the avoidance of doubt, somewhere in the OSB it should be made explicit that where a platform governed by a code of practice or other regulations fails to honour the terms, not only could it become subject to the penalties the OSB will usher in, it will also forfeit any and all criminal and civil immunities from which it would otherwise have benefitted.

To be clear: I am not suggesting if platforms fail to honour the terms of a code or regulations they forfeit all immunities in respect of everything they do. That would be unreasonable.

But where a reasonably foreseeable actual harm has resulted or is alleged to have resulted from a failure to implement the terms of a code or regulations then whoever can be said to have been injured as a result should be free to bring an action which would previously have been barred or would have failed because of the immunity. The immunity is therefore lost only insofar as it concerns and is limited to the reasonably foreseeable harm suffered by an identifiable individual or group.

Something like this would focus the mind of every Director or senior manager of every platform. It would also relieve Ofcom of a great deal of the responsibility for ensuring online businesses are routinely following the law rather than just hoping they never get caught or inspected or, if they are, that it will be some time hence, when today’s culprits might already have vanished with the loot.

“Whole series”. Big burden

It is apparent we will soon be seeing a raft of draft codes of practice which Ofcom has to prepare. Doubtless there will also be drafts issued by the Secretary of State in relation to his powers and obligations.

No problem. In principle. But… how will things work in practice?

A vast army of in-house and trade association lawyers and many lawyers in firms hired to supplement them are going to be able to buy their second or third yachts off the back of the work on the consultation and implementation of these codes and related regulations. Some of the preparatory analysis will already have happened and be feeding into Big Tech’s extremely well-funded lobbying strategies.

Money

So how is civil society’s voice going to be heard? I know of no children’s organization in the UK which has the capacity to engage with these processes to anything like the degree that is going to be required or for the period of time entailed.

Every children’s charity is strapped for cash. A great many charitable foundations that sometimes step into the breach are similarly having a hard time. Yet if the proposed new regime is to work to best effect and in the way the Government intends, Ofcom or someone other than Big Tech needs to provide some cash.

I am not suggesting we can ever achieve a level playing field as between children’s organizations, the civil service and Big Tech but something must be done to ensure the tables are not so vertiginously tipped against children’s interests being represented. The processes which lie ahead are going to require a sustained level of detailed engagement.

If there were already an industry levy which Ofcom administered, that would be the obvious solution. But there isn’t, so maybe, as the OSB progresses through Parliament, the Government can address this vital question.

General monitoring? No.

One of the supposedly sacrosanct articles of faith of internet governance hitherto has been that intermediaries should be under no obligation to undertake “general monitoring”. It first appeared in the USA courtesy of s230 of the CDA. We copied it in the eCommerce Directive of 2000. It lay at the root of much that later went wrong for children online, although it took some time for us all to realise it. However, once we did realise it there was no excuse for sticking with it. Yet that is precisely what the EU appears intent on doing.

In the EU’s draft proposal for a new Digital Services Act (DSA) the immunity provisions are repeated eight times e.g. as here on page 13.

“The proposed legislation will preserve the prohibition of general monitoring obligations of the e-Commerce Directive, which in itself is crucial to the required fair balance of fundamental rights in the online world.”

It is then further elaborated and developed in Article 7:

“Article 7

No general monitoring or active fact-finding obligations

No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.” (emphasis added)

The emphasised words, “nor actively to seek facts or circumstances indicating illegal activity”, are a remarkable thing for any organization to say if it also wants to claim it is concerned with upholding the rule of law. I paraphrase:

“Dudes. Chill. You don’t have to try and find out if any criminals are using your facilities to abuse children. Nah. Spend more time on the beach. Or innovating. Your choice. No pressure.”

The UK is going its own and better way

I am very pleased to say the UK’s OSB does not repeat the archaic and ridiculous formula of the EU’s proposed Article 7.

But make no mistake, neither does the UK impose a “general monitoring” duty. It solves the problem in a different way, by imposing quite specific, targeted objectives and requirements.

Here’s an example. Clause 21(2) of the OSB sets out the duties which all platforms have in respect of illegal content, of which child sexual exploitation and abuse (CSEA) is a priority category. Providers must take

“proportionate steps to mitigate and effectively manage the risks of harm to individuals, as identified in the … illegal content risk assessment.”

In 21(3) in respect of providers of search facilities, the wording is even more explicit. They have a duty to:

“minimise the risk of individuals encountering priority illegal content.”

Is that an instruction to engage in general monitoring? No it is not. It is an instruction to use available, reliable and privacy-respecting technical tools to detect known illegal content.
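For the technically curious, here is a minimal sketch, in Python, of what such a tool looks like. Everything in it is an assumption for illustration: the hash list is a placeholder for the kind of verified lists hotlines supply to industry, and SHA-256 stands in for a proprietary perceptual hash such as PhotoDNA.

```python
import hashlib

# Hypothetical reference set of fingerprints of known, human-verified illegal
# images, of the kind hotlines supply to industry. Entries are placeholders.
KNOWN_ILLEGAL_HASHES = {"d4c3b2a1..."}

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as PhotoDNA.

    A real perceptual hash survives resizing and re-encoding; SHA-256 is
    used here only to keep the sketch self-contained and runnable.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_illegal(image_bytes: bytes) -> bool:
    """True only on a match against the reference list.

    Note what this does not do: it does not read communications, classify
    new content or retain anything about non-matching items.
    """
    return fingerprint(image_bytes) in KNOWN_ILLEGAL_HASHES
```

The essential property is in the last function: it answers one yes/no question about already known material and nothing more.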

What could be wrong with that?


Age verification in the EU moving forward

The Advisory Board of euConsent held its first meeting last week.

euConsent aims to deliver a framework of standards which will encourage the development of a pan-European network of providers of online age verification and parental consent mechanisms.

Hugely important – global significance

It is hard to overstate the impact this project could have on the way the internet is used not just in Europe but potentially around the world. Many honest efforts by Regulators to protect children online have found it difficult to solve several key challenges which are rooted in the transnational nature of the medium. One of the most obvious and pressing concerned age verification. The European Commission recognised a pan-European solution is required. They ran a competition to select a team to tackle the problem. euConsent was the result.

Highest levels of data security

With euConsent a solution is in sight. Users will be able to verify their age or give consent for their children to use a site or service without disclosing their identity. All age verification providers who are part of the network will be independently audited to certified levels of assurance. Lawmakers, services, and Regulators can choose how and where the requirements will be applied. All providers will operate to the highest standards of data security.
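The euConsent standards were not public at the time of writing, so the Python sketch below is emphatically not their design. It is a hypothetical illustration of the underlying idea: an audited provider issues a signed attestation carrying an age band and no identity attributes, which a relying site can verify. The key, function names and token format are all invented, and a real network would use asymmetric signatures and certified providers.

```python
import base64
import hashlib
import hmac
import json
import time
from typing import Optional

# Hypothetical shared secret between an audited age-verification provider
# and a relying site. Illustrative only; NOT the euConsent design.
PROVIDER_KEY = b"demo-key-not-for-production"

def issue_age_token(over_18: bool) -> str:
    """Provider side: attest to an age band and nothing else.

    Deliberately contains no name, date of birth or identifier. The point
    is proving an age attribute without disclosing identity.
    """
    claims = {"over_18": over_18, "iat": int(time.time())}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(PROVIDER_KEY, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify_age_token(token: str) -> Optional[dict]:
    """Relying-site side: check the signature, learn only the age band."""
    try:
        body, sig = token.rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(PROVIDER_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None
    return json.loads(base64.urlsafe_b64decode(body))
```

The design point the sketch tries to capture is that the site learns one attribute, the age band, while the provider, not the site, holds whatever evidence established it.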

If it is to be a success such an important project needs vigorous and rigorous scrutiny as it progresses through its different phases.  An Advisory Board has been established and I agreed to be its Chair.  The Board comprises representatives of a wide range of stakeholders: European regulatory authorities, children’s rights organizations, tech companies and politicians.  We held our inaugural meeting last Friday.

The Board will hold the project team accountable, helping them as they establish the standards. The Board’s collective and individual insights will contribute to a system that is workable with existing technology and facilitates the creation and implementation of effective regulations. Any new technologies which may emerge will know what they must be able to do if they are to be recognised as an acceptable tool.

Research evidence

Our first meeting was very encouraging. The initial research phase of euConsent has been conducted by academics from Leiden University, Aston University and the London School of Economics and Political Science, supplemented by further work from the Age Verification Providers’ Association, and the research firm Revealing Reality. These groups presented their key findings to the Advisory Board who were impressed by the scope of what has been done so far. Board member Anna Morgan, Deputy Commissioner at the Irish Data Protection Commission, found the evidence-based foundations of the project really promising. Almudena Lara of Google was pleased the opinions of children themselves are being sought and listened to in the research conducted by Revealing Reality.

Having such a spread of experts all gathered in the same Zoom produced a series of lively interchanges which were immensely valuable! Even at this early stage some key issues were raised. Negotiating the tension between data privacy and child protection lies at the heart of what we are trying to do, and how to cope with the already existing different regulatory approaches across jurisdictions is no less important.

I am looking forward to engaging with the Advisory Board further as euConsent’s technical solutions are developed and released over the coming months.


States, csam and grooming. A proactive future?

The Council of Europe has brought out an extremely useful and timely report. Quoting directly from multiple sources it draws together many of the principal legal instruments which have a bearing on the obligations of states actively to protect children from sexual exploitation and abuse online.

The report does so within a very particular frame of reference. It examines the role automated technologies and tools can play in helping states discharge their obligations to children. Moreover, while the legal instruments cited apply directly to and are generally enforceable against states, it is clear by extension the legal principles enunciated apply also to private entities, e.g. companies which operate within or from signatory states.

The report was prompted by the debacle in the EU over the e-Privacy Directive. Readers will recall the central issue there was the legality of actions being taken by businesses which, on a wholly voluntary basis, had been deploying automated tools to protect children in respect of the distribution of child sexual abuse material (csam) and grooming.

The Council’s report (p11) describes the various types of technologies currently available to deliver either or both of those objectives.

With already known csam, online businesses claim to have been using one or more automated tools since at least 2009 (when Microsoft released PhotoDNA). Recently, we are told, automated tools have also been used to detect suspected grooming behaviour and to pick up images which are likely to be csam but have not yet been confirmed as being such by human moderators.
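Schematically, and only schematically, the division of labour between those tool families might look like the Python sketch below. The threshold, function names and routing are hypothetical; the one property taken from the paragraph above is that classifier output is queued for a human moderator to confirm rather than treated as a finding. Grooming detection follows the same pattern, with a text classifier over conversations in place of the image model.

```python
from enum import Enum, auto

SUSPICION_THRESHOLD = 0.9  # hypothetical; real deployments tune this carefully

class Route(Enum):
    BLOCK_AND_REPORT = auto()  # matched the verified known-csam hash list
    HUMAN_REVIEW = auto()      # classifier suspects new, unconfirmed material
    NO_ACTION = auto()

def matches_known_hash_list(image_bytes: bytes) -> bool:
    """Stub for a PhotoDNA-style lookup against a verified hash list."""
    return False

def classifier_score(image_bytes: bytes) -> float:
    """Stub for a machine-learning model scoring images as likely csam."""
    return 0.0

def route_image(image_bytes: bytes) -> Route:
    # Known material: a hash match against verified entries is decisive.
    if matches_known_hash_list(image_bytes):
        return Route.BLOCK_AND_REPORT
    # Unconfirmed material: the classifier only queues the image; a human
    # moderator confirms before it is treated as csam.
    if classifier_score(image_bytes) > SUSPICION_THRESHOLD:
        return Route.HUMAN_REVIEW
    return Route.NO_ACTION
```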

But were these automated tools and technologies legal to begin with? Would they still be legal after the European Electronic Communications Code (Code) came into effect? Would any potential legal difficulties be resolved by the interim derogation from the Code proposed by the Commission? Those were the questions which sparked controversy.

The States Parties to the Lanzarote Convention asked the Council of Europe to prepare a report which looked at the challenges raised in the debate.

As you would expect, and as you can see from the title shown above, the Council of Europe was concerned to ensure any and all actions taken in pursuit of the self-evidently desirable objective of protecting children nonetheless conformed with the Rule of Law, in this case human rights law.

The Council’s starting point is clear

On page 5 the following appears: “States have a positive obligation to protect children from sexual abuse and exploitation.”

For the avoidance of doubt the report then provided the following non-exhaustive list of legal instruments and references in support of its unambiguous, unequivocal statement.

  • “the UN Convention on the Rights of the Child and its Optional Protocol on the Sale of Children, Child Prostitution and Child Pornography; 
  • the Council of Europe Convention on Human Rights, the European Social Charter and the Conventions on the protection of children against sexual exploitation and abuse, on Cybercrime  and on Data protection (also known as Convention 108+);
  • the EU Directive 2002/58/EC of the European Parliament and of the Council (e-Privacy Directive) and the European Electronic Communications Code.

The relevance of the European jurisprudence is also highlighted through the analysis of the case law of the European Court of Human Rights and the European Union Court of Justice.”

I am not quite sure how this happened, and it could simply be a matter of timing, but, referring back to the point about businesses, what was not cited in the report was paragraph 35 of General Comment No. 25, the relevant bit of which reads: “Businesses should respect children’s rights and prevent and remedy abuse of their rights in relation to the digital environment. States parties have the obligation to ensure businesses meet those responsibilities.”

It is put equally crisply in the Council of Europe’s own Handbook for Policy Makers on page 69 where we find this:

“Member States have the obligation to secure to everyone within their jurisdiction, including children, the rights and freedoms that are laid down in international and European conventions. At the same time, business enterprises have a responsibility to respect these rights and freedoms. Together, States and business enterprises should aim at achieving the right balance between protecting children and ensuring equal access and opportunities of all children in the digital world.”

The report’s nine recommendations

On page 54 the Report’s recommendations begin:

Recommendation 1: Successful prevention and combating of the current forms of online child sexual exploitation and abuse (OCSEA) requires State actors to stay up to date and react to constant technological developments in this area, facilitated especially by the prevalent use of continuously evolving ICTs. The use of automated technology in the fight against OCSEA is, in this regard, essential.

Recommendation 2: To ensure a proper balance between privacy and protection of children against sexual exploitation and abuse fostering a dialogue between private sector companies and policymakers/regulators is of the utmost importance. Such dialogue should primarily aim at securing adequate transparency on the choice of the technology used and processes around its use.  

Recommendation 3: Initiatives aiming at improving coordination in this area should be indicated and supported as they are vital to the reliability of the reference databases. In this regard, it is also necessary to secure more clarity on how the accountability mechanisms are managed, including the recruitment and training of individuals employed by private sector companies who are responsible for the assessment of illegal content, such as CSAM. 

Recommendation 4: To better maintain a balance between privacy and protection of children against sexual exploitation and abuse, defining the proper level of safeguards should take place as early as possible in the process of development of technology. Policymakers and regulators should place particular focus on the dataset used by that technology to train complex combinations of algorithms.

Recommendation 5: In order to enhance privacy while prioritizing protection of children against sexual exploitation and abuse it is necessary to promote technological solutions that are the most efficient for the purpose considered. 

Recommendation 6: Initiatives oriented at cross-sectional dialogue should be identified and supported. 

Recommendation 7: The weight that is accorded to positive obligations against OCSEA under international and European human rights law, bearing in mind the best interest of the child, needs adequate appreciation in the legislative debate going forward.

Recommendation 8: Acknowledging the current legal lacunae, consideration should be given by CoE Member States to the need for a harmonised and sustainable legal framework which can provide legal certainty to SPs and address future technological developments. 

Recommendation 9: The CoE Member States are strongly encouraged, in line with their positive obligations to protect children against OCSEA, to establish a public interest-based framework grounded in the Lanzarote Convention, enabling SPs to automatically detect, remove, report and transfer OCSEA-related information under data protection and privacy conditions and safeguards…

Fighting yesterday’s battle?

That sub-heading is not intended as a criticism because I recognise, without reservation, the great scholarship and thought that has gone into preparing the report. With its abundant references it will be an immensely useful tool for many child rights advocates in the months and years ahead. But let me quote from the report.

On page 5 this appears:

“While recognising the benefits a mandatory regime could bring, this report focuses on the practice of voluntary detection and voluntary reporting of OCSEA by service providers… As a consequence of this approach, the choice of technological solutions analysed in this document was limited to this context”. (emphasis added)

And on page 8:

“…recognising the benefits a mandatory regime could bring in terms of legal clarity and certainty, ……. this report focuses on the practice of voluntary detection and voluntary reporting…” (emphasis added).

In other words the report looked at the status quo and what was going on yesterday.

Things have moved on. We need to know more about the options for tomorrow, their strengths and limitations, both legal and technical. As the report itself hints in the two quotes I have just given (and in many others I have not), voluntarism poses many problems and a mandatory approach is likely to be better.

Villains and angels alike

If we have learned anything from the 25-year-plus failed experiment with internet self-regulation it is that unless something is mandatory too many companies will not do it at all or, even allowing for the overriding principle of proportionality, their levels of commitment will vary enormously. There will be inconsistencies which owe nothing to alleged complexities or differences between platforms but owe everything to differences in the attitudes and priorities of senior managements in different companies.

And if there is scope to shelter behind any degree of legal immunity or uncertainty it will provide a safe haven for villains and angels alike.

Moreover, absent an acceptable transparency regime (a point referred to many times in the Council of Europe report) how can we be confident anyway that we know what companies are doing or have done, even those with self-affixed halos permanently hovering above their heads?

Meanwhile back at the ranch

Going full circle to the reason for the report, the European Commission has since published its proposal for a Digital Services Act (DSA). In Article 6 they make clear relevant service providers will not lose any legal immunities “solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to illegal content…”.

Grooming not covered?

Article 6 thus clarifies and establishes a legal basis for what many had assumed was the status quo, which is good, but by referring only to “content” it appears not to include attempts to detect illegal behaviour such as grooming. Which is not so good.

Article 7 is the big problem

Article 7 expressly says

No general monitoring or active fact-finding obligations

“No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.”

Why is the Commission suggesting platforms are allowed to be indolent or indifferent to any law breaking which they are facilitating? Why are they saying this when they know it is online intermediaries which are the largest facilitators of online crimes against children?

Elsewhere in the DSA there are nudges which will encourage companies, especially the larger ones, to look into illegal content and behaviour which threatens children, particularly where (as would almost always be the case) this breaches their own terms and conditions of service. And yet, and yet.

Article 7 – a magnet and a multiplier for the feckless and reckless

Surely where the technical means exist which would allow the detection of illegal behaviour towards children, its use should be mandatory not optional?

I say that while remaining mindful of the overarching rule of proportionality and insisting always that all privacy rules are properly respected.

If or when stronger rules governing safety by design kick in, one would expect the need to mandate protective measures of this kind to diminish, because the risks will have been designed out of the devices or Apps. However, we are a long way from reaching that happy state.

It’s important to get things in perspective

When I walk into a public building I normally have to go through a metal detector. If it spots any metal I have to stand ready to show or explain what it is. Otherwise, if I have no metal, I have to show or explain nothing and nothing is recorded. No data about me is processed. I remain data intacta (to coin a phrase).

It’s the same at airports with my suitcases and backpack. Sniffer dogs and scanners are looking for suspected contraband or prohibited items. Are my rights to privacy being violated? I don’t think so, or if they are it is minimally and for a wider social good. If airport officials or the police zone in on me they are doing so because the scanning or sniffing provided reasonable grounds to suggest I might be a law-breaker. Absent such indications I go on my way, again data intacta.

If we can reproduce systems such as these online why wouldn’t we? If it is only because of a lack of trust in the supervisory arrangements which would need to guarantee the integrity of the processes involved we should address that not use it as an alibi for inaction.
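Translated into software terms, the metal-detector analogy is a one-rule design: scan in transit, persist nothing on a pass. A minimal Python sketch, with the handler name hypothetical:

```python
from typing import Callable

def escalate_for_lawful_review(item_bytes: bytes) -> None:
    """Hypothetical handler: pass a positive match to a supervised process."""
    ...

def scan_and_forget(item_bytes: bytes,
                    detector: Callable[[bytes], bool]) -> None:
    """The metal-detector pattern: inspect in transit, keep nothing on a pass.

    `detector` is any predicate, for example a known-hash lookup. If it
    returns False the function simply returns, having logged nothing and
    stored nothing about the item or its owner: the online equivalent of
    walking through the arch data intacta. Only a match creates any data.
    """
    if detector(item_bytes):
        escalate_for_lawful_review(item_bytes)
    # deliberately no else branch: nothing recorded, nothing retained
```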

Deal with it later this year? Hmm

We are all being told that “later this year” the Commission will publish a proposal to establish a policy for the longer term concerning the points raised around the interim derogation and more widely in respect of the strategy to combat child sexual abuse, online and off. All the points I have just mentioned can be dealt with then. Seemingly. But really?

A few things to say about that:

  1. “Later this year” is a moveable feast.
  2. “Later this year” MEPs and others could be sick to death of anything and everything digital. Their attention might slip as other pressing matters compete for their time.
  3. “Later this year” will happen against a background where we know the Commission started infringement proceedings against 23 of the then 28 EU Member States for not properly implementing the 2011 Directive.
  4. Are we seriously being asked to believe that only months after European Institutions decided not to impose any obligations to “actively seek facts or circumstances indicating illegal activity” they will then reverse themselves and allow a cross-cutting and overarching exemption for children? I have my doubts so let’s have it out now or at least put down major markers for when the EU institutions do come to consider a way forward. Assuming they actually do.
  5. In other words “later this year” could mean the current, highly unsatisfactory status quo might continue for a very long time. We do not want that to happen.


Safety by Design takes a big step forward

“Put it out Tuesday. Get it right by version three.” According to Professor Ross Anderson this was the dominant way of thinking in Big Tech long before Mark Zuckerberg’s updated nostrum, “move fast and break things”. Anderson suggests this approach became prevalent in the industry, and IMHO continues to be, because of the advantages gained from the famous “network effects”. Being first or early can be decisive in building up market share, providing a springboard for further investment and development.

If all you are doing is writing software for a machine to make better sausages I can step aside and leave you to it. Probably. But if you are designing products or services intended or likely to get into the hands of children you cannot be afforded any such leeway.

You would have thought, with products or services aimed at children or likely to be used by them, nobody would really need to be told to “think it through – look at all the angles, watch out for possible pitfalls, dangers to children, and stop them from happening.” Yet how many disasters have we had to face, particularly around the internet of things, even including toys? Let’s not get into the “adjustments” Google, Facebook, TikTok, Snapchat and the rest have had to introduce. By which I mean retrofit.

From today no tech company anywhere in the world has an excuse not to get it right first time. They need only look to Australia. The country’s e-Safety Commissioner has produced an absolutely stunning set of easy-to-use, free-to-use tools. Because they are aimed specifically at products or services which provide opportunities for social interaction they embrace internet users of every age and disposition but their particular relevance to children hardly needs further elaboration.

I have a sneaking feeling the tools will be used by every type of business or organization that works in digital spaces or builds products which can connect to the internet. To quote Commissioner Inman Grant they help “embed safety into the culture, ethos and operations” and those are things everyone should be concerned about in all types of undertakings.

The e-safety packages from Down Under are destined to become a must-have. There’s one aimed at start-ups and one for the mid-tier or enterprise level. Industry was closely involved in preparing the tools so there is zero fat or flab; they are highly tailored to be optimally usable by a wide range of people engaged in bringing products to market.

Henceforth, as part of due diligence, every would-be investor will or should demand to know that what they are being asked to put their money into, and lend their name and reputation to, is following the processes and steps which the Antipodeans take you through with exquisite care and attention to detail.

I have had a demo. Companies and not-for-profits will be beating a path to the Aussies’ virtual door. Or they will if they have any sense.


General Comment 25 – an event of global significance

The United Nations Convention on the Rights of the Child is widely accepted as the cornerstone or foundational document in respect of children’s rights. Every country in the world bar one has ratified the Convention. The highly regrettable exception is the United States. The reasons are complicated; nevertheless the USA has endorsed two of the Convention’s Optional Protocols. However, the fact that it has not fully signed up for the main event illustrates an important point.

Merely adopting the Convention should not be taken as a guarantee that in every signatory country the position of children cannot be improved. By the same token, not signing it cannot be taken to imply children in that country have no rights and are therefore bound to suffer an endless cycle of privations.

The Convention enshrines and describes legal rights but it is not written in what is commonly understood to be legal language. It is a political document written in political language, adopted by politicians in a political institution, the United Nations. It sets out eternal values and timeless principles. It does so using high level language. That is because the UN, of all places, recognises we live in a world dominated by nation states with a rich variety of cultures.

The real significance of the Convention is therefore twofold. It establishes standards by which any and every country can or ought to be judged and held accountable both by the international community and by their own citizens. And it acts as an important reference point or guide to national governments, private entities and individuals everywhere, perhaps particularly or above all children themselves.

The Convention is a pre-internet document

The Convention was adopted in 1989 following a gestation period of at least ten years. The authors could not have foreseen how children’s lives were about to be significantly impacted by the digital revolution which was just around the corner. Nobody could have. Nobody did.

Today if we were starting from scratch in key places the language we would use would be different. Maybe not dramatically different, but definitely different.

The General Comment rides to the rescue

Any idea of changing the text of the Convention was viewed as impractical. It would just take too long and with the current strains in geo-politics who knows where it might end up anyway. Nevertheless, recognising the importance of updating the context within which the Convention should be read by affected parties, the Committee on the Rights of the Child, the UN’s guardians of the Convention, began a process of writing what we now refer to as “General Comment 25”. Actually its full title is “General comment No. 25 (2021) on children’s rights in relation to the digital environment.”

It took over four years to prepare, consult on and adopt the General Comment. The Committee was helped and advised by some very smart and knowledgeable people led by Beeban Kidron of 5Rights, with Sonia Livingstone as the lead author, aided by Gerison Lansdown, Jutta Croll and Amanda Third who, in turn, talked to other very smart and knowledgeable people from 28 different countries, involving hundreds of organizations and hundreds of children. The final text was agreed and published in March 2021. 5Rights produced an excellent commentary.

There seems little point in me reciting the content of the General Comment. If you go to the link provided above you will see it is well laid out, presented in highly accessible language and it is not very long. The writers had to conform to the UN’s prescribed standard length for documents of this kind.

The General Comment makes explicit that children’s rights apply in the digital environment every bit as much as they do in the physical world. Adjustments have to be made to accommodate the manifest differences between the two spaces but to the greatest extent possible there should otherwise be alignment. The notion of “internet exceptionalism” is expressly rejected.

In concluding I refer to what I think is one of the most important bits of General Comment 25.

Section I, paragraphs 36-39, draws on earlier platitudinous puffs from elsewhere but sharpens things up in a completely new and brilliant way, neatly reflecting much that appears in the Council of Europe’s “Handbook for policy makers on the rights of the child in the digital environment”. Sonia Livingstone is the common link between the two with, in the latter case, myself and Professor Eva Lievens also lending a hand.

Here is 36-39 in full:

Children’s Rights and the Business Sector

36. States parties should take measures, including through the development, monitoring, implementation and evaluation of legislation, regulations and policies, to ensure compliance by businesses with their obligations to prevent their networks or online services from being used in ways that cause or contribute to violations or abuses of children’s rights, including their rights to privacy and protection, and to provide children, parents and caregivers with prompt and effective remedies. They should also encourage businesses to provide public information and accessible and timely advice to support children’s safe and beneficial digital activities.

37. States parties have a duty to protect children from infringements of their rights by business enterprises, including the right to be protected from all forms of violence in the digital environment. Although businesses may not be directly involved in perpetrating harmful acts, they can cause or contribute to violations of children’s right to freedom from violence, including through the design and operation of digital services. States parties should put in place, monitor and enforce laws and regulations aimed at preventing violations of the right to protection from violence, as well as those aimed at investigating, adjudicating on and redressing violations as they occur in relation to the digital environment.

38. States parties should require the business sector to undertake child rights due diligence, in particular to carry out child rights impact assessments and disclose them to the public, with special consideration given to the differentiated and, at times, severe impacts of the digital environment on children. They should take appropriate steps to prevent, monitor, investigate and punish child rights abuses by businesses.

39. In addition to developing legislation and policies, States parties should require all businesses that affect children’s rights in relation to the digital environment to implement regulatory frameworks, industry codes and terms of services that adhere to the highest standards of ethics, privacy and safety in relation to the design, engineering, development, operation, distribution and marketing of their products and services. That includes businesses that target children, have children as end users or otherwise affect children. They should require such businesses to maintain high standards of transparency and accountability and encourage them to take measures to innovate in the best interests of the child. They should also require the provision of age-appropriate explanations to children, or to parents and caregivers for very young children, of their terms of service.


Shameless self-promotion. By me.

My dad was 13 years old when the Second World War broke out. He was Jewish and lived in Poland. Yet he survived. I have written a book about how he did that. Brought out last year by a small Edinburgh-based publisher, the book is now being republished in the UK and English-speaking Commonwealth by Hodder. On 24th June it is coming out on Kindle and as an audiobook on Audible, narrated by Sir Simon Russell Beale. On 29th July the hardback will be available. Pre-ordering is open now. It will also be available in translation in, so far, six European languages (Spanish, Italian, Dutch, Danish, Romanian and Polish) with more to come, I believe. There will be a US edition in hardback in March next year.

Hope you order it, like it, write reviews on Amazon and elsewhere about it, and tell your friends. I trust you will not only find the story interesting and exciting but you will also pick up on the many modern resonances.

Now back to the day job with apologies for this outburst of shameless self-promotion.


Let’s talk about strong encryption. Again.

I hate to be unfair to Facebook (please don’t tell anyone I wrote that) but they are in the major line of fire in the encryption debate only because we know the scale of apparently criminal behaviour taking place on two of their major messaging platforms, namely Instagram Direct and Facebook Messenger. At least we know it in respect of offending against children because Facebook has for several years deployed tools to detect and report it. And the scale is huge. Which is why what Facebook does in this area is so important.

I make this point because when we finally reach an agreement it should be an agreement which embraces all messaging platforms, not just Facebook. Level playing fields matter. We must not build in commercial incentives to help crooks.

Is it likely Facebook is singularly afflicted? Er, no. Last week we got an insight into why I say that, thanks to the cops in Australia, the FBI and other national police services.

The police created a dummy messaging App and customised phones

Operation Greenlight/Trojan Shield involved creating an App called “ANOM” which was put on customised phones then marketed to criminal organizations and individuals by a major underworld figure who had been “persuaded” or paid to help law enforcement.

Planning for the operation began in 2017. Police closed down two messaging platforms that were known to be being used by large numbers of criminals. This created an opening in the market for ANOM and when the moment arrived it was ready.

The customised phones could not send or receive phone calls. They had no camera. The only usable App on them was ANOM and it could only message other ANOM users. As the Australian Federal Police put it, “Criminals needed to know a criminal to get a device.” Eventually 11,800 of them were distributed.
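Press accounts describe the architecture rather than the code, which has never been published, so the following toy Python model is only an illustration of the principle: a closed messenger that delivers messages normally while copying every plaintext, in real time, to a monitoring feed. All names are invented.

```python
from typing import Callable, Dict, List

class AnomStyleRelay:
    """Toy model of a closed messenger that silently copies every message.

    Illustrative only: ANOM's real implementation has never been published.
    The architectural point is that users saw an ordinary encrypted
    messenger while the service forwarded each plaintext, as it was sent,
    to a monitoring endpoint.
    """

    def __init__(self, monitor: Callable[[str, str, str], None]):
        self._monitor = monitor                  # hypothetical law-enforcement feed
        self._inboxes: Dict[str, List[str]] = {}

    def send(self, sender: str, recipient: str, plaintext: str) -> None:
        # Deliver to the recipient, as any messaging service would...
        self._inboxes.setdefault(recipient, []).append(plaintext)
        # ...and copy the same message to the monitor at the same moment.
        self._monitor(sender, recipient, plaintext)

# Example: here the "monitor" just prints; in the real operation it was
# a feed read by investigators.
relay = AnomStyleRelay(monitor=lambda s, r, m: print(f"[monitor] {s}->{r}: {m}"))
relay.send("userA", "userB", "meet at the usual place")
```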

Big scale police action

Reports of the results of the police action seem to vary slightly between different media outlets but they are all roughly in the same ballpark.

My numbers come mainly from the Washington Post. Police officers in 17 countries took part in the operation which stretched over eighteen months. During this period 27 million messages were exchanged via ANOM and the FBI saw every single one. They could see them in real time.

9,000 police officers were involved in sifting the messages which were exchanged in 45 different languages. The countries with the most users were Germany, the Netherlands, Spain, Australia and Serbia although, in Europe, the largest number of arrests was made in Sweden (75). In total over 800 people were arrested in 17 different countries. The Australian police made 225 arrests. As one police officer put it, an important objective of the action was to undermine criminals’ confidence in messaging apps. I think they have achieved that in spades.

And what did the police see?

As a result of their access to the messages the senders and receivers thought were safely encrypted, law enforcement officers were able to seize eight tons of cocaine, twenty-two tons of marijuana, two tons of methamphetamine and amphetamines, 250 firearms, 55 luxury vehicles and more than US$48 million in cash and cryptocurrencies. Murders and kidnappings that were being planned never happened. Nine police officers were arrested because they were found to be in cahoots with the bad guys.

A telling touch

The creators of ANOM even came up with a marvellous marketing strapline to help move the product: “Enforce Your Right To Privacy”. Yes, even the criminal underworld is susceptible to good messaging.

What more eloquent but depressing testimony could there be? How have we allowed an important human right, privacy, to become so grotesquely abused and used by some of the world’s worst purveyors of evil and death?


Legal threat to the UK’s Information Commissioner

This should not have been necessary, but I’m afraid it is.

We all know about the damage exposure to pornography does to children. There are no serious voices being raised against that proposition, particularly in view of the types of pornography which have become standard fare on the internet of late. Playboy centrefolds they are not.

Earlier this week we learned of, and saw documented, another manifestation of this damage. Ofsted reported on the large scale sexual harassment of girls taking place in schools and, commenting on it, the Chief Inspector was very clear when she said the Government needs to “look at… the ease with which children can access pornography”. In modern parlance that means porn on the internet. The Chief Inspector added that sexual harassment, including online sexual abuse, had become “normalised” for children and young people. How did it ever come to this?

To borrow a phrase from the Venerable Gail Dines, girls in school have become victims of the “pornification” of our culture. The internet has played a decisive part, not the only part, but a decisive one, in helping create these lamentable conditions.

We all know children, often very young children, are accessing porn sites in gigantic numbers, either driven by natural curiosity or by accident. The porn sites know this. But they continue to receive and process children’s data with the entirely predictable result that this helps draw children back to them time and again. You cannot separate the fact of unlawful data processing from its consequences. This is not a theological or wholly abstract offence.

The porn sites are fully aware

Despite being fully aware of this, the porn sites take zero meaningful steps to prevent children gaining access. This was why, in June 2020, I wrote to the UK’s Information Commissioner asking her to call them to account. The Commissioner declined, giving what was, in my view, a political answer, not a legal one.

Maybe I should have pursued it at the time but the fearless souls at CEASE have taken up the cudgels. Please see their letter threatening legal action and their plea for financial support. I hope you can help.

The CEASE letter quotes my correspondence and the Commissioner’s reply. Here is my original and the reply. And my reply to the reply.

There is something not quite right about a country (us), rightfully proud of being the first in the world to adopt an “Age Appropriate Design Code”, putting it under the aegis of the Information Commissioner, only then to be told it was not intended to help with stuff that is unarguably age inappropriate.


Warriors win. Children win

Three days ago, on Wednesday 9th June, the Canadian Centre for Child Protection (C3P) published a brilliant report containing details of their work over the two-year period 2018-2020. I blogged about it the next day.

In the two years covered in their report we learned C3P had seen and verified 5.4 million child sex abuse images and, in respect of them, issued take down notices to over 760 Electronic Service Providers in all parts of the world. 97% of all the images were hosted on the clear web, not hidden away anywhere, and 48% of the notices issued related to images which had already been flagged at least once to the hosting company concerned.

We also learned, inter alia, the Canadians identified a single telecoms company, based in France, responsible for fully 48% of all the child sex abuse material referenced. They named the company. It is called “Free”.

On Thursday 10th June Forbes published an article based on the Canadian report. In the Forbes story they named French billionaire Xavier Niel and published his picture while informing us he owned 70% of Free’s parent group, a company called Iliad.

Free had hosted 1.1 million csam files in 2018-2020. These had triggered 2.7 million notices. The most likely root cause of the problem was Free’s policy of allowing anonymous users. Obviously there was no suggestion Niel personally or senior executives were aware of any of this but that hardly constitutes a defence when you think of the pain and misery their lack of diligence had caused and continued to cause over many years.

Yesterday, 11th June I received an email from Wonder Woman, whose real world identity is CEO of C3P. Not many people know that so keep it to yourselves.

In the email I was informed:

“As of yesterday, Free’s file-hosting service no longer allows anonymous users to upload content — only Free account holders have access to the service now. We believe this has effectively eliminated this service as a means of online CSAM distribution. In addition to this, all 6,500 archive files, containing more than 2 million images and over 35,000 videos, that were still live prior to the release of the report have been deleted from their file-hosting service”.

What can I say?

Could someone remind me why it is important to “play nice” and refrain from naming and shaming? Yet again we see truth is our most important weapon and truth loses its potency, power and purpose if it is kept hidden away.

This is not just a feather in the cap of C3P. It’s a whole Golden Eagle. And well done Forbes for riding shotgun. Children in general and survivors in particular are forever in the debt of both.


Canada puts another nail in the coffin of self-regulation

The Canadian Centre for Child Protection (C3P) is justly famous for many things. One of them is the quality of their research. It is based on two vital, interdependent pillars.

First is a deep understanding of the position and needs of survivors of child sex abuse, particularly those who have had the additional misfortune of appearing in a picture or video which depicts their abuse where the picture or video has also been circulated online.

Second is C3P’s exceptionally strong grounding in the technologies used to make, publish and distribute child sex abuse material (csam) over all parts of the internet.

More evidence of C3P’s top class work became available yesterday when they published their long-awaited report: “Project Arachnid: Online availability of child sex abuse material”. Its nearly 60 pages do not make easy reading (there is an Executive Summary as well) but it is essential reading for anyone engaged in trying to rid the world of the scourge of csam.

The period covered is 2018-2020. Not a vast span but that only serves to underline the scale of what we are facing. And when you look at the report see what C3P say about their backlog. Scary stuff.

Enormous numbers

In the two years under review C3P examined and verified 5.4 million images and issued take down notices to over 760 electronic service providers (ESPs) in all parts of the world.

Qu’est-ce que vous allez faire, Monsieur le Président ? (What are you going to do, Mr President?)

Astonishingly, C3P found that fully 48% of all the material identified was linked to a single, French telecommunications company. The G7 is starting in the UK today. President Macron will be there. I wonder if any journalists will tackle him on this and, if so, what will he say? I expect he will be absolutely horrified because there is no doubt his Administration has been making several moves in the right direction and we expect to see even more.

A dark web problem? Emphatically not

You see all kinds of people rolling their eyes and talking about the dark web, encryption and a variety of subterranean exotica as if they were therefore already resigned to being powerless to do anything about csam. But the unvarnished truth is 97% of the csam detected by C3P was in the clear web. So far from being intractable, online csam is highly tractable. What has been missing is the will on the part of far too many ESPs.

And a massive number of repeats

Perhaps even more shockingly, 48% of all the take down notices issued by C3P related to images which they had already flagged as illegal to the same provider. That is truly shameful because the technology exists, and is widely available, which would allow any ESP to detect already known images and prevent them ever seeing the light of day again, at least not on a publicly accessible web page. Table 7 of the report (p 38) shows “recidivism” rates going up, not down. And clock Table 7.3 for the names of the companies involved.
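The remedy for that kind of “recidivism” is not exotic. A provider that simply recorded the fingerprint of every image it removed after a notice could refuse the same image automatically thereafter. A minimal Python sketch, with the class name invented and SHA-256 standing in for a re-encoding-resistant perceptual hash:

```python
import hashlib

class TakedownLedger:
    """Remembers every image a provider has removed following a notice.

    The absence of this simple discipline is what produces repeat notices:
    once an image has been flagged and removed, its fingerprint should
    block any re-upload automatically. A real system would use a
    perceptual hash that survives re-encoding; SHA-256 merely keeps the
    sketch self-contained and runnable.
    """

    def __init__(self) -> None:
        self._removed_hashes: set = set()

    def record_takedown(self, image_bytes: bytes) -> None:
        self._removed_hashes.add(hashlib.sha256(image_bytes).hexdigest())

    def permits_upload(self, image_bytes: bytes) -> bool:
        return hashlib.sha256(image_bytes).hexdigest() not in self._removed_hashes
```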

Why don’t more companies use the technology that would allow them to detect csam in milliseconds? Because they don’t have to. No law requires it and this, maybe more than anything else, reminds us why self-regulation – voluntarism – has had its day.

Too slow, too slow

C3P says that, following the issue of a take down notice, the median removal time for the item concerned is less than 24 hours. Bravo! But in 10% of cases it took more than seven weeks for the illegal material to go.

That is utterly unacceptable and again is a product of voluntarism. And, by the way, it seems the delays are longest where the images concerned involve older adolescents. This conjures up several unpleasant thoughts about non-expert Sysadmins second-guessing child protection experts while leaving a child’s sexually abusive image on view until they conclude their own internal interrogation. Not on.

Change is gonna come

From page 48 onwards C3P’s recommendations will be instantly recognisable by children’s advocates in the UK and the EU and in many other parts of the world.

Big Tech obstructionism created space for too many bad guys to flourish

Any and every major thinker in the internet space has known this moment would arrive – the end of voluntarism – but too many industry leaders were determined to drag things out as long as possible to keep the mountains of cash rolling in. It was this obstructionism by Big Tech and their deliberate delaying tactics which created the space for myriad unsavoury characters to hide in the shadows. Until now. Thank you C3P. Keep it up.
