The puzzle deepens

In June 2020, I wrote to Ms Elizabeth Denham, then the Information Commissioner and head of the Information Commissioner’s Office (ICO), asking her to act to protect children from pornography web sites because the sites are illegally processing children’s data. She declined. I protested. Meanwhile the Age Verification Providers Association began a High Court action for a judicial review of the Government’s decision to abandon the implementation of Part 3 of the Digital Economy Act 2017.

Initially the case looked promising. I stood back, waiting to see if it would produce a result that would deliver the same or similar goods. Eventually it became clear it wouldn’t.

Soon after, in May 2021, we learned another application for a judicial review of the Government’s decision on Part 3 was getting underway. The timescale for the case to be resolved was, and remains, unclear, and in any case nobody should bet the farm on a particular outcome. One might say something similar in respect of the Online Safety Bill. We still haven’t seen the final text and still don’t know when it will begin its legislative journey. And then there are the unavoidable post-legislative delays as new regulations are drawn up and put in place, and enforcement begins.

Long story short, in November 2021 I wrote to the ICO again. In December I received a reply. It contained another refusal to act to protect children.

My reply to that reply is shown below, but I can take a hint. I don’t expect this correspondence to continue.

The ICO appears to have invented a wholly new concept. It certainly has no explicit basis in data protection law. It is counterintuitive. It defies common sense. I am referring to the notion of “content harms”. The ICO obviously believes these so-called “content harms” are important because it relies on them as its reason for doing nothing.

But from what other kinds of harms are “content harms” to be distinguished, exactly? Or is harm of any kind a completely irrelevant or trivial consideration?

It appears as if the ICO believes they have no mandate to act to protect children from pornography sites because, according to them, the Age Appropriate Design Code was never intended to cover “content harms”.   

However, even if that is true, the Code is not the beginning and end of any and every conversation about children’s data. If the ICO intends to continue saying it is, they really ought to spell it out on their web site. In accessible or any other kind of English, they will struggle to explain why, when it comes to the protection of children, they take a greater interest in Lego than in Pornhub.

The ICO also suggests I am asking them to “take the lead” in protecting children. That would be no bad thing, but that is not what I am doing. I am only asking them to do their job. Data are being illegally processed on a large scale, shamelessly and in plain sight. Isn’t that enough to set the wheels in motion? Doesn’t the ICO’s persistent refusal to act in such a clear-cut case risk undermining public confidence in it as an institution?

A new Information Commissioner took up post this week. I hope this will prompt a fresh consideration of the matter.

Dear ICO,

Many thanks for your letter of 17 December 2021 on behalf of the Commissioner. It was very disappointing, not least because you failed to address what I thought were the most important points I had made in my letter to you of 30 November. In fact, you do not even refer to them.

I have no desire to keep banging my head against a brick wall and no interest in prolonging this discussion, but before closing the correspondence I hope you will forgive me if I pick up on a few of the more glaringly problematic aspects of your position.

In a couple of days I will publish this letter and your December letter on my blog. Thus, I am making these final points for my readers to think about, not in the expectation of troubling you to compose yet another reply, particularly if your conclusion remains unaltered.

You say, “Where we are strongly in agreement is that the ICO’s Children’s Code plays a key role in delivering protections for children,” but most of your letter extols the many, various and undisputed virtues of the Children’s Code (aka the Age Appropriate Design Code) while at the same time explaining why you think it affords no protection to children in respect of porn sites.

I doubt anyone ever anticipated the possibility of the Children’s Code being used to justify a refusal to act to protect children.  Moreover, I note I am not the only one who seems to have taken a different view of the intent and ambit of the Code. I refer to para 222 of the recently published report of the Parliamentary Joint Scrutiny Committee which examined the draft Online Safety Bill.

Para 222 says:

“Whilst there is a case for specific provisions in the Bill relating to pornography, we feel there is more to be gained by further aligning the Bill with the Age Appropriate Design Code… this… would only bring within the scope of the Bill services already covered by the scope of the Age Appropriate Design Code… This step would address the specific concern around pornography, requiring all such sites to demonstrate that they have taken appropriate steps to prevent children from accessing their content. It would also bring other sites or services that create a risk of harm into scope whilst bringing us closer to the goal of aligned online regulation across data protection and online safety…” (emphasis added).

I very much hope that, on the face of the Online Safety Bill, pornography will be designated as a specific category in its own right. The designation should be linked to a requirement for any site or service offering, allowing or publishing pornography, in whatsoever form or forum, to have an age verification system in place.

In the Bill the obligation to act to protect children from porn should be unqualified, not contingent upon meeting any other regulatory requirements or thresholds. If that means Ofcom’s and the ICO’s Codes are identical then so much the better. 

For these purposes, the UK and the 27 EU Member States have exactly the same data protection laws, grounded in the GDPR.  There can be no doubt the data protection authorities in the 27 jurisdictions understand they have a power to act against any sites which illegally process children’s data. None of the 27 have invented a new category of “content harms” to allow them to disavow responsibility for dealing with porn and I do not believe the UK’s Parliament intended the Children’s Code to do that either.  

To conclude, in your letter you tell me what you expect will happen in the future, assuming the Online Safety Bill passes through Parliament in the way many commentators predict it will. That is not a legal argument.  It is an expression of hope.  We have been here before. Look where we ended up.

We are all hugely invested in the Online Safety Bill. Can we rely on it? No, we cannot, and with ever louder discussion of the possibility of an early General Election, how can we have confidence in any timetable currently being mooted? We can’t.

But even leaving aside the possibility of an early General Election, should we in any event wait three years, probably more, until Ofcom is in a position to begin enforcing any new pornography-related regulations which might materialise once the Bill is an Act? No. Not when there is an alternative pathway. The ICO has the power to open up that pathway now, and your continuing refusal to do so is utterly perplexing.

You ask if I have any evidence of porn sites processing children’s data. Please see the attached. There are oceans more.

The screen grab I have sent you underlines the way in which algorithms drive these kinds of sites. To do their work, the algorithms require data which only the individual user can provide, and the sites recall each user’s activity from one visit to the next.

Yours sincerely,

John Carr


Note to readers: I have not published the evidence I mentioned because I do not want to promote any particular porn site, but I did include it with the email that went to the ICO.