No room for complacency

My last blog was fairly upbeat about the UK Government’s interim response to the consultation on Online Harms. True the response was light on concrete proposals but much of the language was excellent. The overall tone was tough and purposeful.

Did I speak too soon or too naively? I say that because the day after the response appeared, in The Times, this article popped up under the headline “Boris Johnson set to water down curbs on tech giants”. 

It had all the hallmarks of an insider briefing, opening with the following:

“The prime minister is preparing to soften plans for sanctions on social media companies amid concerns about a backlash from tech giants.”

and

“There is a very pro-tech lobby in No 10,” a well-placed source said. “They got spooked by some of the coverage around online harms and raised concerns about the reaction of the technology companies. There is a real nervousness about it.”

Lest we forget, every Government in the world is to some degree conflicted. They want the jobs, prosperity and glitz that inward investment by hi-tech companies brings.

Against that is the day-to-day reality. Members of Parliament in the UK and their equivalents in many other countries are constantly being visited by, or receiving emails and letters from, concerned parents, teachers and others about something horrible that has happened to one of their children or to some other vulnerable individual. Children themselves have not been silent and their views broadly mirror everyone else’s.

So is the scene set for a titanic struggle? We should assume it is and prepare accordingly because, as I have remarked before, the goodies don’t always win and the baddies don’t always lose.

The Government is going to be in an awkward position. They will not want to be seen as apologists for Silicon Valley. They will not want to say, in effect:

“Chill. We are all going to have to learn to live with these dangers to children or threats to us all from terrorists and scam artists. It is the price we have to pay in perpetuity for the benefits the internet brings. And yes, we’re sorry the guys who own the companies that allow these things to happen have become obscenely rich off the back of your woes, but even so we mustn’t be too harsh on them.”

Yet, post-Brexit,  with a Free Trade Agreement with the USA very much in their sights, the pressure on the UK Government to dial it down could become immense. If any real signs of that happening emerge we need to urge Parliament to “take back control” and “get Online Harms done.”

Slogans such as those at least have the advantage of being familiar.

Posted in Age verification, Child abuse images, Default settings, E-commerce, Internet governance, Regulation, Self-regulation, Uncategorized

The long and winding road

It has been a busy week in the UK. Yesterday the Government finally published its “initial response” to the Online Harms consultation. That’s the consultation which concluded in July last year (admittedly a lot has happened since then, e.g. a General Election and leaving the EU). We are promised a full response to the consultation in the “Spring”, which probably means August.

Two Cabinet Ministers signed and launched the initial response document. Today there was a Cabinet Reshuffle and one of them left the Government. Baroness Morgan, who was Secretary of State at DCMS, has moved on (this was widely expected, not a shock) to be replaced by Mr Oliver Dowden MP. However, the other signatory, Priti Patel MP from the Home Office, is still there. There is not expected to be any break in the continuity of policy. What we saw yesterday is the outline of what we are likely to get.

We also have a brand new Chair of the DCMS Select Committee, the Parliamentary scrutiny body charged with oversight of the DCMS. Julian Knight MP has already been baring his teeth. And he was harrying the Government again today.

What a surprise! There were no surprises

The core idea is a newly created duty of care which online businesses will be expected to observe in respect of all their users. There will be a major emphasis on getting businesses to show they have analysed risks and taken appropriate steps to mitigate them. Crucially, businesses will be expected to enforce their own terms and conditions of service.

In the days running up to yesterday I was told explicitly by two different people that one very important idea (age verification for pornography sites) was going to move from being merely “on ice” to being dropped altogether. It hasn’t been. In fact, on one reading of yesterday’s text you could argue the Government is taking a more expansive view. How else do you interpret these passages?

“Under our proposals we expect companies to use a proportionate range of tools including age assurance, and age verification technologies to prevent children from accessing age-inappropriate content and to protect them from other harms. This would achieve our objective of protecting children from online pornography, and would also fulfil the aims of the Digital Economy Act.”

and this

“(Our) proposals assume a higher level of protection for children than for the typical adult user, including, where appropriate, measures to prevent children from accessing age-inappropriate or harmful content. This approach will achieve a more consistent and comprehensive approach to harmful content across different sites and go further than the Digital Economy Act’s focus on online pornography on commercial adult sites.” (note the very welcome reference to “harmful” content)

The Regulator, powers and penalties

The body that is going to be charged with overseeing all this is Ofcom, but beyond that it is still not clear what range of powers it will have at its disposal. The tone of the language and the approach is tough. ISP blocking has not been ruled out, nor have heavy fines or even criminal liability. The usual suspects are already bridling.

The main focus of the Regulator’s remit is clearly going to be social media services but several of the items mentioned in the response go way beyond social media sites. We are going to need some closer delineation of boundaries.

Interim codes of practice

The Government said it:

“expects companies to take action now to tackle harmful content or activity on their services. For those harms where there is a risk to national security or to the safety of children, the government is working with law enforcement and other relevant bodies to produce interim codes of practice.

The interim codes of practice will provide guidance to companies on how to tackle online terrorist and Child Sexual Exploitation and Abuse (CSEA) content and activity. The codes will be voluntary but are intended to bridge the gap until the regulator becomes operational, given the seriousness of these harms. We are continuing to engage with key stakeholders in the development of the codes to ensure that they are effective. We will publish these interim codes of practice in the coming months.”

No mention was made of the interplay of all these matters with the Audiovisual Media Services Directive (AVMSD), which is also overseen by Ofcom, and the data privacy laws, which are overseen by the ICO. Both are highly relevant to the emerging landscape. And one wonders where the work of the Centre for Data Ethics and Innovation will fit in.

Time is not on our side

Having opened this blog with The Beatles, I will close with The Rolling Stones (sort of). Time is definitely not on our side. I have heard some appalling estimates of likely timescales for getting the Ofcom show on the road although, apparently, it is once again being suggested that at least the measures directed at pornography sites could come on stream sooner. In which case we can all be thankful for small mercies.

Watch this space.

Posted in Age verification, E-commerce, Internet governance, Regulation, Self-regulation

Facebook urged to rethink its plans for encryption

Today children’s organizations from over 100 countries around the world publish a letter urging Facebook to rethink its plans on encryption. So far the story has been picked up and reported by the BBC, the Financial Times and the New York Times. UK broadcast media and other newspapers are also picking it up, as are broadcast and print media outlets elsewhere. Bravo the NSPCC for helping put the letter together and bravo ECPAT International for putting their weight behind gathering support from children’s groups across the globe. The letter can be viewed here.

Posted in Child abuse images, Default settings, Internet governance, Privacy, Regulation, Self-regulation

A big step with bigger implications

Earlier this week the British media were full of reports of the “Age Appropriate Design Code”. It had just been published by the Information Commissioner’s Office (ICO), the UK’s data protection authority. The Code’s provisions are likely to become fully operative in little over a year.

The Code’s title has a slightly nerdy feel, suggesting it might provide advice about the best sorts of layout and colours to use on web sites aimed at young people, but the only other one I could come up with was “The Code That Tells Online Businesses How To Handle Children’s Data And Respect The Privacy Rights Of Under-18s”. The ICO’s one is better.

The Code owes its existence to the Data Protection Act 2018, the Act which incorporated the EU’s GDPR into UK law. However, the redoubtable Baroness Kidron, supported by a wide range of children’s organizations, other Peers and the 5Rights Foundation, spotted weaknesses in the European instrument and proposed an amendment which was then accepted by both Houses of Parliament.

The resulting Code does not conflict with the GDPR in any way. Rather it makes things more explicit and in so doing strengthens them and therefore makes it more likely they will be honoured by businesses and other organizations.

The Code further nudges businesses towards making sure they know who is visiting their sites or using their services. Sites cannot continue to say “this site or service is meant only for adults” and then take no meaningful steps to keep out “non-adults”.

There is a risk of getting overly theological about whether or to what extent the nature of the content of a site can be wholly disregarded when considering the data processing dimensions of its activities.  My hunch is the nature of the site itself will be hugely relevant though I am certain many lawyers will be greatly enriched by arguing the exact opposite.

A child is anyone under 18

The GDPR, the Code and UK law adopt the UNCRC standard of 18 to define who is a child. Recital 38 of the GDPR says the following:

“Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data.”

Within the UK jurisdiction the Code puts flesh on the bones of that statement which, incidentally, because it is a Recital and not an Article, is not in fact law.

There is no point in my rehashing the 15 provisions of the Code. The 5Rights Foundation has published its own handy summary which you can see here.

My two top picks

All 15 points of the Code are important but for me two stand out. These are:

5. “Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions, or Government advice.”

To clarify things the ICO adds this

“We mean any use of data that is obviously detrimental to children’s physical or mental health and wellbeing or that goes against industry codes of practice, other regulatory provisions or Government advice on the welfare of children.”

and then there’s

6.  “Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).”

In the latter case, against the possibility that you did not fully understand what the ICO intended, they again helpfully spell it out:

“We mean that you need to adhere to your own published terms and conditions and policies.

We also mean that, when you set community rules and conditions of use for users of your service, you need to actively uphold or enforce those rules and conditions.”

Once more, I couldn’t have put it better myself.

Posted in Age verification, Consent, Privacy, Regulation, Self-regulation

Movement on age verification?

R v Secretary of State for the Home Department, ex parte Fire Brigades Union is a famous case from 1995. It was decided by what is now our Supreme Court (then the Appellate Committee of the House of Lords). The Government lost.

Under an Act of Parliament of 1988 the Government was meant to bring forward a new statutory criminal injuries compensation scheme, a matter in which the Fire Brigades Union took a close interest. The way things were left under the Act, the scheme was to be introduced on “such day as the Secretary of State may by statutory instrument appoint”.

Long story short, the Government subsequently announced it was not going to name a date. In effect by an administrative decision they had frustrated the clearly expressed will of Parliament.

Key excerpts from the court’s decision (pardon the archaic English legalese) are as follows

“It might come as a surprise to the man on the Clapham omnibus that legislative provisions in an Act of Parliament, which have passed both Houses of Parliament and received Royal Assent, can be set aside in this way by a member of the executive.”

and even more tellingly

“True, [the Sections] do not have statutory force. But that does not mean they are writ in water. They contain a statement of Parliamentary intention, even though they create no enforceable rights. Approaching the matter in that way, I would read section 171 as providing that sections 108 to 117 shall come into force when the Home Secretary chooses, and not that they may come into force if he chooses. In other words, section 171 confers a power to say when, but not whether.”

Is this ringing any bells? It should. It is very close to what happened with the statutory provisions relating to age verification for commercial pornography sites, as enshrined in Part 3 of the Digital Economy Act 2017.

Several companies spent a lot of money getting ready for the commencement of the new policy. Everything was in place when, out of the blue, so to speak, on 16th October 2019, the Government called a halt. Not an abandonment as such but, in effect, an adjournment sine die.

Frying different fish

At the time the Government had one overriding political objective. To secure a General Election.

Because of the then Parliamentary arithmetic the timing of such an election was not in their gift. In some quarters the suspicion therefore is that someone in No.10 started to fret. Suppose they suddenly managed to get Parliament’s agreement to hold an election (which they did)? Suppose the new regime of age verification for commercial porn sites kicked in just before or even during the election campaign (which it might have done)?

“Boris the Porn Killer”

Could “Boris the Porn Killer” supplant “Get Brexit Done” as a key theme of the election? Very unlikely. Even so might millions of men be angry because their porn supply had been cut off or interrupted while they completed the age verification process? What if unseen glitches emerged? Who would be blamed? Could it adversely affect votes in marginals?

Did a timid soul in the Conservative Party leadership decide it was best to take no chances? Just pull it. Utter a few warm words about wrapping up the policy with a wider initiative on online harms (which is what they did).

This is just a theory doing the rounds, but we may be about to find out if there is any substance to it. For their own different but entirely understandable reasons, the trade association which represents some of the companies that spent millions getting ready for the new scheme is seeking a judicial review of the decision, and a number of individual companies are also suing for compensation. It is understood their claims run to around £3 million. Add to that the amounts thought to have been spent by the nominated Regulator (the BBFC) and the Government itself and you get to around £5 million.

That’s a lot of money to spend on a pusillanimous and unprincipled panic.

Posted in Age verification, Pornography, Regulation, Self-regulation

Yesterday in Parliament – news and no news about porn

Yesterday was the final day of debate on the “Gracious Address” in the House of Lords. The Address had been delivered by Her Majesty on 19th December to mark the opening of a new Parliamentary year and a new Parliament following the General Election. The next day a more detailed announcement followed setting out the Government’s legislative programme for 2019-20.

Online Harms

There is to be an Online Harms Bill. This is good. Probably. Originally the Government said such a Bill would be put through pre-legislative scrutiny, which could also be good, but details of how and when remain scarce. This may presage substantial delay. We might be talking about two to three years. Which is terrible.

This question of delay and timescales is particularly significant when set in the context of the ease with which children can currently access millions of hard core pornography web sites. The crazy thing is we already have a law that could help shield kids from such material but the Government has refused to implement it. The law of which I speak is contained in Part 3 of the Digital Economy Act 2017.

Were Part 3 to be brought into effect, the UK would become the first democratic country in the world to require commercial publishers of pornography on the internet to introduce age verification mechanisms as a way of restricting children’s access to their wares.

Protection delayed is protection denied

Following a sustained campaign led by children’s organizations and a group of mainly women MPs and Peers, the idea of having such a law appeared in the Conservative Party Manifesto of 2015. In 2017 it completed its passage through Parliament with the support of all the major political parties.

Ministers brought forward a set of statutory instruments to establish the regulatory framework within which the policy would operate. A Regulator was nominated by the Government and agreed by Parliament (the BBFC). Millions of pounds were spent getting us to that point. A range of new and existing businesses also spent millions innovating highly efficient ways of carrying out age verification online. Something similar happened before when age verification for online gambling sites was introduced following the implementation of the Gambling Act 2005.

While initially hostile, the commercial pornography publishers accepted this was now law so they too prepared themselves for the new regime. The Information Commissioner was satisfied with the privacy aspects of the policy.

The fateful day

Absolutely everything was in place when, on 16th October, the Government called a halt. Out of the blue, so to speak. No prior warning.

Several media outlets reported the Government had had a change of heart and was dropping the policy altogether. I have seen nothing from Ministers speaking on the record which would justify that conclusion so unless there was lobby briefing to the contrary, I am at a loss to explain why journalists picked up the story in that way.

In search of “coherence”, apparently

The principal justification offered by the Government was that they wanted the measures to protect children from pornography to be folded into or made “coherent” with their evolving thinking on the wider Online Harms Bill which they were preparing. The Secretary of State’s exact words were:

“It is important that our policy aims and our overall policy on protecting children from online harms are developed coherently….. with the aim of bringing forward the most comprehensive approach possible to protecting children.

The Government have concluded that this objective of coherence will be best achieved through our wider online harms proposals…”

Certainly it is true Part 3 was enacted before the Government embarked on its larger odyssey but the question of the role of porn publishers is quite discrete and particular. Part 3 simply insists commercial publishers of pornography take responsibility for ensuring kids cannot access their sites so easily. Whatever the Government might decide to do with social media sites or other online businesses they are going to have to come back to it. Everybody working in the field knows that.

Is it even remotely possible the Government will say, in effect, “Following a rethink we now believe commercial publishers of pornography can carry on as before. They will have no legal obligation to do anything to keep children off their sites“? I don’t think so.

The very next day

Matters did not rest as they were left on 16th October. The very next day in the House of Commons over a dozen MPs questioned the Minister for Digital, Matt Warman MP, about the shock announcement.

In his replies Mr Warman acknowledged that restricting children’s access to commercial pornography sites was “critically urgent” before going on to say “I am not seeking to make age verification (for pornography sites) line up with (the Online Harms Bill) timescale”.

If protecting children from commercial pornography was so “critical”, one has to wonder why it was stopped on the eve of implementation. By their actions the Government ensured that children who could have been protected from seeing some truly shocking and harmful images will not be. It did not have to be that way.

Nevertheless, as we have seen, Warman did indicate that moving forward on age verification for commercial pornography sites need not be bound to the same timetable as the promised Online Harms Bill. That does give some grounds for optimism. Might the new age verification regime yet be brought forward sooner rather than later? It could be. It should be. It would be very easy to do. “All” it requires is for the Government to bring one more statutory instrument to Parliament and name a commencement date.

Yesterday in the Lords in winding up the debate the Government gave assurances that they would be bringing forward “interim codes on online content and activity relating to terrorism and child sexual exploitation”. These are welcome but, at the risk of being repetitive, they do not address the responsibility of commercial publishers of pornography to keep kids off their properties. Part 3 of the Digital Economy Act 2017 does precisely and only that. But on this the Government was silent (although they have promised a letter answering a number of questions that were raised in the debate which Ministers did not cover in the summing up).

Alternatively, if the Government believes there is a specific problem with Part 3 as originally envisaged, they should say what it is. There are various rumours but nothing definitive has emerged from Whitehall.

Perhaps there is a legal method or Parliamentary procedure which could be deployed to amend or add to what we already have in a way which would meet the Government’s concerns? Surely the Opposition Parties would happily facilitate such a course of action?

Posted in Age verification, Child abuse images, Default settings, Internet governance, Pornography, Regulation, Self-regulation

Is the cure worse than the disease?

In a blog which focuses on the meaning of “privacy” in the modern world, Privacy International published an excellent summary of key international instruments which address the subject. At the end they announce their conclusion:

Privacy is a qualified, fundamental human right. 

Note, they do not say privacy is an absolute right. That is borne out in all of the treaties and conventions to which Privacy International refers.

Yet look where we are headed with strong encryption.

We are creating what are, for practical purposes, impregnable or unreachable spaces. These confer impunity on all manner of wrongdoing. Paedophiles and persons who wish to exchange child sex abuse material are permanently shielded, as are terrorists and an infinite variety of scam artists.

The rule of law is being undermined

We are looking at a world where warrants and court verdicts lie mute, incapable of fulfilment. The rule of law is thereby being undermined.

Whereas previously a familiar cry, for example in respect of apparently illegal content, was that it should not be taken down without the say-so of a judge, the same voices now seem content to contemplate a situation where all judges are rendered impotent.

Thus, on top of the long-established challenges associated with the internet (scale, speed, jurisdiction and complexity), we are adding a whole new layer.

Attacking the problem from the wrong end

Obviously, I get that there has been an erosion of public confidence and trust both in political institutions and in online businesses. Moreover, I am not against encryption (see my previous blog), but the way it is being rolled out in some areas is disproportionate. The cure is turning out to be worse than the disease.

Limiting the ability of companies themselves to detect and prevent behaviour which contravenes their own terms of service is wrong, and makes a mockery of the very idea of having terms of service in the first place.

Making it impossible for law enforcement agencies with proper authority to see the content of a message is likewise simply wrong.

Sending cannabis through the post

If I decided to open up a sideline selling cannabis, could I legitimately enlist the Royal Mail to help my business prosper by delivering weed to my customers? Of course not.

There is no reasonable expectation of absolute privacy vis-à-vis the otherwise sacred and untouchable postal service. Postal services all over the world take reasonable and proportionate steps to ensure their systems are not being used to aid and abet crimes. They sniff, they scan, they x-ray and goodness knows what else.

Have people stopped using the post?

When it became known that this could happen, did the mass of people abandon the postal system, outraged by this actual or potential encroachment on their right to privacy of communications? No. Neither would they desert Facebook Messenger if they knew that, only with proper authority and just cause, a message could be examined by the police or court officials.

But is it unreasonable to expect Facebook Messenger not to use strong encryption if all of its competitors are? That is a completely different question.

We really do need to call a halt and take a breath. Just because technologists have invented something it does not mean its use must become compulsory. Certain genies can be put back in the bottle if there is sufficient political will.

In respect of forms of encryption which preclude the possibility of scrutiny by anyone, the political will is growing. It needs an urgent push.

Posted in Apple, Child abuse images, Default settings, E-commerce, Internet governance, Privacy, Regulation, Self-regulation