OSB – summary of key points

The Online Safety Bill (OSB) is a huge piece of legislation. Wonderful and welcome though it unquestionably is, as drafted it contains gaps and ambiguities which need to be closed or tidied up as the Bill makes its way through Parliament.

A blog is not the place to discuss the dizzying detail, but if anyone is interested they could do worse than look at the body of additions and amendments which the children’s organisations are putting together. The NSPCC and 5Rights have been doing heroic work here co-ordinating everybody’s efforts. Carnegie deserves an honourable mention in this context, although their remit is much wider.

I have also been involved with a brilliant campaign led by CEASE and Democracy Three, which will be promoting specific amendments around the pornography provisions of the OSB. Watch out for them too.

Professor McGlynn of Durham University seems to be leading the charge on online harassment, cyberflashing, “up-skirting” and similar harms. The OSB incorporates all or most of the recommendations of the Law Commission in respect of illegal or abusive communications, but there is pressure to expand or improve them.

I will tweet more on the detail of the amendments once they go live online. If you didn’t already know, I’m @johnc1912. Below I provide what I consider to be the major child protection take-away headlines.

But first a couple of broader observations.

Victims/survivors

In my last blog, aside from the point about platform immunity, I should have added that the OSB does nothing to address the position of victims/survivors in terms of their right to compensation from individuals who have been found downloading images of them being sexually abused. I had in mind something along the lines of the USA’s “Amy, Vicky and Andy Child Pornography Victim Assistance Act 2018”. We’ll find a different, more British name for it.

The UK’s current criminal injuries compensation scheme is simply not fit for purpose for cases of online child sex abuse. However, a Victims’ Bill is making its way through the works in another part of the forest and maybe that is a better place to address the issue.

Information sharing

I would draw your specific attention to amendments being promoted on information sharing between platforms. It’s crazy that one platform can identify someone as a danger to children and kick them off, then do nothing with that knowledge, thus allowing the same person to set up elsewhere and start again, perhaps having already engaged in “breadcrumbing” (luring) someone to another platform, maybe one with less efficient means of identifying miscreants, e.g. because it is using end-to-end encryption with no client-side scanning.

One very obvious source of information about actual or would-be bad guys in the UK is the Sex Offender Register. The Register does not contain people who are merely “suspected” of being a risk to children; they are people whom a court has said are a risk to children. As often as not they will be subject to court orders limiting the places they can go, and those places will generally be ones where children hang out. Am I ringing any bells?

I haven’t checked recently, but platforms used to forbid anyone on a Register (for any reason) from having an account with them, and where the Register was a public document, e.g. in the USA, clever means were devised to find such people (who typically lied about their real identity but sometimes didn’t). They were then kicked off.

In the UK support for making the Register public is, rightly, very low, but the technology must exist, or could be developed, to allow a machine-readable way of achieving a similar outcome. Bear in mind nobody would get kicked off or barred unless and until it was confirmed they actually were on the Register because they were a threat to children. I believe this would require some adjustment to the way the Register is currently set up.
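
To make that concrete, here is a minimal sketch, in Python, of the kind of machine-readable check I have in mind. Everything in it is an illustrative assumption on my part, not a description of any real or proposed system: the register holder shares only keyed hashes of verified identifiers with platforms, so a platform can test an account holder against the Register without the Register itself ever being published.

```python
import hashlib
import hmac

# Hypothetical shared secret held by whoever maintains the Register.
# Every name and value below is illustrative; this is not a description
# of any real or proposed system.
REGISTER_KEY = b"example-shared-secret"


def pseudonymise(identifier: str, key: bytes = REGISTER_KEY) -> str:
    """Return a keyed hash of a normalised identifier (say, a verified legal
    name plus date of birth), so neither side exchanges personal data in clear."""
    normalised = identifier.strip().lower().encode("utf-8")
    return hmac.new(key, normalised, hashlib.sha256).hexdigest()


# The register holder supplies platforms with keyed hashes only, never names.
register_hashes = {pseudonymise("jane doe|1980-01-01")}


def is_on_register(verified_identity: str) -> bool:
    """A platform checks one verified account holder against the Register
    without ever seeing who else is on it."""
    return pseudonymise(verified_identity) in register_hashes


if __name__ == "__main__":
    print(is_on_register("Jane Doe|1980-01-01"))    # True
    print(is_on_register("John Smith|1975-06-30"))  # False
```

A real scheme would need proper key management, agreed identity attributes and, as I say above, some adjustment to the way the Register is currently set up, but the point is that the matching itself is technically straightforward.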

Main child protection headlines from the OSB

The Government’s latest summary is here. Mine is as follows:

  1. Scope: in the main, the provisions of the Bill are directed at search engines and social media, that is to say sites or services which allow the exchange of user-generated content. The major exception is in relation to pornography. See below.
  2. The duty of care and risk assessments: these are the golden threads which permeate all parts of the OSB. No site or service is exempt from key provisions, including any which deploy end-to-end encryption. Sites and services must also spell out what steps they are taking to ensure compliance with their own Ts&Cs. If you say something in your Ts&Cs, the Regulator, Ofcom, will require you to show you are making good-faith efforts to fulfil or enforce what you have promised or stated to be a contractual term. The days of Ts&Cs as mere marketing hype are over. Large platforms have additional responsibilities, particularly in respect of legal but harmful content (see below).
  3. The porn exception: the principal exception to the user-generated content rule concerns pornography sites and services. All porn sites and services will have to introduce age verification mechanisms which are privacy compliant, irrespective of how they channel or allow porn to reach a screen. The previous limitation to commercial pornography sites is no longer there. Round of applause.
  4. Proportionality: an overarching principle in all things. Quite right too.
  5. CSAM: child sexual abuse material must be reported to the National Crime Agency. Up to now reporting has been voluntary, although it was widely assumed everybody did report. Illegal content risk assessments are mandatory for every site or service, and where a risk is identified steps must be taken to prevent the distribution or publication of CSAM. The Regulator has a power to require and direct the deployment of technologies which can prevent or mitigate the distribution or publication of CSAM (a simplified sketch of one such technology appears after this list).
  6. Legal but harmful: where a site or service is likely to be accessed by children, satisfactory mechanisms must be put in place to eliminate or mitigate content which, though legal, is likely to be harmful. There will be a lot of pressure to define harmful content on the face of the Bill, which is good; otherwise it will be left to Ofcom to draw up regulations spelling it out. The regulations will have to be reported back to and approved by Parliament. It seems likely the courts will have the final say on key bits of this. It is foolish to pretend it is going to be easy; if it were, it would already have been done. But remember, very few laws come out of the traps in a fully finished, perfect or wholly unambiguous form. While acknowledging it is important to try to establish the highest degree of legal certainty from the get-go, it is not at all unusual for it to be left to courts to define, refine and explain the intention behind the legislation. In reaching its determination a court will be guided and constrained by the whole body of established human rights law.
  7. Transparency: Ofcom needs to develop a transparency regime. Failure to comply with information requests can attract criminal sanctions. This whole issue of transparency is going to be one of the most challenging parts of Ofcom’s role, yet in many ways it is also one of the most vital. Because of the way things have been up to now, at the off nobody, or very few people, outside a company can possibly know with complete confidence what the right or detailed questions to ask are, what the proper context is for assessing the answers to any questions posed, or what further information requests ought to be made as a result of what is learned. Why? Because companies have never disclosed anything they didn’t have to, and up to now they have never had to disclose very much. Companies have intentionally made their systems as complex as possible to deter or limit what “outsiders” can understand about how they work. So this is bound to be an iterative process. Either way, we won’t have to listen to anyone say “we employ half a million human moderators so we must be good guys.”
  8. Enforcement: the Bill contains an excellent range of powers which will enable the Regulator to disrupt the income stream of non-compliant sites. These include the power to order a site or service to be blocked, or to levy a fine equivalent to 10% of a company’s global turnover. Platforms’ risk assessments must be fit for purpose, with Ofcom as the judge. Nevertheless there is considerable concern that the enforcement processes the Bill anticipates are too clunky, expensive and time-consuming, which will put Ofcom’s enforcement role in jeopardy. With the internet the ability to act quickly is of paramount importance. A co-ordination mechanism is being established to ensure the several statutory Regulators who will now have a finger in the online pie do not trip over each other or compete. There will also be pressure to expand the role of criminal sanctions beyond their current limitation to failure to comply with transparency or information-providing requirements.
  9. Anonymity: there is no attempt to abolish users’ ability, effectively, to remain anonymous, but larger platforms will be required to establish mechanisms to allow individuals to verify themselves. End users will then have the choice of whether to communicate with, or accept communications from, only other users who have also been verified. I guess it will be something like Twitter’s “blue tick” scheme.
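
On point 5 above, the best-known family of technologies for preventing the redistribution of known CSAM is hash matching against lists maintained by bodies such as the IWF and NCMEC. The sketch below is a deliberately simplified illustration of the idea, with assumed names and a placeholder hash list: it uses a plain cryptographic hash, whereas real deployments use perceptual hashes (PhotoDNA and the like) which survive resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Placeholder hash list standing in for the lists bodies such as the IWF or
# NCMEC supply to industry. The single value below is just an example digest.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def sha256_of_file(path: Path) -> str:
    """Hash an uploaded file in chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_block_upload(path: Path) -> bool:
    """Return True if the file matches a known hash and must not be published."""
    return sha256_of_file(path) in KNOWN_HASHES
```

The design point is that matching happens against hashes of previously identified material, so a platform never needs to hold or publish the images themselves in order to stop them being uploaded again.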