On 19th July I appeared before the House of Lords Communications Committee, which is holding an enquiry into children and the internet. On the day I was largely responding to oral questions.
Will Gardner from Childnet was alongside me. Will tended to lead on the (many) questions about what was happening in schools, particularly around bullying. I majored on other issues.
There is little point in having a Parliamentary enquiry simply to recite or celebrate how well the UK is doing in the online child protection space. Make no mistake: I remain firmly of the view that the UK is a world leader in this area. This was therefore a (rare) opportunity to highlight where improvements or changes are still needed.
What follows is a summary of what I said, plus one new point (compensation for victims of child abuse images) which I added when I later sent in a written version. That’s allowed!
Digital Economy Bill 2016-17
The Digital Economy Bill is directed at larger commercial pornography sites, almost all of which are domiciled outside the UK and therefore, for practical purposes, beyond the reach of UK courts and law enforcement. While nominally these sites are “free”, in that they do not charge to look at the bulk of their wares, they are nevertheless highly commercial in nature, collecting their income in other ways, e.g. through direct sales and advertising.
The Bill is most welcome but it has a fatal flaw. It requires the sites to introduce age verification to prevent persons under the age of 18 from looking at their content. The assumption is that the credit card companies will threaten to withdraw payments facilities, and the advertisers their advertising, from non-compliant sites (which would be operating illegally), and that this will be a sufficient incentive for most porn publishers either to comply or to cease publishing into the UK. This is a reasonable assumption. The Bill will also create a Regulator with a power to compile a list of non-compliant sites. This list will be circulated to interested parties, e.g. credit card companies and advertising agencies, but neither is obliged to act, although, as already noted, it is anticipated most will. However, if a commercial pornography site uses no UK-based payments facilities and receives no advertising from UK sources, or changes its business model to arrange things that way, it could continue to operate with impunity.
Thus, for persistently non-compliant sites, the Bill should give the Regulator a residual power to require access to those sites to be blocked, in a manner similar to that which, de facto, already exists for child abuse images.
Big social media platforms are like public utilities
We need to start thinking about the major social media platforms in the same way as we do public utilities. Certainly the platforms’ dominance in some areas means children and young people may feel they have little choice but to join and be part of the social milieu to which all, or the great majority, of their friends belong. It is unacceptable that there is no way for the public or parents to be reassured about the efficacy and appropriateness of these businesses’ internal systems for dealing with complaints from, or issues raised by, children. An independent regulator (perhaps Ofcom) should have the legal power to compel at least the larger platforms to open their books and allow independent inspection and verification of their public-facing processes, to ensure they are working satisfactorily.
Filtering in the UK
The UK’s system for providing filters to customers of the “Big Four” domestic broadband providers is excellent, but there appear to be significant variations in the levels of take-up between the different ISPs. At first sight this seems strange, because the demographics of their customer bases do not look wildly different. In any event, the claims the ISPs make about levels of take-up have not been independently verified. When the last (and so far only) checking exercise was carried out, Ofcom merely asked the ISPs to inform them of their take-up levels. Ofcom sought neither to verify the claims the ISPs made nor to explain the reasons for any differences. This is not satisfactory. Moreover, the current voluntary system for providing filters extends only to the customer base of the “Big Four”, who between them seem to reach only 90% of households. Children in the other 10% deserve the same level of protection.
The system of filtering for mobile networks appears to be working satisfactorily but it has never been thoroughly inspected and verified by an independent agency.
Ditto in relation to “Friendly WiFi”, i.e. the system whereby the providers of internet access via WiFi in public spaces take steps to limit access to adult content and illegal materials. A key question here would be to determine how extensively it is operating, and perhaps also to identify any major enterprises or concerns that have not adopted “Friendly WiFi”.
Age limits for social media platforms
There has never been a proper, independent evaluation of the optimal age limits for using social media platforms. The single lower age limit of 13 is the product of a US Federal law which was passed in the 20th Century, before social media platforms existed. With one or two exceptions, e.g. Spain, the rest of the world acquiesced in that age standard rather than examining its appropriateness critically. Perhaps we need more than one age level, depending on the nature of the platform and the type of activity in question. In addition, the absence of any obligation to verify the age of customers is leading to a huge level of non-compliance. This is not satisfactory.
Compensation for victims portrayed in child abuse images
A new law is required to allow victims of child sex abuse to claim compensation from persons found in possession of images of that abuse. The USA already has a law specifically designed for this purpose. Aside from assisting with victim recovery, it could also act as a major deterrent to a certain class of person who collects these images. The MoJ is currently considering this idea.
An unambiguous duty of care
We ought to establish that the providers or suppliers of digital services have an unambiguous legal duty of care to consider the online child safety aspects of any and every service before it is released. One of Facebook’s founding ideas was “Move fast and break things”, otherwise expressed as “it is easier to apologise after the event than to seek permission before it”. It is understood that Facebook has now formally renounced this, yet it remains a dominant idea across the whole of the internet industry.
Internet governance
There are several notable weaknesses in internet governance institutions and processes. One is their failure to take proper account of the fact that children and young people are a very substantial constituency of users, and that they have rights under international law which are routinely ignored. ICANN in particular has been woeful in several key regards. HMG has an important leadership role in this area.
Helping parents
Finding ways to help parents to help their children get the most out of the internet while remaining safe is a major and urgent societal challenge. We cannot blithely assume it is a problem which will solve itself with the passage of time. In this context schools have an important role to play, but if we see them as the sole or principal route to parents we will fail, because too many schools continue to be seen by too many parents as unwelcoming places. A public-health style of approach may therefore be worth considering as an additional or complementary strategy. What we are talking about, in essence, are the skills needed for 21st Century parenting. That repertoire of skills must now include a knowledge of how the internet fits into young people’s lives, and of how best to support children and young people in their use of the technology.