More on online child protection and net neutrality

In the leaked draft prepared by the Latvian Presidency, to which I referred in an earlier blog, I “revealed” the author’s apparent intention to do two (new) things: (a) to make it a requirement for prior explicit consent to be given before parental controls software could be deployed on an end user’s account by an internet access provider, and (b) to insist that such consent could be withdrawn at any time, presumably either temporarily or forever.

This is problematic in the UK because, for example, on our mobile networks at the moment minors simply cannot consent to being exposed to adult content. The Presidency document would have ended that.

Right now in the UK, if an end user is unable to prove he or she is over 18, the adult bar stays in place. Full stop. Incidentally, the overwhelming majority of people are able to prove their age online, or via a phone call. It’s easy and quick. There is an option to “go into the shop” to carry out or conclude the age verification process but, contrary to impressions that may have been given elsewhere, that is definitely not the only way.

The system of classification used to determine whether particular content is or is not “adult” is underwritten for the mobile networks by the British Board of Film Classification (BBFC), an entirely independent and much respected body in the UK. Its brand is almost universally recognised.

Before the BBFC got involved some stupid mistakes were made. All new processes tend to have teething difficulties, but now a procedure is in place which will swiftly rectify any errors that may occur – I mean will occur – in the future.

For kids but not by kids

Say you bought a mobile for someone under the age of 18, or they bought it themselves (perhaps for cash in a supermarket). Under the terms of the Latvian draft, precisely because they are under 18, it seems they would not have the legal standing to ask for the filters to be turned on. Thus a measure designed to protect children from age-inappropriate content cannot, in theory, be activated by children. Brilliant. Who thought that one up? And were we meant to understand that while only an adult could ask for parental controls to be turned on, anyone of any age could get them turned off?

A lot worse for WiFi?

In respect of providers of WiFi in public spaces where kids are present, another problem presents itself in relation to consent. Under current arrangements in such environments nobody is asked whether they consent to access to porn being blocked. It follows that nobody can withdraw consent they never gave.

Do some kids deserve protecting while others don’t?

In respect of any ISP or other type of provider that applies default-on filters in the way described above, it is one thing to argue that the filters are rubbish and valueless. But if one accepts that filters have some value in protecting children from age-inappropriate content, why should only some kids be allowed to benefit from them, namely those whose parents have the knowledge, time, inclination and competence to initiate their use?

Do we just say “tough luck” to those kids unlucky enough not to have parents like that? “Not my problem. Move along.”

A question of hoops

If parents want to jump through hoops to liberalise or completely abandon filters, that’s fine; it’s their choice. But it should never be the other way around.

When you buy a bottle of bleach in a shop it comes with a safety cap on it. Recognising that huge numbers of ever-younger kids are internet users, the same logic should apply online. Filters are a virtual safety cap: not the only thing we should do to help keep kids safe when they go online, but one of them.

The internet is a mixed environment

It is no longer acceptable in my book to assume that the internet is a predominantly adult environment where “special measures” may need to be taken from time to time or in particular circumstances to take account of the fact that kids might be users.

The internet is a mixed environment where kids will always make up a very substantial proportion of all users. Everybody’s thinking about almost every aspect of internet policy should be framed with that cardinal fact constantly in mind. It should never be an afterthought, irritating or otherwise.

Looking a little wider than the EU, in some parts of the developing world sub-18s account for close to half of all users, and a significant proportion of those child users will have no parents at all. Do we owe them no duty of care? Is that double bad luck for them?

I appreciate people’s concerns about the way “bad governments” could misuse the notion of online child protection to “slip in” other forms of societal control or oppression, but that is a larger and different political problem. It should not be solved at the expense of protecting children.

Censorship is not the issue here

Remember we are not talking about censorship. No legal content that is on the internet will disappear or be changed as a result of anything I have advocated here or elsewhere. This is about seeking to replicate in the online space policies and practices that have long been taken for granted in the real world to keep age-inappropriate materials out of the reach of minors. I acknowledge that this might cause minor irritations or small delays to some adults who want to access material that is not intended for children, but only the most curmudgeonly nerds will truly resent such measures if they accept the wider benefits that they can bring.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to the Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is, or has been, an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. He has advised many of the world's largest technology companies on online child safety. His skill as a writer has also been widely recognised.