The children’s privacy debacle – Part 3

We all know that in January 2012 the Commission of the European Union began a process of consultation and discussion on the formulation of a General Data Protection Regulation (GDPR). Over the following almost four years, wide-ranging debates and public consultations were held to try to hammer out the all-important detail.

I am sure that, like me, many readers of this blog prepared written submissions, attended conferences and did all of the things that, in democracies, we take for granted as being part and parcel of how our systems of government work.

In respect of children, the first draft of the GDPR, at Article 4(18), defined a child as anyone under the age of 18, but Article 8(1) made clear that for practical purposes in the online space 13 would be the operative age. Below that age companies would need to obtain parental consent before they could collect or process a youngster’s personal data. However, if a site set the bar at 13, with one bound they were free of any such troublesome and potentially expensive obligation.

The Commission advanced no argument or evidence to support setting 13 as the standard, other than a pusillanimous acknowledgement that it was already in widespread use because, under US Federal law, all the US companies were obliged to follow it. I hadn’t realised we had contracted out our policy making quite so comprehensively. Moving on.

To the best of my knowledge what nobody ever argued for, at any rate not in public, was what was finally adopted. I say “not in public” but the truth is I’ve never heard anyone argue for it in private either.

Just to remind you: what we now have is 16 as the default age, with Member States being given the option to adopt 15, 14 or 13 instead. Either way this will change the law in the UK and probably also in several other countries.

So where did this idea come from?

I haven’t a clue. But following some modest super-sleuthing I can now tell you where it didn’t come from:

  1. The UK’s data protection authority – the Office of the Information Commissioner
  2. The Office of the Children’s Commissioner for England
  3. The UK Council for Child Internet Safety
  4. Any of the UK’s Children’s Charities
  5. The Article 29 Working Party – I believe it wasn’t consulted either, but I’m still awaiting final confirmation of that

The proposal to establish 16 emerged during the Trialogue, so the UK Government must have been made aware of it at some point, but they did not seek to engage any of the above. Maybe they were bound by rules of confidentiality but, whatever the reason, the outcome was so ridiculous that we have to consider the possibility it was due, at least in part, to the process.

My guess would be that 16 sprang from nowhere as a rushed political fix that emerged in Brussels on or about 27th November, and it was a done deal by 17th December when the LIBE Committee met in Strasbourg to adopt the final text. Four years reduced to two and a bit frantic weeks in which transparency ceased to exist.

Further comment seems superfluous.

At this stage. But watch this space.


About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.