The GDPR – still huge uncertainties

Courtesy of European Schoolnet, Ghent University and KU Leuven, a great many leading experts from the privacy world and the children’s world gathered last Friday in Brussels for the first open and serious EU-wide discussion on the implications of the GDPR. Better late than never? Definitely. Several earlier requests for the Article 29 Working Party to organise something similar had fallen on deaf ears, so well done to all concerned for taking the initiative to put this event together. However, I regret to report that almost none of the many critical questions raised received clear or definitive answers. And in 11 months the GDPR will be law in every EU Member State, including the UK.

Too many important problems

It was good to hear from a Commission representative that the EU’s Data Protection Authorities (DPAs), the European Data Protection Supervisor (EDPS) and the Commission itself are now meeting monthly to try to agree on a variety of things but not so good to hear children’s interests, as such, were unlikely to receive any special attention.

We have far too much on our plate. Children will be dealt with as necessary as we come to each heading…

That put us in our place. Given that the room was full of people who had come solely to discuss the GDPR and children, I am not sure this got us off to the best possible start.

Moreover, although it is clear that, in future, DPAs are going to be major players in the online children’s space, the DPA world is not famously over-endowed with people with a track record of engagement with children and cyberspace.  You might equally say the reverse is also true i.e. the online children’s world does not have an abundance of privacy experts although Jeff Chester and Kathryn Montgomery had come over from the USA and they were energetic participants in the debate.

What was refreshing, however, was that there seemed to be a genuine openness on the part of all the participants to try to work out both what the law meant and how it might best be applied in the interests of children. We all need to build on that.

Article 8 and the age limit – sharp intake of breath

There was one particular matter which caused a major and sharp intake of breath across the whole room. I will try to explain it.

At the moment there is a de facto lower age limit which applies to online services. In most, but not all, EU Member States it is 13. That is the age at which a young person may decide for themselves whether or not to hand over personal data to a commercial online service, without that service having to obtain verifiable consent from one of the young person’s parents.

Back in 2012, when the GDPR process kicked off, the Commission proposed to make 13 the legally defined lower age limit in every country. No exceptions would be allowed. Countries such as Holland, Spain and the UK would have had to change their laws to accommodate this but at least there would be EU-wide uniformity.

It all went wrong.

At the last minute (December 2015) politicians rebelled. Based on no advice or research of any kind, we ended up with a new default age limit of 16, but each Member State was given the choice to opt for a lower age limit as long as it was not below 13.

We moved from having a single age to having up to four. That alone has created, and will continue to create, all manner of complexities, but on Friday we learned the following:

1. If a service is based in a country which has decided its minimum age is 13 it may be the case that this will be the operative age for that service in every EU Member State. This is consistent with the “country of origin” principle which applies in other contexts. By implication I guess it would also apply to non-EU Member States but let’s stick with the EU for now.

Google and Facebook are both registered in Ireland. Thus, if this interpretation of the law holds and Ireland opts for 13, in effect nothing will change in relation to age limits anywhere within the EU for any of the services these companies provide.

It is likely any service in a country that opts for 16 will at least consider moving to another country that chooses 13. Article 8 will become a laughable dead letter and people will be left wondering what all the fuss was about.

The COPPA law in the USA establishes 13 as the minimum age in every country in the world where a US company operates, but where an individual jurisdiction sets a higher threshold – e.g. Spain, which opted for 14 – the US companies will typically honour that rule. If the above reading of the law turns out to be right, the EU could be taking a different tack. It could be saying that local decisions count for nothing: all that matters is the age limit in the country of origin.

Remember – this is not definite. It could be that every service has to adapt to the age limit agreed upon in each Member State. Yet that is quite a gulf in interpretation. What is remarkable is that such a gulf exists at all – with 11 months left.

2. However, apparently, if a service sets up in Country A where the age limit is 13 and they target people in Country B where the age limit is, say, 16,  some Member States are arguing that could be construed as “cheating” so won’t be allowed. Go figure.

3. In addition, seemingly there is a possibility that if, for example, a service was targeting French citizens, then irrespective of where they lived the age limit should be whatever France had decided upon. Again, go figure.

4. The final possibility is that there could be some permutation of these. Ditto.

To repeat my earlier point: the fact that nobody from a DPA, the EDPS or the Commission could say what the position was in respect of some of these questions is…


Profiling

There is continuing uncertainty about the rules on “profiling” as they affect children. The GDPR says:

Controllers must not carry out solely automated processing, including profiling, that produces legal or similar significant effects (as defined in Article 22(1)) in respect of a child.

Quite what is a “legal or similar significant effect”? Answer came there none, yet here is an issue of enormous significance to the business models of probably every online service of any size.

Age verification

Doubts and reservations were expressed about the potential for age verification to be part of the solution to any of the unresolved challenges. Questions were raised about its technical feasibility, particularly in a transnational environment; its potential to conflict with the principle of data minimisation; its potential to restrict children’s rights unlawfully; and the possibility of it becoming a stalking horse for governments and the security services to exercise even greater control over their citizens.

These are all self-evidently important questions. For my own part, I believe it is both inevitable and desirable that we will make greater use of reliable, trustworthy, privacy-compliant, privacy-enhancing age verification solutions that work entirely to the benefit of children and young people.

Article 35

This was not discussed at any length, but it was acknowledged that it has potential in respect of children. Here is the text:

Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. 

At the very least, my hope and expectation is that every business providing relevant services will need to consider whether what it is about to provide could constitute a high risk to some or all of its users. That implies it must know who its users are, what the risks might be and what steps it ought to take to mitigate or eliminate those risks.

Definition of a child

The only point where there seemed to be little doubt – although even here there was some – was the definition of a child. The consensus seemed to be that a child is any person under the age of 18, but unless and until someone – a DPA, the EDPS or, eventually, a court – rules on the point, we cannot be completely certain. This is acutely important because of the prevalence of profiling.

Size isn’t everything

At least one voice was raised in defence of small start-ups. How can they be expected to do all this stuff? Or even know about it? Suffice to say there was not a whole lot of support for this position. 1995 is, er, 22 years ago. The internet has moved on. As one perceptive commentator put it:

If two youngsters set up to manufacture teddy bears with glass eyes which could easily be detached and swallowed by a young child, causing that child to choke to death, we wouldn’t feel a great deal of sympathy for them or excuse them because they didn’t know about the dangers or the applicable rules. Why should it be any different here?


About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.