The UK’s draft guidance on children and the GDPR

The British data protection authority (DPA) – the Information Commissioner’s Office (ICO) – kept its promise and released a guidance note on children and the GDPR before the end of the year. In fact, it came out on 21st December. People can comment on it up until 28th February.

I am pretty sure the ICO is the only DPA in the European Union to have tried anything as ambitious as this in respect of children. Well done them. My guess is all the other DPAs will be watching what happens next, so, vicariously, this might become a sort of EU-wide consultation. I say that only because I think it highly unlikely the ICO's staff wrote what they did without at least some prior discussion between colleagues from a number of DPAs in different Member States (MS) and the Commission.

The document makes clear it is possible to see elements of continuity between the old data protection regime and the approaching one, but it is also obvious there are some radical new elements. These should definitely enhance the position of children.

The ICO paper is well laid out and presented in commendably simple language. That does not mean I didn't struggle with bits of it. I did, but that wasn't down to the language as such. Some of this is unavoidably complicated and, unless you are already steeped in privacy law, it will stay that way. We need more case studies to illuminate and thereby vanquish the darkness.

I have not tried to summarise the whole of the ICO’s draft and I doubt it is necessary to comment on all of it anyway. Instead, I have picked the bits I think are interesting or acknowledged to be unresolved. If I have missed anything important please let me know.

UK only – the Data Protection Bill

Parliament is currently going through the process of adopting the GDPR into UK law. During the debate in the House of Lords earlier this month, Baroness Beeban Kidron won Government and cross-party support for an amendment which will require the ICO to devise a code of practice entrenching a number of specific improvements to the broader provisions of the GDPR. The ICO says the effects of the amendment are not reflected in its paper. Nor could they be. The amendment was only agreed in mid-December and has yet to go to the Commons. It may be mid-February before we know for sure what the final version will look like.

That said, and assuming the final version remains the same as or similar to that agreed in the Lords, it is still too early to say to what extent such a code might lead to material differences in substantive practices between the UK and other EU jurisdictions. Even if it did, it is exceptionally unlikely the unique UK arrangements would give rise to enforcement action by the Commission before Brexit or, after Brexit, to a finding that the UK had not made “adequate” arrangements.

Risk assessment, “available technology” and co-ordination

There are countless references in the ICO document to the importance of risk assessments. In other words, while there are a number of “hard” rules, context is key.

This means, for example, if a company offers a service which could result in harm to a young person then it will be expected to go to greater lengths to ensure, among other things, that persons below its minimum age are not able to gain access to it.

There will also be an escalating expectation that Information Society Service providers (ISS) have an accurate idea of to whom they are providing a service, at least insofar as that knowledge may be relevant to possible harms.

There are many mentions of the importance of having regard to “available technology”. In other words, as more and better age verification solutions become available, or more and better ways of obtaining verifiable parental consent come on stream, ISS will be expected to avail themselves of them. In a modern-day equivalent of the old adage about living and dying by the sword, you need to keep up.

While the ICO does not allude to it in its note, it should be recalled that, in relation to a number of points concerning age and parental verification, the Article 29 Working Party (Article 29) referred to the desirability of DPAs across the EU developing a co-ordinated approach. Differences in available infrastructure within jurisdictions may make that difficult, and it is hard to know what the consequences of those differences might be. We shall all just have to hold our breath and see how it works out.

Ignorance will no longer be an excuse

In the past companies said they did not know the real age of their users. They were not under any obligation to verify anyone's age so they didn't. At most, they would simply kick someone off if they discovered the person had lied about it. Zero requirements meant zero incentive. This bred inertia. No longer good enough. The ICO says to every ISS:

If you aren’t sure whether your data subjects are children, or what age range they fall into, then you usually need to adopt a cautious approach.

This may mean:

  • designing your processing so that it provides sufficient protection for children;
  • putting in place proportionate measures to prevent or deter children from providing their personal data;
  • taking appropriate actions to enforce any age restrictions you have set; or
  • implementing up-front age verification systems.

The choice of solutions may vary depending upon the risks inherent in the processing.

Definition of a child

The section headed “About this guidance” opens by unambiguously stating everyone below the age of 18 is a child and it cites the fact that the UK has ratified the UNCRC as authority for that proposition. Every MS has ratified the UNCRC.

This is important for two main reasons:

  • because there has been a common misperception that, for GDPR purposes, a child is simply and only someone who is below the Article 8 age. In the UK this is going to be 13; in other countries it might be 16, 15 or 14;
  • because we are reminded (in Recital 38) that everyone below the age of 18 is entitled to “particular protection”.

The note also makes clear that a child enjoys all the same rights as an adult in relation to data privacy. In essence, the GDPR therefore creates extra rights which benefit children, i.e. a right to have their data considered and treated differently from that of persons over 18.

Moreover, the ICO reiterates the Article 29 view that counselling services are not required to obtain parental consent in respect of children below the Article 8 age.

The ICO further suggests, as a general rule

It is good practice to consult with children

when companies design their processes, and we are reminded that Article 12 of the UNCRC makes the same point.

Offered directly to a child

The ICO repeats the position taken by Article 29 when it says

….. an (information society service) is offered directly to a child when it is made available to all users without any age restrictions or when any age restrictions in place allow users under the age of 18.

Competence still matters

Hitherto the dominant legal consideration in the UK and many other countries was the individual child's actual ability to understand the nature of the transaction being put in front of them. This is a subjective test and in the UK it is known as the “Gillick Principle”, following a decision of the House of Lords in 1985. This notion was taken up in 1989 in the UNCRC, where it is framed as a requirement to have regard to the evolving capacities of the child.

To an extent the GDPR cuts across this. Article 8 establishes a “hard” age limit, in the UK's case 13, and says a child of 13 can give consent; there is no reference to the nature of the proposition to which they are consenting. Yet is there a hint that, actually, in some cases, even the 13 age limit may be a lot less “hard” than we thought?

The ICO tells us the

concept of competence remains valid

and that

fairness and compliance with data protection principles remain key concepts

Then these words appear

“In many circumstances, you (a data controller) may wish to continue to allow an individual with parental responsibility for a young child to assert the child’s data protection rights on their behalf, or to consent to the processing of their personal data.

 Likewise, if an older child is not deemed competent to consent to processing or exercise their own data protection rights, you may allow an adult with parental responsibility to do this for them.”

Hmm. A rich vein for lawyers to mine and companies to worry about I fear.

However, part of the significance of this becomes apparent later. While it is clear that children below the Article 8 age cannot consent, for example to joining an ISS, this seemingly does NOT mean that a child below the Article 8 age has zero data privacy rights independent of their parents, because

All data subjects, including children have the right to:

  • be provided with a transparent and clear privacy notice which explains who you are and how their data will be processed;
  • be given a copy of their personal data;
  • have inaccurate personal data rectified and incomplete data completed;
  • exercise the right to be forgotten and have personal data erased. See How does the right to erasure apply to children?
  • restrict the processing in specified circumstances;
  • data portability;
  • object to processing carried out under the lawful bases of public task or legitimate interests, and for the purposes of direct marketing. See What if I want to market to children?
  • not be subject to automated individual decision-making, including profiling which produces legal effects concerning him or her or similarly affects him or her;
  • complain to the ICO or another supervisory authority;
  • appeal against a decision of a supervisory authority;
  • bring legal proceedings against a controller or processor;
  • claim compensation from a controller or processor for any damage suffered as a result of their non-compliance with the GDPR.

And the ICO goes on to say

A child may exercise the above rights on their own behalf as long as they are competent to do so.

Note the absence, there, of any reference to a specific age.

Right of erasure

The same principle applies in respect of the right of erasure, i.e. the child holds this right on their own behalf and, at least in theory, its exercise does not depend upon the concurrence of a person with parental responsibility. Here is what the ICO says

Children have the same right to have their personal data erased as adults

 This right is particularly relevant when an individual originally gave their consent to processing when they were a child, without being fully aware of the risks.


It should generally be as easy for a child to exercise their right to erasure as it was for them to provide their personal data in the first place. 

In practice a child may need their parents’ help to exercise this right, and they can even authorise their parents to act on their behalf in such matters.

The three main routes to processing

The ICO outlines three principal ways (there are six in total) in which a child’s data may be lawfully processed: consent, legitimate interests and performance of a contract.


The issue of consent is the one with which we are all reasonably familiar, and we know that for children below the Article 8 age it will involve the business obtaining verifiable parental consent.

However, even for children above the Article 8 age, not only will it be important for the young person to understand what it is they are consenting to, their consent may nonetheless still be found to be invalid if it is apparent that what they are agreeing to is against their own best interests. More work for the lawyers.

Legitimate interests

In respect of legitimate interests, I suspect lengthier explanations, with several case studies or illustrations, will be required before many in the child protection world (me included) will fully “get” what is intended. For example, what are we to make of this, addressed to ISS?

Using legitimate interests as your lawful basis for processing a child’s personal data puts the onus on you, rather than the child (or adult acting on their behalf), to make sure that their data protection interests are adequately protected. You need to consider what the child might reasonably expect you to do with their personal data, in the context of your relationship with them.

 In practice, this means that if you intend to process children’s personal data you need to design your processing from the outset with the child, and their increased need for protection, in mind.

Performance of a contract

In the UK and many other countries, persons below the age of 18 do not have full contractual capacity, and therefore almost all of the agreements they might enter into are either void or “voidable”. The ICO refers to voidable contracts, but it would be useful for concrete examples to be given of how this aspect might work in practice in respect of some of the apps and sites most commonly used by persons below the age of 18.

Applicable law

Here the ICO says it is still not sure what the final view will be on the question of applicable law, save that, regardless of where an ISS is based, if it offers a “UK version” of its service or “actively targets” UK children then UK law and rules will apply. But if a UK business targets children in other MS it will need to be aware of, and comply with, their laws and rules. Apparently

In practice this may mean that the child needs to select, or confirm, their main country of residence when they give their personal data to an ISS; so that the ISS …. provider knows which age limit to apply.

That’ll be fun.

Profiling and codes

It is clear that individual industry codes of practice and industry standards are going to acquire considerable legal importance, particularly those devised by or through BEREC or made under the auspices of the Privacy and Electronic Communications Regulations.

And as for the rules about profiling… how these will work is going to be of absolutely vital importance, but their true meaning remains elusive. Concrete case studies are needed.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.