Problems with the GDPR

The EU recently undertook a review of the first two years of the GDPR's operation. If you missed it, you are not alone: the existence of the review was not well publicised. It will report soon, but its focus was in any case too narrow, and I hold out no great hope of seeing the much-needed improvements that would help children. However, CNIL (the French national data protection authority) is conducting its own review, and here the terms of reference are much wider. I pass on these thoughts to them, addressing the points in their questionnaire along with other matters.

A neglected group

Children make up 1 in 3 of all internet users in the world. In some countries this rises to 1 in 2; in high-income countries such as France and the UK the proportion hovers around 1 in 5. Thus, whichever way you look at it, and whatever else you might believe or want the internet to be, it is unquestionably a medium for children and families. Too many people appear not to know or accept that.

Arguably, children are the world's largest single, identifiable constituency of internet users. Unarguably, they are the world's largest single group of vulnerable users. Yet time and again children appear to have been put in a box marked "too difficult" when it comes to data protection and privacy concerns. Children are constantly forgotten or overlooked, which is another way of saying "neglected". In the whole of its life the Article 29 Working Party produced only one substantial report on children. That was in 2008, and it principally concerned schools' handling of students' data. Important, but not exactly pressing up against the far edges of the techno-horizon being ushered in by the internet.

Like moths to a flame

The modern internet evolved largely as a set of services floating on a sea of data used to fuel targeted advertising.  It was shaped and developed by techno-advertising companies presenting themselves as disruptive rebels. Creative spirits of a new age. Fabulously wealthy, cool and at first sight overwhelmingly benign.

Against this background it is hardly surprising that lawyers were drawn to the new frontier, along with the technical experts who support them. Whether as employees or retained consultants, most of these lawyers and geeks consequently developed a sophisticated understanding of the immediate, cash-generating needs of their paymasters. They did not contemporaneously develop a comparable appreciation of the position of children as end users. No law compelled businesses to do so, and so they didn't; there is, by contrast, always a compelling need to increase sales and stay ahead of the competition.

Platform immunity, and the absence of any obligation even to try to confirm a user's age, pretty much guaranteed what happened next. Like moths to a flame, albeit for different reasons, gigantic numbers of children were also drawn to the "new cool": places not meant for them, not understood by parents or teachers. Which made the whole thing even cooler.

A lack of expertise

Whatever you make of my reading of history, it is incontrovertibly the case that there have never been the same incentives or possibilities to develop a countervailing body of legal or technical knowledge, expertise or institutions that is readily and continuously accessible to impecunious children's groups. The playing field remains massively tilted. Ad hoc pro bono assistance is welcome when and where it is available, but it is no substitute for solid, ongoing professional engagement.

Expertise, research, guidance and clarification

  1. National Governments, the European Commission, the European Data Protection Board and national data protection authorities must strengthen their own expertise and understanding in relation to children as actors in the digital environment.
  2. Inter alia, this should be built upon a solid, substantial and publicly available evidence base regarding children's use of digital devices, apps and spaces. In the USA the FTC is being urged to engage in just such a major evidence-gathering exercise as part of a process which may lead to revisions to COPPA. Perhaps there is scope for Europeans and others to co-operate in that endeavour? As usual, any changes made to operating rules in the USA will have a global impact, so this would be logical.
  3. Civil society organizations should be helped to improve their understanding of the position of children as data subjects.
  4. Ways should be found to ensure civil society organizations have access to expert technical and legal advice when pursuing privacy issues relating to children as actors in the digital environment. It should be made easier for class actions to be brought to settle disputes likely to affect significant numbers of children.
  5. Individual companies and industry-based regulators should be given detailed guidance on what is expected of them. The UK's Age Appropriate Design Code amplifies key provisions of the GDPR in ways which business can readily understand and act upon. The Australians have been considering moving in a similar direction.
  6. There is persistent confusion about the nature and scope of what constitutes "sensitive" data, particularly in respect of inferred data which, almost by definition, cannot have been obtained with explicit, informed consent. In respect of children, what are the rules governing how such data may be processed and stored?

Correcting a major error

  1. In the original proposal for the GDPR, issued by the European Commission in 2012, there was no mention of ICANN or WHOIS. In none of the subsequent proceedings in the European Parliament, either in Committee or plenary session, neither at the Council nor during the Trialogue, was ICANN or WHOIS mentioned in any way whatsoever, directly or indirectly.
  2. This led to two different but quite specific difficulties which need to be urgently addressed. They have profound and wide-ranging effects on children's rights.
  3. The first concerns the ease with which WHOIS data might be accessed, and by whom. Commercially driven forces within and around ICANN used the GDPR's failure to address WHOIS to bring an end to practices which had existed since Day 1 of the internet. The cost, complexity and time it now takes to access WHOIS data mean levels of online crimes against children, and many other kinds of online crimes, continue unabated or get worse. It is hard to believe this is what the European Institutions intended or anticipated. How could it be, when the issue was never discussed?
  4. Secondly, a distinct problem concerns the accuracy of the data within WHOIS. Whatever the rules about access might say, if someone intent on distributing child sex abuse material via a website knew their name, address and contact details had been accurately recorded and could be seen by anyone anywhere on Earth, it is hard to believe they would still allow such a site, linked to their name, to be used for criminal purposes. Yet within WHOIS accurate data are the exception, not the rule. ICANN has recently taken decisions which make it likely the levels of inaccuracy will increase, not decrease.
  5. There are provisions within the GDPR which refer to the importance of maintaining accurate data in databases, but these provisions are poorly enforced and the penalties are in no way a sufficient deterrent. The penalties for internet Registrars and Registries which do not verify WHOIS data before selling, renewing or recording a domain name should be substantial, and the penalties for persistent failure should be tough. As the body ostensibly with the power and initial authority to enforce WHOIS rules, ICANN should itself be drawn into the line of fire if it persistently fails to ensure its own rules are fit for purpose and are honoured.
  6. ICANN should be placed under an explicit obligation to have regard to the way in which its systems facilitate unlawful behaviour or make the job of potential plaintiffs or law enforcement agencies more difficult and costly than it need be. Studied indifference towards the real-world impact of its behaviour must become a thing of the past for ICANN.
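The access and accuracy problems above can be made concrete. WHOIS responses are plain "Key: Value" text (the protocol is defined in RFC 3912), and since 2018 most registrars return placeholder values such as "REDACTED FOR PRIVACY" for the contact fields. The sketch below is illustrative only: the sample record is invented, and real WHOIS output varies by registry.

```python
# Sketch: parsing a WHOIS record. The sample is hypothetical; post-GDPR,
# registrant contact fields are typically withheld as shown.
SAMPLE_RECORD = """\
Domain Name: example.com
Registrar: Example Registrar, Inc.
Registrant Name: REDACTED FOR PRIVACY
Registrant Email: REDACTED FOR PRIVACY
Creation Date: 1995-08-14T04:00:00Z
"""

def parse_whois(text: str) -> dict:
    """Split a WHOIS response into a key -> value mapping."""
    fields = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")  # split on first colon only
            fields[key.strip()] = value.strip()
    return fields

def redacted_fields(fields: dict) -> list:
    """List the fields the registrar has withheld from the record."""
    return [k for k, v in fields.items() if "REDACTED" in v.upper()]

record = parse_whois(SAMPLE_RECORD)
print(redacted_fields(record))  # the contact details an investigator cannot see
```

The point of the sketch is the output: the fields an investigator most needs (who registered the domain, and how to reach them) are precisely the ones now withheld.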

Strong encryption

  1. The early drive towards more widespread use of encryption was hugely important. Should an organization's defences fail and hackers get on to its servers, the fact that all stored data are encrypted is a vital last line of defence. Encryption in transit likewise ensures that data intercepted while moving across a network cannot be read. Vital communications between critical national infrastructure facilities, supply chains, online banking and other use cases remain obvious candidates for the deployment of strong encryption.
  2. However, we are approaching a point where encryption is being used at large in generic environments not as a defence against crime but as an enabler of or cover for it.
  3. Measures which have been developed to work at scale to defend children, for example PhotoDNA, are threatened with redundancy by the deployment of end-to-end encryption. Look, for example, at what Facebook is threatening to do with Messenger and Instagram. Facebook, however, is only in the limelight because it previously released data showing the level of criminal abuse of its services. There is no reason to suppose things are significantly different elsewhere.
  4. Moves towards encrypting even metadata will complicate the fight against online crimes against children yet further.
  5. The idea of the Rule of Law presupposes the possibility that the law can be implemented or enforced yet the way strong encryption is spreading threatens to create large spaces where courts in every country in the world will be rendered impotent. For all practical purposes their subpoenas and orders will be nullities.
  6. Careful consideration needs to be given to how this problem should be addressed.
  7. At the very least companies offering services which are used by children will need to explain why they have intentionally deprived themselves of the ability to protect children, both generally and specifically in relation to personal data.

A question of age

  1. The GDPR strongly suggests that where a service is provided for or is meant to be limited to groups or individuals defined by reference to their age, the service provider should take all reasonable and proportionate steps to ensure those provisions or limits mean something.
  2. This should be the case wherever an age limit is stipulated in a company's Terms and Conditions of service. Where an age limit is also stipulated by law, penalties for breach should be higher than they would otherwise be.
  3. In Germany there appears to be a greater willingness to engage with and approve technical solutions which assist in determining a person's age while remaining respectful of the individual's privacy.
  4. Age is an obvious and important reference point, but what matters is whether or not children are actually using a service, irrespective of whether or not the service is intended for them.