Article 29 and children

In the UK our Data Protection Authority (DPA) – the Information Commissioner’s Office (ICO) – has shown a real and sustained interest in how the new data privacy regime – the GDPR – is likely to impact children. ICO officials attended an all-day seminar organized by Professor Sonia Livingstone at the LSE. They shared their thoughts and interacted with a group of leading online child rights and child protection experts who came not just from the UK but also from other EU Member States. Several of the latter travelled to the LSE precisely because they knew the ICO would be represented.

The ICO subsequently issued a consultation paper and draft guidance specifically addressing the position of children in respect of all relevant headings of the GDPR.

The Article 29 Working Party

Each DPA in every EU Member State remains a sovereign and independent body, but they stay in touch and try to achieve a degree of consistency through something called the Article 29 Working Party. Article 29 has existed since the mid-1990s. It is being replaced by the European Data Protection Board.

Historically, Article 29 has not shown an excessive degree of interest in children as data subjects. One suspects they have a box marked “Very Difficult. Avoid If You Can. Only Open If You Absolutely Have To”. Children are in it.

In fact the only major Opinion Article 29 seems to have issued which concerns children dates from 2009, and it addresses children’s personal data held by schools. If you were to ask any randomly selected group of parents, children or policy makers about their main worries in terms of what is happening to young people’s personal data in the internet age, I doubt that how schools are handling it would be at the top of their list. It would be there, but not at the top. Silicon Valley meanwhile….

FabLabs

Once the GDPR became law Article 29 decided to organize a series of “FabLabs” to discuss what would happen next and get feedback from interested parties. I attended. There were three such meetings: one in July 2016, one in April 2017 and the last in October 2017.

In plenaries and working groups at the FabLabs a number of people suggested there should be a specific session on the position of children. On behalf of the European NGO Alliance for Child Safety Online and the UK’s Children’s Charities’ Coalition on Internet Safety I wrote to Article 29’s then Chairperson, Isabelle Falque-Pierrotin of the French DPA, to support that suggestion.

Article 29 did not agree. Neither did they expressly disagree. The request was just ignored. Thus, at an EU level there has been no open or extended engagement between the online child rights and child protection communities and the privacy community.

Since the GDPR was adopted Article 29 has published no letters about children. They have issued no press releases. No consultation documents or position papers on children have been sent around. Article 29 has issued no guidelines on children, although mentions of children are scattered among several of those that have appeared.

Save for the British texts, nobody has tried to bring together or discuss all the parts of the GDPR which concern children. Such a publication would help a lot of people gain a better overview of the diverse ways the landscape is changing.

In the report of the first FabLab no mention of children or young people appears.

In the report of the second FabLab the following words appear:

Minors are a priority but resolution lies at Member State level and the age verification of a minor is problematic. The verification of consent by the holder of parental responsibility is also problematic.

Interesting use of the word “priority”.

In the report of the final FabLab this appears:

Practical challenges for (data) controllers are arising since, while the GDPR recognizes the need for extra protection for children, it was pointed out that there is no clear indication of how the different stages of the development of a child should be taken into account when providing information.

Amen to that.

What are EU-wide institutions for?

Isn’t one of the points of having EU-wide institutions that, for example as in this case, the larger DPAs and the Commission itself can help out the smaller ones and together they can help each other? Are we seriously expected to believe we are going to see 27 different attempts to sort this out?

Why this catalogue of moans?

Less than three weeks ago, the pattern of neglect of children reasserted itself. The new Chair of Article 29 published a letter that had been sent to Facebook on the question of facial recognition and how it might be deployed in the future. No mention of children.

Facebook have said facial recognition will not be available to children, i.e. persons below the age of 18. Good decision. That being so, why didn’t Article 29 ask about the steps the company is planning to take to ensure the policy works?

Apps or services which broadcast location data also raise safety and security issues for children. This is not just about Facebook. It’s about the whole social media space.

Article 29’s ear-shattering silence does not inspire confidence

I appreciate there are lots of things that need sorting out in relation to the GDPR. I appreciate also that many DPAs are still angry about the way the GDPR turned out in respect of children. It is clear the privacy community wanted a single age for the whole of the EU and they wanted that age to be 13, the then de facto status quo. Echoes of resentment can still be heard about how, at the last minute, politicians stepped in and “messed things up” by allowing multiple ages.

Of the many lawsuits that lie ahead in which the scope and meaning of the GDPR will be clarified, there is unquestionably going to be one on children, and in that lawsuit Article 29’s lack of leadership is bound to be noted and criticised.