Should I laugh, cry or emigrate?

A little while ago I wrote a blog in which I lamented the way in which the EU had decided – in the name of net neutrality no less – to ban internet access providers from turning on child protection filters by default. This has particular consequences for the UK, but it also, rather obviously, affects what ISPs, mobile phone companies and WiFi providers in every EU Member State might think about doing in the future to protect youngsters.

The decision was taken within a “Trialogue”. There had been no notice or any kind of prior indication that such an option was even on the table. It goes without saying, therefore, that there was zero consultation with child protection organizations or with independent experts engaged with the online space about the potential impact of such a decision on children. Officials within the Commission who have a great deal of knowledge of these issues were unaware of what was being proposed until it was too late to do anything about it. This makes a mockery of the idea of “evidence-based policy-making” and of multi-stakeholderism. But it also suggests something deeper and much more worrying. I will return to this dimension in a moment because it illustrates the core point I want to make in this blog.

At a conference on data protection and privacy

Last Thursday (10th December) I was in Brussels for a conference on data protection. The opening speaker was Madame Jourova, EU Commissioner for Justice, Consumers and Gender Equality. Commissioner Jourova has lead responsibility for children’s rights. Her speech focused on the much-anticipated General Data Protection Regulation (GDPR) and the implications of the Schrems case. She failed to mention children.

The GDPR is currently in a Trialogue process. I was keen to hear what was happening.

In the Q&A that followed Madame Jourova’s speech I was called. This was my question:

In the wider context of the EU’s push towards a Digital Single Market what impact do you think the GDPR will have on the position of children?

Here is Madame Jourova’s verbatim reply:

The GDPR is of fundamental importance to the development of the Digital Single Market.

I was given the opportunity to come back so I repeated my question, underlining that I was asking specifically about the position of children and the GDPR. Madame Jourova looked puzzled. At this point another person on the platform jumped in to answer the question on Madame Jourova’s behalf. This was Julie Brill, a US Federal Trade Commissioner.

Julie told the audience that the EU’s GDPR contained provisions similar to those which operated in the USA. This is the so-called Rule of 13. The rule requires commercial companies to obtain parental consent before they knowingly collect any personal data from persons under the age of 13. It is the legal basis on which under 13s are formally banned from US social media platforms because the companies that own them do not want to engage in the process of obtaining parental consent.

Julie Brill was wrong, but I did not want to point that out right there in front of the 200 or so people present. That would have been unkind, given that she had so obviously been moved by a generous impulse to help out Madame Jourova.

At one stage there certainly had been a suggestion that the EU would adopt the Rule of 13, but some time ago the authoritative word on the street was that that idea was dead. I had been told the most likely replacement was the status quo, i.e. it would be left to each Member State to determine its own age of consent for data transactions. My question to Madame Jourova was therefore also partly a fishing expedition to see if she would bring us all up to date. As you can see, I had no luck there.

Later, while on her way out of the conference hall, Madame Jourova very kindly came over to me and most graciously apologized for not having heard my question correctly.

But why did no one else step in?

There were at least two other senior officials from the Commission in attendance: Paul Timmers of DG Connect and Paul Nemitz, who works in Commissioner Jourova’s Directorate. Neither then nor later in the day did they choose to clarify, comment on or provide any further information about what was happening in the Trialogue in relation to the position of children. I guess it’s possible they didn’t know.

However, another brave soul did volunteer. Sort of. Wojciech Wiewiórowski, the Assistant European Data Protection Supervisor, was also in the room. In a later session, again following a question from me from the floor, he pointed out that because no one from his organization was involved in the Trialogue he couldn’t be absolutely certain what the latest position was, but he told us he understood the issue of the minimum age at which a young person could surrender their data to a commercial third party was being hotly debated. I was soon to discover Wojciech was the nearest to being right. Don’t go away.

How quickly things can change

The following day – so that’s the early afternoon of last Friday (11th December) – I got back to London. As I stepped off the Eurostar in St Pancras my phone rang. It was a friend from an internet company. Had I heard the news? The news turned out to be that a week earlier in the Trialogue it had been agreed to make 16 the minimum age at which commercial companies can solicit or knowingly accept personal data from individuals without first obtaining parental consent. I was told it was a done deal. 16 was going to be the new standard for Europe.

Obviously I hadn’t heard anything of the kind. Moreover, if any of the people at the previous day’s conference whose names I listed earlier did know about this development but chose to say nothing to the people attending, I am rendered speechless. Almost.

A bolt from the blue

Precisely because 16 was such a radical departure I was sceptical. I wanted some sort of official confirmation. Late Friday afternoon I wrote to the Commission. I have not yet had a reply, but on Saturday I received confirmation from another source (one directly involved in the Trialogue) that 16 was in play, though this source pointed out that the Trialogue has at least a week to run. Does that mean it is not in fact a “done deal” and it could still be changed? Who knows?

A campaign is underway

As Friday progressed and Saturday arrived I was being bombarded with requests to write a letter of protest objecting to 16. I have not acted on any of those requests. I know this blog probably sounds a bit like such a letter but in reality it isn’t, at least not in relation to the substantive point. Stay tuned.

I was told by some of the individuals bombarding me that Google, Facebook and the US companies are “furious”. I can well believe they are not too happy, but keeping companies cheerful is not my main role in life.

Why has 16 been proposed?

In my email to the Commission I asked, if the Rule of 16 had indeed been proposed, what rationale or justification for it had been offered? The fact that I have never before heard anyone argue for 16 as a new baseline is one thing. Maybe I had missed a step somewhere. Is there new research I don’t know about? On a matter of such moment, surely 16 wasn’t just plucked out of thin air by a politician to split an imagined difference? Was it?

I can quite understand why US online businesses would like to stick to 13 as the baseline age. They have built their systems around it but here’s the thing: 13 was adopted as the age limit in the USA under legislation passed in 1998 and brought into force in 2000, in other words before the age of social media. The law was designed to protect children (defined as sub-13s) from receiving ads from commercial companies unless their parents specifically agreed to allow them to receive such ads.

It was not envisaged or intended that 13 would be a baseline for young people’s privacy across the board. Moreover, the idea that there should be a single privacy standard which applies equally to 13-year-olds and 17-year-olds is ridiculous.

Young people do a lot of growing up between being 12 and becoming a legal adult at 18. Some things it might be appropriate for a 17-year-old to speak about or post would not be appropriate for a 13-year-old. We therefore probably need more than one age standard, depending on the type of social media conduct or data being discussed. I appreciate this would make life complicated for online companies, but that ought not to be the acid test.

Not a child protection measure

Aside from being unsatisfactory in the way I outlined in relation to young people and privacy, the current set-up does nothing, or at any rate very, very little, to protect children (sub-13s) from engaging in interactive environments that are not meant for them.

In a study of seven EU Member States, an average of 39% of 9-12 year old internet users had Facebook accounts. This rises to 66% in Romania. Children younger than 9 are also present. How can this be? Easy. Under US Federal law companies are not obliged to check or confirm ages. So they don’t. As I said earlier, they just ban under-13s.

Now I know that the larger, better known and responsible social media sites energetically look for any sub-13s who are on their sites and kick them off if they find them, but it is plainly absurd to form policy for the whole of the EU for the indefinite future based on the proposition that everyone behaves in the same way that Facebook, Google, Microsoft and Ask.fm do at the moment. We know there are many rogue social networks today and there could be a lot more tomorrow. Self-regulation has not been working well enough. An opportunity like this may not present itself again for a long time. Carpe diem.

We all know why the Rule of 13 does not work, but whatever the reasons for its laughable, large-scale irrelevance, the present situation is unacceptable. It brings the whole idea of rules into disrepute.

The Rule of 13 is not fit for purpose. It only survives in the USA faute de mieux, but nobody is (admitting to) looking for anything better. That being so, why should we adopt or embrace it as our standard? That would give it a degree of credit or acceptance it does not deserve. We would be helping to consolidate or perpetuate it.

Where is the research evidence?

Where is the study which specifically supports 13 as a privacy standard? I know of none. I know people infer that 13 is OK by reference to other research, but that is really not good enough. More than anything, we need a solid research base to guide policy makers when taking decisions about age and about young people’s competence to make different kinds of decisions about their privacy without having to involve their parents.

Unless and until we have that I refuse to get drawn into a phony argument about 13 versus 16. It is just so much hot air, heated by vested interests or a timid desire to stick with the familiar. In addition, absent an obligation to verify a stated age, you can draw the line wherever you like as far as I am concerned. Nothing will change. Instead of lying about being under 13, more young people will tell fibs about being under 16.

I can see a case for the GDPR not setting any uniform age standard for the whole of the EU, thereby leaving it to each Member State to set their own. That’s the status quo, but it sits ill with the idea of promoting a digital single market. Yet this is almost certainly preferable to becoming wedded to a useless standard and being stuck with it, possibly for a great many years. Children and young people have been ill-served by this muddle and I am afraid the Commission must shoulder most of the blame.

I applaud anyone who does not want to roll over and acquiesce in an ineffective approach pioneered in the US – just because it already exists – but the Commission could and should have taken it upon themselves to ensure there was no doubt or dispute about what a better alternative would look like and how it would work. In that they have failed.

Must do better

Thus, here is my main point, and I apologise for taking so long to get to it. What this sorry saga reveals is not just that Trialogues are a dodgy way of making decisions (they are currently being investigated by the European Ombudsman). More than anything, it shows the world that children’s rights as internet users are not being treated with the respect or seriousness they deserve by the Commission and the other key EU institutions. They can deny it as often as they like, but actions and inaction speak louder than words. We have to find a way to make this better, and that must include helping some of the Member States’ governments and both national and European Parliamentarians to get up to speed and stay up to speed.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.