The EU’s Digital Services Act

On 8th September an EU consultation closes. It concerns a proposed new Digital Services Act (DSA). The Act will provide a once-in-a-generation opportunity to change the internet’s ground rules. Children’s advocates need to get busy.

Reforming the e-Commerce Directive

In the past twenty years or so the internet has changed almost beyond recognition. When the EU adopted the e-Commerce Directive in 2000, children in many EU Member States were a very small proportion of internet and mobile or smartphone users. Social media sites and services barely existed. Apps as we now know them were some way off. Tablets were something you got from the doctor.

In 2000 the technology was still relatively new and poorly understood outside a narrow circle. Business asked Governments to “stay out of the way and let us innovate”. They got their wish. If problems arose, tech companies assured everyone they would “do the right thing”. This was called “self-regulation”.

It hasn’t worked. Or rather, its successes have been far too limited and inconsistent. By giving online businesses an almost unique form of legal protection, the e-Commerce Directive created a perverse incentive to do nothing. Many did exactly that: nothing, or not enough. Every tech company says it takes children’s rights, including children’s safety, “very seriously”. Yet look where we are.

Today in the EU 90 million internet users are children. That is one in five of all users. Families and children are a major and persistent presence in the world of digital technology. Whatever else it might be, in 21st Century Europe the internet is a consumer product. The internet and its associated access devices must start to comply with standards commonly found in the consumer space.

Need to press the reset button

The terrible things that have happened online to far too many children are not an unavoidable price which has to be paid in perpetuity so as to continue enjoying the many benefits of the internet. But to change the paradigm requires a major act of political will. The EU needs to press the reset button. The e-Commerce Directive is in need of a major overhaul. That’s what the DSA will do.

Beware the Brussels-Strasbourg cocktails-and-lobbying circuit

In any lobbying or campaigning work any of us might do as the DSA processes evolve – and we are probably talking years – it is impossible to over-emphasise the importance of not falling into the trap of thinking everything will be settled by Commission officials and, assuming it ever gets going again, the Brussels-Strasbourg cocktails-and-lobbying circuit.

A major part of the decision-making machinery is the Council of Ministers. This consists of Ministers from each country, typically supported by civil servants and advisers in their own national capital and their permanent delegation in Brussels.

On matters such as these it is vital each of these elements and MEPs know how strongly people feel “back home”.

Briefing document heading your way

I have prepared a (short) briefing document which sets out my own views on the key strategic reforms that are needed. Although I wrote it, it is the product of many discussions with experts from several different disciplines and geographies.

The briefing paper is linked to another (slightly longer) document which acknowledges that, while the EU has been a major world leader in many areas connected with children’s safety and children’s rights online (and being kept safe is a major right), there have also been some spectacular failures that need to be corrected. Now is the time.

Watch out for these documents in your inboxes in the coming days. Use, adapt or ignore them as you like in any campaigning you undertake. Hopefully we can join together in some way to make our collective voice louder.

The signs are good

In the past couple of months we have seen Vice President Šuica’s initiative, “Delivering for children: an EU strategy on the rights of the child” and Commissioner Johansson’s Communication on a “Strategy to combat child sexual abuse and exploitation” with its major emphasis on the position of victims of sexual abuse, online and off. There has also been a Communication on areas of the GDPR that need another look in relation to matters affecting children. Many pieces of the jigsaw are coming together about now.

Add that to the fact almost every major tech company accepts reform of the rules is required, and you can see why I am feeling optimistic. But not naively so.

Optimism can be the graveyard of fools

For all the fine words we are hearing ahead of the match, we have to expect two things.

Whatever large or small tech companies say in public about how much they recognise the need for a new regulatory framework, when it comes down to the nitty-gritty detail don’t expect their views and ours to be the same. We will not all be holding hands and singing in harmony from the same hymn sheet. That will put a strain on some of the vaunted “partnerships” that exist.

Then there are the usual suspects in civil society. Many are not quite as starry-eyed as they once were about the, as they saw it, “freedom-loving, insurgent Mother Teresa goodness of Silicon Valley”, but we know from bitter experience they will generally find a reason to put children’s interests, children’s rights, lower down the list.


Posted in Default settings, Internet governance, Regulation, Self-regulation

Beware the harmful algorithm

These past few days British media outlets have been full of stories about the scandalous way 17 and 18 year olds have been dealt with following the cancellation of ‘A’ Level exams because of the virus. At the root of the problem was an algorithm. Or rather, it was probably not the algorithm itself that was the problem but how it was applied.

‘A’ Level results essentially determine which University or other educational or training opportunity you end up with as your life journey moves to another level at the end of your time at school. It is usually a pivotal moment in a young person’s life. Not necessarily decisive, but hugely important.

Teachers’ predictions

For readers outside the UK: this year every young person’s teachers were asked to predict the results they would have obtained had they sat ‘A’ Levels. This happens every year so it is possible to compare predictions with actual outcomes. A great many children perform exactly in line with predictions. A great many do not. That gap is important because it is populated by people, not robots.

In the usual way, on the basis of teachers’ predictions University places and the like were offered, or not.

Devising an algorithm 

In the absence of actual exams how were the authorities to determine the final results? The answer they all came up with was to devise an algorithm then apply it to the mass of data showing the predicted grades.

The process of devising the algorithm appears to have started by looking at historic data for the subjects concerned, and the type of school concerned, for which read the demography of its intake.

These data would show that in school A, in a certain kind of area, x% of students could be expected to receive top level grades in Maths, y% would get the lowest grades in History and so on, subject by subject, school by school.

Class size matters

One of the other factors in the equation was class size, which is typically a proxy for parental income. Children in smaller classes tend to do better than children in larger classes. Who knew? “Smaller classes”  is often just another way of saying “private school”  or a school in a prosperous part of town. Which tends to get us back to parental income.

According to this way of looking at the matter, year after year a predictable proportion of young people will get a certain spread of grades and therefore end up going to a certain spread of Universities or whatever.

But here’s the kicker. Teachers’ predictions for each child were the very last factor to be entered into the calculations. And it looks like they counted for a lot less than the “framework” established by the historic data.
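The mechanics just described can be reduced to a few lines of code. What follows is a deliberately simplified sketch, not the actual model: every name, grade scale and number is hypothetical. It shows the core problem. Once results must fit a historic distribution, teachers’ predictions determine only where a student sits in the queue, while the school’s past results decide what grade each student actually gets.

```python
# A toy sketch of the standardisation process described above. Purely
# illustrative: the real model was far more elaborate, and all names,
# grades and numbers here are hypothetical.

def standardise(teacher_predictions, historic_distribution):
    """Force a cohort's grades to fit a school's historic pattern.

    teacher_predictions: list of (student, predicted_grade) pairs,
        where 1 is the best grade on this hypothetical scale.
    historic_distribution: counts per grade (index 0 = grade 1, etc.)
        from the school's past results; must sum to the cohort size.
    """
    # Rank the cohort by their teachers' predictions, best first.
    # The predictions only decide the ORDERING of students...
    ranked = sorted(teacher_predictions, key=lambda pair: pair[1])

    # ...while the historic pattern decides the grades actually awarded.
    awarded = []
    i = 0
    for grade, count in enumerate(historic_distribution, start=1):
        for _ in range(count):
            student, _predicted = ranked[i]
            awarded.append((student, grade))
            i += 1
    return awarded

# Teachers predicted four top grades in a cohort of five...
predictions = [("Amy", 1), ("Ben", 1), ("Cal", 1), ("Dee", 1), ("Eve", 3)]
# ...but historically the school produced one grade 1, two 2s, two 3s.
print(standardise(predictions, [1, 2, 2]))
# Three of the four students predicted a grade 1 are marked down.
```

In this toy run, four of the five students were predicted the top grade, but the school’s history only “allows” one, so three are marked down regardless of how well their teachers know them. That is, in miniature, the complaint set out above.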

“A process” became a cruel farce.

Marked down

The process has had terrible consequences for huge numbers of children. Why? Because in making the end result fit the pattern of previous years, large numbers of youngsters were marked down from their teachers’ predictions.

In defence of the system, some officials tried to argue that because teachers from certain types of schools, seemingly with equally certain mathematical predictability, are more prone to overestimate a child’s performance than teachers from other schools, the large-scale marking down was justified.

Dazzled by the maths. Lost sight of the young human being

If you happened to be in an improving school about to register its breakthrough moment, well, that’s just bad luck according to the faceless ones who gave their blessing to all this. In allowing themselves to be dazzled by the logic of the maths they lost sight of the humanity required when handling young people’s dreams.

I listened to two distinguished statisticians explain that “Algorithms work extremely well for populations, but not necessarily for individuals.” They said this quite dispassionately and not in any way to justify what happened this year.

A system that only works for populations and allows for substantial injustice to be suffered by an individual is a system not worthy of the name. Young people can improve their individual performance between mocks and finals. Poor mock results are often the spur to pull your finger out. That’s not something a blunt instrument like an algorithm can detect. With young people’s lives we need precision lasers, not hammers.

A bureaucrat cannot be allowed, with a shrug of the shoulders, to sweep aside and crush a young person. I am tempted to invoke historical examples of other forms of brutal indifference to the individual in the interest of “the plan” but the point is obvious to anyone with a brain and a heart.

Maths first. People nowhere. Not acceptable.

Worcester College, Oxford shows the way

Starting with Worcester College, Oxford, some Universities have declared they will accept the teachers’ original predictions and ignore any marking down. I applaud them. How easy it will be for others to follow suit I don’t know, but they should all certainly make the effort and stand ready to explain why they didn’t.

GDPR no help but…

Article 22 of the GDPR says individuals shall have the right “not to be subject to a decision based solely on automated processing, including profiling…”

That is pretty close to what has happened here although, because it isn’t exactly the same, Article 22 is of no use (the process was not based ‘solely’ on automated processing).

Yet the spirit of Article 22 is clear. There is an appeals process but it costs money, and it may not be possible to complete the vast numbers of appeals now anticipated before the University and other terms begin.

With the inevitable drop in students from overseas (maybe even all Chinese students) why can’t everyone do what Worcester College did? In the next two to three years perhaps there will be a higher drop-out rate, but at least each student would know this was down to them, not an invisible, unaccountable hand in Whitehall, Holyrood, Belfast or Cardiff. And set against that, there will be some students who get into University or another place who otherwise might not have done. And they will shine.

We should not have to choose between injustices

One last word: I have heard people complain that children from private schools and more affluent parts of town fared less badly than children from elsewhere. That is the inevitable result of how algorithms work. If there is bias in a system it will be reflected in the data and be reproduced by it.

This is not a reason to punish the children of well off families. The individual is what matters here, not their parentage. Insisting that little Billy Rich or Jenny Affluent does not get the University place he or she dreamed of because Frankie Poor and Lucy Skint from down the road didn’t get theirs is the wrong answer to the wrong question.

There are lessons here for all of us as algorithms seem set to play an ever more important role in the way all kinds of things work. Particularly over the internet.

Posted in Privacy, Regulation, Self-regulation, Uncategorized

“Good Pictures Bad Pictures”

I have just finished “Good Pictures Bad Pictures. Porn proofing today’s young kids.” by Kristen Jenson.

I have to say, right off, the book is excellent. It is not preachy, moralising or judgemental. In crisp, clear and concise prose it describes a conversation between a mother and, I would guess, her nine or ten year old son. Dad makes an appearance towards the end so it’s not just Mum on a solo run.

The author suggests a parent sits down with the book and talks it through with their child, chapter by chapter, going at their own pace. The style and clarity of the writing should make that very easy to do. There are two or three questions at the end of each chapter which will help embed the learnings of the previous pages.

It’s all about the science

What particularly impressed me about the book was the way it bases its explanations  and advice on the science of adolescent brain development. Whatever your view about adults’ consumption of porn, you will be left in no doubt about why it is important to keep it away from kids.

Porn sites are not educational aids. They are purveyors of lies. Harmful lies.

Posted in Age verification, Pornography, Regulation, Self-regulation

Age verification. Movement in South Africa

The South African Film Board is charged with developing regulations to give effect to their recently adopted law requiring age verification for pornography sites. A consultation was held on how the regulations might be implemented. It closed yesterday. After thanking them for the opportunity to comment, this is what I said:

I only have a few, limited points to make about your proposed regulatory regime.

  1. You should ensure your regulations make clear the Board will not accept as valid any age verification solution provided or supplied by any company or organization with economic or other material ties to any entity connected with the publication of pornography. This is vital to maintain public confidence in the age verification process.
  2. The tone and manner in which the age verification solution is presented or marketed should stay tightly focused on the protection of children from age inappropriate content.
  3. Age verification is about upholding children’s rights. It is about meeting states’ obligations to protect children, for example under Articles 19 and 36 of the UN Convention on the Rights of the Child.
  4. Thus the policy should not appear to be intent on making access to pornography difficult, nor should it be possible to see it as, or believe it to be, an “anti-pornography measure”. Practically every publisher of porn acknowledges their material is not meant for children’s eyes, but without a law requiring them all to use age verification it is impossible to make an age verification policy work.
  5. Any age verification solution should have only one objective: determining that the person who wishes to access a pornography site is 18 or above. The individual’s identity, e.g. name, address or other identifying features, is utterly irrelevant, excessive and over-intrusive. Collecting it will be seen as a threat intended to discourage people from accessing pornography sites. There are several age verification solutions available which do not require personally identifiable data to be recorded or retained yet can provide robust evidence that an age verification process has been completed.
  6. The public needs to have the highest confidence in the age verification solutions providers’ respect for their privacy. Compliance with strict privacy rules should be mandatory from Day 1.
  7. It would be greatly to everyone’s advantage if age verification was not solely or principally associated with pornography. There are other classes of audio visual materials which are supposed to be restricted to adults where the same regime should apply.
  8. A power to block access to non-compliant sites is essential.
  9. A power to direct payments companies, advertisers or ancillary service providers not to engage with non-compliant sites is also essential.
  10. Pornography sites should not be allowed to promote or be associated with VPNs or any marketing or other measures which would be likely to facilitate or encourage evading the age verification regime.


Posted in Age verification, Pornography, Privacy, Regulation, Self-regulation

The EU’s strategy for a more effective fight against child sexual abuse

I said in an earlier blog I would write more fully about the announcement last week of the European Commission’s new strategy for a more effective fight against child sexual abuse. I wrote that when I had only skimmed the document. I have now read it from end to end and I have changed my mind. The reason is obvious. The document, known in Commission parlance as a “Communication”, is itself a summary, and a pretty intense and dense one at that.

It is both brilliant and comprehensive, packed with statistics, references and ideas. It would be ridiculous to try to compress it further. I’m afraid you will just have to read it yourselves. It’s worth the effort. And to be clear, the document is categorical that any strategy worthy of the name must recognise that addressing child sexual abuse in the physical environment is every bit as important as, and integral to, tackling it online.

Internal structures?

If it is light in any area it is in respect of the Commission’s own internal structures and processes. Children’s issues are cross-cutting in nature. The atomised way Commission Directorates operate militates against anyone having a 360 degree overview.

This must change. There needs to be someone close to the very top, supported by an expert team, who can see things early in the policy development process. They should have the authority to step in and insist that the impact on children is weighed in the balance.

I recently wrote about the near miss with the European Electronic Communications Code but there have been other examples before that.

The example of postal services

I remember ages ago attending an event in Brussels where a Commission official was speaking eloquently about smoothing out inconsistencies in relation to sending goods by post across national boundaries. The purpose of his proposals for a new approach was to stimulate more economic activity between Member States.

In Country A it might cost 2 Euros to send a 1 kilogram package to Country B, the border of which might be only a few kilometres away, yet to make the same trip in reverse it might cost 90 Euros because a border was crossed, or only the same 2 Euros if the identical item went for hundreds of kilometres entirely within the territory. The formalities in Country A might also be very different from those in Country B.

I could see the logic of smoothing out inconsistencies in pursuit of the higher objective of creating a single internal market, but I pointed out that, in respect of children, not all EU Member States had the same regulatory environment in terms of what children could buy online or consume in the physical world. So wasn’t there a risk that, without more, this measure was going to threaten… you see the point.

The chap was disarmingly frank in his reply. “No one has ever mentioned that before. It’s not the sort of thing we ever think about.” Or words to that effect.

A new European Centre to prevent and counter child sexual abuse

It would be absurd to suggest this was the crowning glory of the Commission document but the importance of creating a new European Centre to focus on child sexual abuse cannot be over-emphasised. Few Member States have the capacity to keep tabs on all the research going on around the world or all the developments that impact on this area of work. There are even fewer that, on the scale required, can initiate research and evaluate outcomes. This means learning about new and valuable ideas can be a bit haphazard. People on the inside track find out fast. People on the outside track don’t.

This proposed new Centre could change all that. It will be able to strike up partnerships and engage with key players outside the EU. Disseminating information in optimal ways will be a core task.

The Centre should aim to become a global resource working cooperatively with existing and new poles of expertise. There is plenty for everyone to do.

Again a word of caution. For this Centre to be truly effective it must find a way of strengthening the capacity of civil society to participate in its work. There has to be a strong independent voice that will not feel constrained about speaking truth to power because power is tied up in trade talks with a third party nation about whale meat or because it might be uncomfortable to highlight the shortcomings of a prominent partner. When things get cosy they go wrong. But who doesn’t like cosy? Cosy is the new entropy.

Challenges ahead

The Communication contains many references to the Child Sexual Abuse Directive (2011/93), a truly ground-breaking initiative at the time which, if memory serves, was inspired by or grew out of an earlier Communication of some sort. The new Communication acknowledges there have been difficulties in securing full transposition and implementation of the Directive. So much so that the Commission has already initiated infringement proceedings against 23 out of a possible 26 Member States (Denmark is exempt) and, according to footnote 22, in the cases of Ireland, Cyprus and The Netherlands there is a “dialogue on conformity” which is seemingly “ongoing”.

That means not a single EU Member State to which the Directive applied has, in the view of the Commission, satisfactorily discharged what they think are the obligations imposed by the Directive. Food for a great deal of thought.

Does anyone know of any instances in other areas where everybody has either been proceeded against for non-compliance or become involved in a “dialogue on conformity”?

Posted in Child abuse images, Privacy, Regulation, Self-regulation, Uncategorized | 1 Comment

Facebook’s very poor example

I have lost count of the number of times child safety advocates from around the world have got in touch with me because, in relation to something they think is important and urgent locally, they cannot get any response from a big, usually American, company.

Typically these colleagues were from smaller countries – for which read smaller markets – where the company concerned did not have an office or staff. Happily I was often, not always, able to help out and make a connection because I knew senior people in the business either in Europe or on the other side of the pond.

It goes without saying it shouldn’t be like that. But here is a case that takes the proverbial biscuit. This news report was sent to me by a lawyer friend from South Africa.

Threat of gang rape

Two months ago on Instagram a teenager was threatened with gang rape. Anonymously. Obviously. Sounds like the threats were graphic and detailed. There was reason to believe one of the people behind them attended the same school as the targeted victim. The threat was close to home. It felt immediate.

The police were hopeless. It would likely take six months to get them to issue a subpoena to discover who the culprits were. Not good enough, but equally not an alibi for inaction on the part of Facebook. With the backing of a rich local philanthropist the victim’s parents instructed lawyers. Here’s where it gets really bizarre.

Facebook’s lawyers cannot accept service of a writ

Nobody disputed the teenager was entitled to receive the information she requested. However, Facebook’s lawyers in South Africa not only said they could not accept service of a writ on behalf of the company, they would not even provide an email address or other information to help the teenager progress her case in a different or more expeditious way.

In the end, thanks again to the philanthropist, the South African lawyer instructed a Californian lawyer to drive to Facebook’s HQ, and on 6th July the necessary papers were physically served under US Federal or Californian processes.

Here is what the South African based lawyer said

“…Facebook Inc has constructed an impenetrable fortress around it which makes it almost impossible for users of Facebook or Instagram to obtain basic subscriber information that would identify the perpetrators of crimes committed on these platforms”.


Posted in E-commerce, Facebook, Regulation, Self-regulation

The agony and the ecstasy – all in the same day

Regular readers will recall last year I got very engaged with discussions on the new e-Privacy Regulation being drawn up in Brussels. They were threatening to outlaw the use of PhotoDNA and similar tools by certain types of electronic communications providers. A network of children’s groups got busy lobbying Member States and we managed to get the matter put on ice until after the EU Elections. Which is where we are now.

Talks on the Regulation have restarted. “Robust discussions” are underway. However, imagine my surprise when, a few weeks ago, I discovered streaking towards us from an obscure, geeky corner in left field this thing called the “European Electronic Communications Code”, due to come into force on 22nd December 2020. That meant it would have applied in the UK. I therefore had an immediate parochial as well as a wider reason to be interested.

Here is what Recital 270 of the relevant Directive says (emphasis added)

“In the absence of relevant rules of Union law, content, applications and services are considered to be lawful or harmful in accordance with national substantive and procedural law. It is a task for the Member States, not for providers of electronic communications networks or services, to decide, in accordance with due process, whether content, applications or  services are lawful or  harmful.”

This would have done exactly what we had been trying to avoid in the discussions on the e-Privacy Regulation. In fact I am given to understand it would have done more and worse, because the list of qualifying communications to which it would apply was extended. And what do you make of the suggestion that providers of electronic communications networks or services cannot decide if something is harmful? Nah. It can’t mean that. Can it?

Because the Directive was due to come into force on 22nd December there was no way matters could be corrected by reverting back to or relying on the renewed e-Privacy Regulation discussions. They just could not be completed in time. It was looking grim.

By default PhotoDNA would have been killed off

In other words, by default on 22nd December, across 28 countries, the UK included, voluntary measures taken by companies since 2010 to protect children would become illegal. Businesses would no longer be allowed to detect, delete and report illegal child sexual abuse material. Companies already doing it would have to stop unless and until the legislature of the relevant jurisdiction expressly made it lawful. Companies that were thinking about doing it presumably would just drop the idea.

Up this morning ready for action

A little while ago a much smaller number of us went into action again. One of the things I was going to do today was write a blog finishing with a clarion call to step up the lobbying, asking for the matter to be raised in national Parliaments and so on. Then this morning a copy of the EU Commission’s new strategy for a more effective fight against child sexual abuse dropped into my inbox. These magic words caught my eye on pages 4 and 5.

“The Commission considers that it is essential to take immediate action… It will therefore propose a narrowly-targeted legislative solution with the sole objective of allowing current voluntary activities to continue. This solution would allow the time necessary for the adoption of a new longer-term legal framework.”

I danced a metaphorical jig. So we are not out of the woods but we have some breathing space. It was the agony and the ecstasy all in the same day.

More on the strategy BUT

I will write soon and more extensively about the EU’s new strategy. On a quick skim it looks brilliant. Makes me proud to be an Irish/EU Citizen, (should I change my name to Sean Gluaisteán or will my distinctive Yorkshire accent betray me?).

But here’s the obvious question: how did an idea like that get so far, so near to becoming law, to doing so much damage? Obviously it wasn’t done with the intention of putting children in peril. Some good soul likely stuck it in there for an excellent reason completely unaware of the unintended consequences.

Asleep at the wheel or just no wheel?

Where was the person with the requisite authority, either in Brussels or in a national capital, but probably Brussels, who could spot something like this and be able to step in? The case for the Commission having a very high level person in the machinery who understands online child protection and child welfare issues is made once again.

And how come no one in the children’s lobby picked up on it sooner? Including me. Actually I know the answer to that. No resources. No bandwidth. I only cottoned on because… well, never mind how I cottoned on. It was just too haphazard and cannot be relied upon.

We need to do much better.

Posted in Child abuse images, Default settings, Microsoft, Privacy, Regulation, Self-regulation | 1 Comment

Age verification marches on

This week saw the publication of a report of a conference which I organised with the help of The Reward Foundation. What did we look at?

  • With the enormous advances that have been made in recent years in the field of neuroscience, what new insights do we have about the impact of pornography on the development of the adolescent brain?
  • How is today’s pornography industry set up, and is the content it is producing anything like that of earlier years?
  • How do age verification technologies actually work? Can they preserve the privacy rights of both adults and children?
  • What actually happened in the UK that led to the implementation of its age verification regime for commercial porn sites being delayed?
  • More importantly, what wider learnings are there from the UK experience?
  • Which other countries are moving towards age verification for pornography sites?

Government officials, academics, activists, technologists, educators, researchers and regulators from 29 countries on five continents gathered in a Zoom to exchange experiences and ideas. And plan and plot for the future!

There is no doubt online age verification is an idea whose time has come. In a different guise it was even debated yesterday in the UK’s House of Lords.

I hope you find the report useful and if you want to know more please get in touch with me or The Reward Foundation.

Posted in Age verification, Pornography, Privacy, Regulation, Self-regulation

The Government lost in court today

The Government lost in court today. Yippee! The age verification companies won the right to proceed with a Judicial Review of the Secretary of State’s decision not to proceed with the implementation of age verification. What will the Government do now?

I cannot believe they will want to go through the messy business of disclosure. Why? Because they have nothing to disclose. The decision was taken on political grounds, with no sound basis in policy or research.



Posted in Age verification, Pornography, Regulation, Self-regulation

UK Government in court over age verification

The Government is in court tomorrow over its failure to implement age verification for pornography sites. Fingers crossed.

Posted in Age verification, Pornography, Privacy, Regulation, Self-regulation, Uncategorized