Online Harms White Paper – the obvious problem

Some time ago I read a discussion of strategies online businesses might adopt to try to avoid the clutches of potential or actual Regulators. Without embarrassment or apology it was suggested that companies deliberately bundle and entangle lots of things. Why? At least in part because this would make it more difficult for anyone outside the business to understand what was actually going on, or to identify the causes of a given problem that appeared to be worrying people.

This is the same kind of thinking that brought us confusion marketing.

Opacity is a great shield

I don’t remember if the same piece also recommended the additional, very effective tactic of simply not telling anybody anything you are not legally obliged to disclose. That is nevertheless commonplace in the high tech world, supplemented, where appropriate, by extensive use of non-disclosure agreements.

Thus, if outsiders know anything about what goes on inside a company it will generally either be because of a leak or because an ex-employee decides to spill the beans. Alternatively, it could be the result of a journalistic exposé. If it is none of those things, in all probability it will be linked to an internal PR decision the business took to permit a glimpse of a fragment of their operations. Court cases can also be illuminating.

Of course all this can be supplemented by research carried out by academics or by data released by the police or children’s groups, but such sources, and here we must also include leaks, confessions and exposés, are inevitably limited in nature and therefore they are often contested. “Contested” here is a euphemism for “dismissed and ignored.”

Über opacity: those obscure bodies performing public-facing functions

Alongside the tech companies there is an ecosystem of infrastructural bodies which, historically, were responsible for developing the technical standards that allow the internet to function. I am speaking about organizations such as the IETF, the W3C and ICANN.

The barriers to participation in them are enormous, revolving principally around cash, time and high levels of technical knowledge. The weakness of international institutions under those same headings – cash, time and high levels of technical knowledge – means there might be little effective external oversight or engagement. The IETF, W3C and ICANN serve their masters, and those masters are, by and large though not exclusively, different bits of the tech industries. They have a reason for being in the room and the reason is money.

I am less familiar with the W3C, but in the case of ICANN, law enforcement, most Governments and NGOs are constantly outgunned, all too often attending as under-resourced supplicants facing a battery of industry lawyers, accountants, lobbyists and pointy heads who regard them as intruders trying to mess up “their” precious world. I could name several large Governments from the prosperous North that simply don’t turn up to things because they don’t have the wherewithal in the right budget to pay the air fares and hotel bills. Civil society bodies, naturally, are in an even worse position.

Anything emanating from ICANN’s Governmental Advisory Committee’s Public Safety Working Group is automatically treated with antagonistic suspicion.

Meanwhile the IETF tries to maintain the absurd fiction that they are “only engineers” whose working groups have no role or remit to consider the public policy impact of any given project they might undertake.

The internet is therefore a peculiar mix of privately owned enterprises and a series of supporting bodies which, similarly, are private entities. Yet they are all nevertheless performing extremely important tasks. They sit at the heart of the way the modern world communicates and does business. All this is down to how the internet evolved, but that does not mean it has to stay that way forever.

Against such a background what are Governments to do?

All over the world people are making it clear they really don’t like the internet’s downsides. Governments are being pressed into action yet, for the reasons given, this is happening without them having a lot of information or insights that would help guide their political or legislative missiles.

Should Governments give in to the passive-aggressive, silver-tongued hostility of tech? Should they be brave and decide to stand as a buffer between their electorates and the tech companies, explaining how difficult everything is?

Or should they plunge in and try to do their best to meet their citizens’ legitimate concerns? In publishing its White Paper on Online Harms the UK Government has opted for the latter course of action. All I can say is “well done, right decision”.

Public policy-makers in the tech space remind me a bit of physicists in the early modern era. They knew some stuff but were unaware of a great deal we now take for granted. Nevertheless they pressed on.

Even so

I was prompted to write this blog because the other night I was with a group of very well-informed people who either work in tech, had recently worked in tech or were close observers of tech. The hot topic of conversation was the White Paper.

I heard complaints about the imperfections of policy making in British public life (who knew?), and complaints that much of what the UK Government was focusing on was linked only to the behaviour of a handful of businesses, the giants, and that because of this a lot of collateral damage was likely to be done to other, smaller or better-behaved enterprises.

Then we fell to discussing details of how particular processes of concern to children and parents might work better. Conversations like that will be happening all over the place but they will be doing so, as this one was, against a backdrop of not knowing many crucial details. To use a footballing analogy, we ended up dissecting the off-side rule without first having agreed on the size of the pitch, the number of players there should be on each team, how long the game should last and so on.

This is not an argument for tearing everything up and going back to the drawing board. We are where we are.  Carpe diem.

The role of the Regulator is going to be absolutely key

I am certain we must focus on ensuring the new Regulator that is to be established will have the right level of resources and the right combination of powers. In particular it must have the power to require the production of information about any given company’s operations. The information thus obtained must form the core of the evidence that will be used to draw up the detailed regulations and codes that follow. Without such granularity the risk is the Regulator will keep missing the mark.

Self-evidently any and all of its actions should be subject to judicial review with full transparency. The Regulator should publish assessments of companies’ performance.

Should the Regulator be a new body or be made part of Ofcom? Discuss. My gut feeling is so many of the tasks the Regulator will undertake are such a long way from Ofcom’s traditional territory that probably a new body is required.

The Government needs to step away asap

What is clear is this new Regulator must enjoy the confidence of the tech world, civil society, the media and law enforcement. I suppose it is inevitable the Government must appoint the person who will be its head but, harking back to my earlier point about the important role of the internet in the nation’s life, there is a strong case for this appointment being subject to approval by Parliament with all-Party agreement that whips will not be applied.

Moreover, while I can see the Secretary of State should have a power to direct the Regulator to look at a particular problem, I am uneasy about the idea that the Secretary of State has to agree any code of practice that might result from such a direction.

Perhaps here, again, Parliament should be required to approve whatever codes the Regulator might propose: either all of them or maybe only certain types, and these would definitely include any that had resulted from an initial direction by the Secretary of State.

Duty of care, platforms’ liability and terms and conditions

Thus, while I am saying that, in the absence of essential information, it will be difficult at this distance to say what individual codes of practice should contain, or to prescribe particular approaches to the specific problems that currently preoccupy us, there are three clear and overarching ideas in the White Paper which ought to be included on the face of whatever Bill makes its way to Parliament.

First among these is establishing a “duty of care”. This is not a new idea but it has not previously been made explicit that it applies in cyberspace every bit as much as it already does in the physical world.

Second, without necessarily imposing a general obligation to monitor all activity on a site or platform (although my view on that is weakening), businesses should be put under an explicit obligation to analyse every aspect of the services they provide so as to anticipate, mitigate or prevent potential problems that might arise, e.g. by deploying available technical tools as well as engaging in accessible educational and informational activity.

Third, and closely linked to the second, is the idea that for platforms to preserve their immunity from civil or criminal liability they must show they have taken reasonable and proportionate steps to enforce their own stated terms and conditions of service. Ts&Cs should not be merely marketing hype.

Harmful content

I used to think that unless particular content was on the face of it illegal there was no basis for Governments to expect internet businesses to remove it. That broadly remains my opinion, but it is clearly too narrow a view. However, it looks as if it is impossible to come up with adequate definitions of what content, though probably legal, nevertheless ought to be removed. Context can be everything. Nuance matters.

Again, what do you do? Throw your hands up and say you are powerless? It’s not even worth trying? No.

Maybe we should explore the idea of establishing within the Regulator an arm which is essentially judicial or quasi-judicial in nature. It should issue the guidelines on what types of legal content will be considered harmful, and in what circumstances. It could also adjudicate should there be material doubt or a dispute.

In making decisions I would expect such a body to have regard to the nature of the audience known to be using the service. If children are present in significant numbers one standard should apply, whereas if they aren’t the standard could be altogether different and more liberal. This should help minimise the risks to free speech. One in five of all internet users in the UK is a child. We simply cannot continue to pretend the internet is, could be or should be an adults-only medium.

Trying to be too many things to too many different interests

Much of the original, Utopian, wonderful idealism associated with the early days of the internet has been confounded by the harsh realities of the world. The internet as we know it is trying to be too many things to too many different people with too many diverse interests. It is not sustainable. Something very different lies ahead. It may not be an unqualified improvement on what we have now. It could lose some of its edginess and dynamism but it will gain in other areas. On balance, will it be a price worth paying? We shall see.


About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.