Make a smart career move – become a judge

Around the world we now have over 30 internet hotlines, organized under the umbrella of INHOPE. I have the honour to be a member of INHOPE’s Advisory Council and for seven years I was also a Policy Board Member or a Director of the UK’s hotline, the Internet Watch Foundation (IWF).

  •  The IWF example

This global network of hotlines does a fantastic job. They do not all work in the same way but, taking the IWF as an example, it receives reports about, or goes looking for, pictures and videos of child sexual abuse on the world wide web or in Usenet newsgroups. The IWF staff then take a view about whether or not the pictures or videos are likely to be found to be illegal by an English court. All of the jurisdictions in the UK have broadly similar rules.

  •  Professional view

If the staff decide the content is likely to be found to be illegal and it is hosted in the UK, the IWF issues a notice to the host. Strictly speaking this notice merely informs the host of the presence of the material and that the IWF thinks it is illegal. Technically it does not mandate or require that the material be taken down, but I know of no case where a host in the UK has not immediately removed the allegedly offending item. If the content is published on the web in a foreign jurisdiction, the authorities in that country are notified, including the hotline if there is one.

  • Gone in 60 minutes

Depending on the local arrangements, the in-country hotline can then follow a process similar to the IWF's. Where there is no hotline it is left entirely to the local police to contact the host to request the removal of the image. In the UK, once the host has been notified, the material might be deleted within one hour. Under the terms of the EU's eCommerce Directive, once notified, provided the host acts expeditiously to remove the material it will have no legal liability for it.

Outside the UK similarly rapid take-down times are not uncommon, but in some countries the length of time between notification and removal can be measured in weeks, months or even years.

  • Job done

Obviously, from the IWF's point of view, with UK-based content the task is completed once the police have been notified and the material has been removed. The police will normally initiate a criminal investigation and, wherever feasible, processes can be started to identify and rescue the child or children depicted in the images.

  •  The creation of a block list

Given the sometimes lengthy delays which occur in relation to the removal of content hosted overseas, the IWF decided to create a block list of URLs. This list is updated twice per day. Once it is determined that the image has been removed at source by the overseas host the address is deleted from the block list.

However, pending its removal, the fact that the address is on the list means that the number of people who can view that image from within the UK is hugely reduced. This helps make UK kids safer and also goes some way to reducing the risk of further abuse of the child or children in the images.

  •  Other technologies

The IWF block list only works with websites, but the web is still by far the most popular and easiest-to-use internet interface.

Only those with the right technical knowledge and determination can circumvent web blocking, although other technologies are available which could be used as an alternative means of accessing the same or similar images.

  • Blocking can work on a huge scale

In 2009 it was estimated that the UK system was preventing up to 58 million attempts per annum to reach addresses containing child abuse images. No doubt the great majority of these attempts were generated by web crawlers, botnets and other automated systems, but not all of them will have been. They were all illegal nonetheless.

Blocking works. We have to deal with the other technologies as well, but the web has to be part of the equation and that is where blocking offers to play its limited but important role.

  •  The formalities

The IWF is a little unusual in some respects. Technically it is an NGO, but it has been officially sanctioned and recognised to do its work by the UK authorities. This was finally given effect in a Memorandum of Understanding between the Crown Prosecution Service and the Association of Chief Police Officers. In that respect the IWF is very similar to many other UK agencies and organizations which are not immediately thought of as law enforcement or judicial bodies but nonetheless have powers to make decisions which have a public impact. Think about the British Board of Film Classification. Think about parking wardens.

  • A quasi-judicial body

It was established many years ago that when the IWF takes a decision about what to block and what not to block it is acting in a quasi-judicial capacity, a key legal fact which is underlined by the existence of the Memorandum of Understanding. Amongst other things this means the IWF is bound by the rules of natural justice. Since it was established in 1996-97 no decision it has taken has been challenged in court.

  • Very broad support and saves public money

They may not thank me for saying so but I have always thought of the IWF as being an auxiliary law enforcement agency. They could not do their work if the UK Government, Parliament, the police, the prosecution authorities and the public did not support them. Moreover they save public money because the majority of their income is provided by the internet industry.

  • Vested interests

However, some members of the militant wing of the National Union of Lawyers and Allied Operatives are unhappy with this arrangement. They say only a judge or a court should have the power to make the initial determination as to whether or not a particular image is illegal or is likely to be found to be illegal. They say only a judge or a court should have the power to decide whether or not a notice and take down procedure should be initiated or whether or not a URL should go on a blocking list.

  • Analogue thinking in a digital world

This is a classic example of trying to apply analogue thinking in a digital age. It simply will not work at the scale or at the speeds which are relevant to the internet. Alternatively we have to anticipate a very large growth in the number of judges that we need to employ. And that probably means more court houses, police stations, police officers, court attendants and the like. Not exactly likely in the “Age of Austerity”, but there you go.

  • Practical challenges 

In 2009 a man was arrested in Mexico and found to be in possession of 4,000,000 images. A world record, admittedly, but arrests of people with over a million images are not unusual.

In the Mexican case a very large proportion of the 4,000,000 images were almost certainly duplicates but, assuming you had no way of separating them out in advance, if you took one second to look at every image, working a seven-hour day, it would take over 150 days just to look at each picture. Let's leave aside for the moment what it would do to your mental health. And that is for a single arrest of one guy in one country.
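The "over 150 days" figure above is easy to verify. A quick sanity check, using only the numbers quoted in the text (4,000,000 images, one second per image, a seven-hour working day):

```python
# Back-of-the-envelope check of the review-time estimate quoted in the text.
# All figures come from the example itself; nothing else is assumed.

IMAGES = 4_000_000          # images seized in the Mexican case
SECONDS_PER_IMAGE = 1       # one second spent looking at each image
HOURS_PER_DAY = 7           # a seven-hour working day

total_seconds = IMAGES * SECONDS_PER_IMAGE
seconds_per_day = HOURS_PER_DAY * 60 * 60   # 25,200 seconds
working_days = total_seconds / seconds_per_day

print(f"Working days needed: {working_days:.0f}")  # roughly 159 days
```

So a single reviewer would need around 159 working days, comfortably "over 150", for one arrest alone.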

In practice, for the purposes of a prosecution in cases such as the Mexican one, a representative sample of images would probably be produced as the basis of the case. But for the purpose of inhibiting the further distribution of the images, every one of them is important and, if they are published online, every address matters.

  • Need new approaches

We have to find fresh approaches to dealing with the new volumes of certain kinds of online criminal behaviour. Hotlines and blocking lists are just one example of a smart way of responding to the challenge.

  • Rule of law is central

But, of course, as with all things in the public sphere, the operation of hotlines must be subject to the rule of law. People in the free speech and civil rights community worry that blocking lists could be misused for other purposes. Shouting at them and telling them they are wrong won't work.

  • Political facts of life

Even though the images are not legally protected in any way and therefore no free speech or free expression issues should arise, the whole topic is seen as close to, or has been deliberately conflated with, arguments of that kind. We have to accept that as a political fact of life. We therefore have to ensure and demonstrate that checks and balances are in place. We should have a Simon Davies, a Joe McNamee, a Yaman Akdeniz or their clones on every scrutiny panel in every jurisdiction.

But what we cannot do is nothing. The status quo is not working.

  • The rights of the children in the images

There can be no question of the child consenting to the rape or other sexual abuse depicted in the image. There can be no question of the child consenting to the image being published. Therefore by definition these images are themselves an egregious breach, inter alia, of one of the child’s fundamental civil rights. The right to privacy. How do we weigh that in the balance?

  • Who will speak for the children? 

If these were pictures of famous footballers or Hollywood film stars engaged in activities they did not want the world to see they could be left to look after themselves. They would rush to their expensive attorneys and start spraying writs and subpoenas about the place. Abused children cannot do that. They need a champion and they are looking to the European Parliament to be their champion.

What argument is there, which is centred on the rights of the child, the rights of the abused person, which leads anyone to conclude that civilized society can tolerate these pictures staying on view a moment longer?

  • Must not punish the victims for the past failures of others

I agree it is regrettable that the EU has not exerted itself more to get States outside the EU to speed up their processes for getting images deleted at source, but is it right to punish the children for that historic failure?

Waiting for international negotiations and treaties to persuade some countries to up their game is an alibi for inaction. The ISPs will be relieved they do not have to spend any money adjusting their systems. It’s a way of saying “Let someone else solve this problem.” That is morally indefensible. We have to deal with things as they are now, not how we dream they might be one day. Blocking is not a complete solution but it offers to act in the here and now to make things a bit better for society’s most powerless victims. Anyone who had a heart would see that.

About John Carr

John Carr is one of the world's leading authorities on children's and young people's use of digital technologies. He is Senior Technical Adviser to Bangkok-based global NGO ECPAT International and is Secretary of the UK's Children's Charities' Coalition on Internet Safety. John is now or has formerly been an Adviser to the Council of Europe, the UN (ITU), the EU and UNICEF. John has advised many of the world's largest technology companies on online child safety. John's skill as a writer has also been widely recognised.
