Here in the UK it has been a busy week and a good week for child safety on the internet. Later today the Prime Minister is holding an industry summit in Downing Street to review progress on the calls he made in his landmark speech of 22nd July 2013. We have already had a glimpse of what’s in store.
On Saturday we saw announcements reconfirming the intention of the UK’s biggest ISPs to require every domestic account holder to decide how or indeed whether they want to use filters to restrict access to online adult content in their home.
The filters will be provided to customers at no extra cost. They will work at the level of the household’s WiFi router or higher up on the network itself. This means every device that connects to the internet through the common access point will be governed by these choices. For families with maybe twenty or more different smartphones, tablets, games consoles, laptops or what have you this is going to simplify things hugely.
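The announcements do not specify the filtering mechanism, but network-level filters of this kind are commonly implemented by checking each requested domain against a category list and the household’s chosen settings. A minimal sketch, with invented domain names and categories:

```python
# Hypothetical sketch of household-level filtering at the network.
# The ISPs have not published their mechanism; the domains, categories
# and policy below are invented purely for illustration.

BLOCKED_CATEGORIES = {"adult", "gambling"}  # the account holder's chosen settings

CATEGORY_LIST = {
    "example-adult-site.test": "adult",
    "example-news-site.test": "news",
}

def resolve_allowed(domain: str) -> bool:
    """Return True if the household policy permits access to this domain."""
    category = CATEGORY_LIST.get(domain)  # unlisted domains have no category
    return category not in BLOCKED_CATEGORIES

print(resolve_allowed("example-adult-site.test"))  # False
print(resolve_allowed("example-news-site.test"))   # True
```

Because the check happens upstream of the router, every phone, tablet or console on the household connection sees the same result.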
For new customers these arrangements will be in place by the end of the year. Existing customers will be put in a similar position by the end of next year. It’s not practical to move everyone over at once.
On by default
During the sign-up process, if a new customer simply clicks yes, next, yes, next, yes, the filters will be turned on by default. They will cover a broad range of categories, not just pornography. In theory the account holder – who will have been verified as being over 18 – will make the decisions about whether or not to use the proffered filters. At the end of the set-up, or following any subsequent changes, an email will be despatched to that person’s given address summarising the choices or alterations that were made. The point to note, though, is once the decision on filters has been taken it will apply in the same way to everyone. 5-year-old Jenny gets the same access as 17-year-old Johnnie and Granddad.
Major public awareness campaign – reaching the “unlulled”
A £25 million public awareness campaign funded by the ISPs aims to make sure everyone understands what the filters will do as well as what their limitations are.
Nobody wants parents to be lulled into a false sense of security about the effectiveness of filtering. However, we have been living with a generation of “unlulled” parents pretty much since the internet began and it hasn’t worked out too well in many families. ISPs are now going to have a go at it the other way around. We call this innovation. It’s one of the things for which the internet is said to be famous.
Historically the choice was presented rather starkly: do you try to reach out to explain how filters can help kids stay safer before or after they have been allowed to venture into cyberspace? In my book they are not alternatives. You do both but absent a positive contra-indication you apply the filters at the kick off. Parents should not have to jump through hoops to make the internet safer for their kids. Any hoop-jumping should go in the opposite direction.
Mobile phones have been doing this since 2005
The mobile phone networks have been doing something similar since 2005. The key difference is that whereas the ISPs’ offering assumes it is the age-verified adult account holder making the decisions about filters in the home, the mobile networks require real-time proof of age at the exact point where the decision is taken. Until that proof is received the filters are applied to every phone, or rather to every SIM card, because it is the phone number that constitutes the account.
Of course a household account used by many is not the same as a mobile phone number used by one person but the parallels are there. In the long run I think all ISPs in the home market will end up closer to the position which currently exists with mobile phones.
At work and at school we are all used to having our own log ins with appropriate access rights linked to our accounts. The proliferation of individually owned portable devices plays into the same space. Eventually I think we will all have unique age-verified log ins that we will carry with us across platforms and devices. Every account will therefore be configured in an age appropriate way according to the needs or interests of each family member.
Having said that there is no question that what the ISPs are proposing to do now is an important step forward. We’ll see how it works out. Maybe we won’t need to go in the direction I am suggesting is inevitable. I’d be delighted to be proved wrong.
Public WiFi providers are joining in
Britain’s largest WiFi providers are also joining in on the drive for a safer and better internet for children. Family Friendly WiFi is coming down the tracks. In many places it is already here.
At the time of writing the detail of the final package on WiFi is not completely tied down so check against delivery with what the Prime Minister actually says on the subject. The overall commitment, however, is now clear and irrevocable.
In public spaces where children and young people are likely to be found on a regular basis, URLs containing child abuse images (drawn from the IWF list) and legal porn will not be accessible. Other categories of adult content will be blocked by some WiFi providers but it is not clear if all of them will do that. I expect in time a consistent and common standard will emerge, although there is no doubt that restricting access to porn is number one on most parents’ lists.
A new logo will promote awareness of Family Friendly WiFi facilities in retail outlets, coffee bars, on buses and trains, in parks and so on. I imagine major retail chains, municipalities and other big brands will be quick to ensure they are offering a Family Friendly service. The value of a distinctive logo is that it will encourage smaller enterprises to join in.
A very large experiment
The internet is still relatively new. One way of thinking about the collection of measures outlined above is as one very large experiment. Through it we should all learn a lot, for example about how effective filters are as aids to good parenting in the digital age. The world will be watching. Until now the only alternative model people could point to for anything like this came out of Australia, but there the politicians messed it up so comprehensively that it should stand as a textbook example of how not to do it.
Now the really radical stuff
Some of the most dramatic new steps to make the internet safer and better for children are being taken by Google and Bing.
The core point is that Google and Bing are adjusting their search engines to make it harder for paedophiles or individuals looking for child abuse images to do their worst.
Blocking retained and extended
Google and Bing already block access to URLs known to contain child abuse images. The URLs are drawn from the IWF’s and similar lists. This will continue.
In relation to detecting actual images, as opposed to URLs, Microsoft’s image fingerprinting technology, PhotoDNA, has held centre stage for many years. Microsoft gives it away free, so really any internet-based business providing free or paid-for online storage facilities should deploy it or something similar. Those that don’t are, in effect, saying they don’t care what people do on their systems or put on their machines. They are turning a blind eye.
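The workflow can be sketched in outline: a service fingerprints each stored or uploaded file and checks the fingerprint against a database of known illegal images. Note that PhotoDNA itself uses a robust perceptual hash that survives resizing and re-encoding; the plain cryptographic hash below, which only catches exact byte-for-byte copies, merely stands in for it to illustrate the matching step:

```python
import hashlib

# Simplified sketch of matching uploads against a database of known-image
# fingerprints. PhotoDNA uses a *perceptual* hash robust to alterations;
# the cryptographic hash here is a stand-in that only catches exact copies.
# All data below is invented for illustration.

KNOWN_FINGERPRINTS = {
    # In practice these would be supplied by bodies such as the IWF.
    hashlib.sha256(b"known-bad-example-bytes").hexdigest(),
}

def is_known_image(file_bytes: bytes) -> bool:
    """Fingerprint the file and check it against the known database."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_FINGERPRINTS

print(is_known_image(b"known-bad-example-bytes"))  # True
print(is_known_image(b"harmless holiday photo"))   # False
```

The value of the perceptual approach is precisely that a cropped or re-compressed copy still produces a matching fingerprint, which a plain hash cannot do.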
A video version of PhotoDNA
Google has also announced a new type of fingerprinting technology – essentially a video version of PhotoDNA – so that known child abuse videos can be detected and removed at scale. The new programme is currently being tested on YouTube but will soon be made available to the entire industry. As paedophiles increasingly move from still images to video this is a most welcome development.
Action on Peer2Peer
Progress has also been made around Peer2Peer networks. A pilot project has been announced which will allow for the blocking of torrent URLs that initiate the sharing of illegal child abuse images. Google and Bing will be working a lot more closely with the police, helping to construct new and larger databases of known illegal images, making them easier to detect and remove. Progress in this area is essential, as it is without doubt in the realm of Peer2Peer networks that the largest growth in the distribution of child abuse images has taken place.
Paedophilic searching gets harder
If a person types in a known paedophile term, or something that suggests they are looking for child abuse images, a clear warning message will now be displayed from Google, Bing and child safety organisations telling the searcher they may be on the edge of breaking the law and that this could have severe consequences for them. It will also point towards sources of advice and help if they seriously want to break with their criminal and abusive behaviour.
These messages may not deflect determined or already committed child sex offenders but there is little doubt they will cause some to stop and think. If that means fewer children are abused it has to be worth giving it a shot.
Bing and Google will be collaborating with the police and other agencies to stay as up to date as possible on how the paedophile community and image collectors are trying to use the internet for their evil ends, so as to make sure they can’t.
While it is obvious why one might want to block access to known illegal content or sources known to supply illegal images, when it comes to words describing illegal or unsavoury acts it is a different matter. Very few countries have laws which make words illegal, however revoltingly they are used.
However, the search engines are now relegating the possibly legal but unsavoury content, that was previously being returned on some child abuse related queries, to the cyber equivalent of Siberia and replacing it with positive content.
To give a hypothetical example, if someone searches for information about child rape they will be helped to find articles in the academic press or to locate sources of help for victims. The stuff paedophiles have been publishing will be on the outer edges of findability. In this way neither search engine can be accused of censoring the internet or of refusing to provide access to otherwise legal if distasteful material, but they have gone a long way in that direction. Again we will see how this works out in practice.
Pulling out the stops
Google and Microsoft have really pulled out all the stops. This is a very impressive initiative. It is narrowing the spaces in which paedophiles and collectors of child abuse images can operate. The challenge now is to work out how to gauge the effectiveness of, and learn from, the various measures being put in place, but I am sure where there is a will we can find a way.
The beginning of a new phase
This is unlikely to be the end of the story but there is no question we have reached an important milestone. There are still anxieties about the Darknet and the true extent to which it and the use of encryption may continue to frustrate law enforcement’s and everybody’s efforts. But we don’t always have to do the really hard stuff first. Dealing with the more open parts of the internet has to be a key priority precisely because it is so accessible and therefore has the greater potential to draw in new offenders.
All the companies involved deserve a great deal of credit for their willingness to sail into these uncharted waters, as do Claire Perry MP and the Government as a whole for sticking with the issue in the way that they have. I await with interest to see how what is happening in the UK rolls out to, or impacts, other jurisdictions.