Legal threat to the UK’s Information Commissioner

This should not have been necessary, but I’m afraid it is.

We all know about the damage exposure to pornography does to children. There are no serious voices being raised against that proposition, particularly in view of the types of pornography which have become standard fare on the internet of late. Playboy centrefolds they are not.

Earlier this week we learned of, and saw documented, another manifestation of this damage. Ofsted reported on the large-scale sexual harassment of girls taking place in schools and, commenting on it, the Chief Inspector was very clear when she said the Government needs to “look at… the ease with which children can access pornography”. In modern parlance that means porn on the internet. The Chief Inspector added that sexual harassment, including online sexual abuse, had become “normalised” for children and young people. How did it ever come to this?

To borrow a phrase from the Venerable Gail Dines, girls in school have become victims of the “pornification” of our culture. The internet has played a decisive part, not the only part, but a decisive one, in helping create these lamentable conditions.

We all know children, often very young children, are accessing porn sites in gigantic numbers, either driven by natural curiosity or by accident. The porn sites know this. But they continue to receive and process children’s data with the entirely predictable result that this helps draw children back to them time and again. You cannot separate the fact of unlawful data processing from its consequences. This is not a theological or wholly abstract offence.

The porn sites are fully aware

Despite being fully aware of this, the porn sites take zero meaningful steps to prevent children gaining access. This was why, in June 2020, I wrote to the UK’s Information Commissioner asking her to call them to account. The Commissioner declined, giving what was, in my view, a political answer, not a legal one.

Maybe I should have pursued it at the time but the fearless souls at CEASE have taken up the cudgels. Please see their letter threatening legal action and their plea for financial support. I hope you can help.

The CEASE letter quotes my correspondence and the Commissioner’s reply. Here is my original and the reply. And my reply to the reply.

There is something not quite right about a country (us), rightfully proud of being the first in the world to adopt an “Age Appropriate Design Code”, putting it under the aegis of the Information Commissioner, only then to be told it was not intended to help with stuff that is unarguably age inappropriate.

Posted in Age verification, Internet governance, Pornography, Privacy, Regulation, Self-regulation

Warriors win. Children win

Three days ago, on Wednesday 9th June, the Canadian Centre for Child Protection (C3P) published a brilliant report containing details of their work over the two-year period 2018-2020. I blogged about it the next day.

In the two years covered in their report we learned C3P had seen and verified 5.4 million child sex abuse images and, in respect of them, issued take down notices to over 760 Electronic Service Providers in all parts of the world. 97% of all the images were hosted on the clear web, not hidden away anywhere, and 48% of the notices issued related to images which had already been flagged at least once to the hosting company concerned.

We also learned, inter alia, the Canadians identified a single telecoms company, based in France, responsible for fully 48% of all the child sex abuse material referenced. They named the company. It is called “Free”.

On Thursday 10th June Forbes published an article based on the Canadian report. In the Forbes story they named French billionaire Xavier Niel and published his picture while informing us he owned 70% of Free’s parent group, a company called Iliad.

Free had hosted 1.1 million csam files in 2018-2020. These had triggered 2.7 million notices. The most likely root cause of the problem was Free’s policy of allowing anonymous users. Obviously there was no suggestion Niel personally or senior executives were aware of any of this, but that hardly constitutes a defence when you think of the pain and misery their lack of diligence had caused, and continued to cause, over many years.

Yesterday, 11th June I received an email from Wonder Woman, whose real world identity is CEO of C3P. Not many people know that so keep it to yourselves.

In the email I was informed:

“As of yesterday, Free’s file-hosting service no longer allows anonymous users to upload content — only Free account holders have access to the service now. We believe this has effectively eliminated this service as a means of online CSAM distribution. In addition to this, all 6,500 archive files, containing more than 2 million images and over 35,000 videos, that were still live prior to the release of the report have been deleted from their file-hosting service.”

What can I say?

Could someone remind me why it is important to “play nice” and refrain from naming and shaming? Yet again we see truth is our most important weapon and truth loses its potency, power and purpose if it is kept hidden away.

This is not just a feather in the cap of C3P. It’s a whole Golden Eagle. And well done Forbes for riding shotgun. Children in general and survivors in particular are forever in the debt of both.

Posted in Child abuse images, Internet governance, Regulation, Self-regulation

Canada puts another nail in the coffin of self-regulation

The Canadian Centre for Child Protection (C3P) is justly famous for many things. One of them is the quality of their research. It is based on two vital, interdependent pillars.

First is a deep understanding of the position and needs of survivors of child sex abuse, particularly those who have had the additional misfortune of appearing in a picture or video which depicts their abuse where the picture or video has also been circulated online.

Second is C3P’s exceptionally strong grounding in the technologies used to make, publish and distribute child sex abuse material (csam) over all parts of the internet.

More evidence of C3P’s top class work became available yesterday when they published their long-awaited report: “Project Arachnid: Online availability of child sex abuse material”. Its nearly 60 pages do not make easy reading (there is an Executive Summary as well) but it is essential reading for anyone engaged in trying to rid the world of the scourge of csam.

The period covered is 2018-2020. Not a vast span, but that only serves to underline the scale of what we are facing. And when you look at the report, see what C3P say about their backlog. Scary stuff.

Enormous numbers

In the two years under review C3P examined and verified 5.4 million images and issued take down notices to over 760 electronic service providers (ESPs) in all parts of the world.

What are you going to do, Monsieur le Président?

Astonishingly, C3P found that fully 48% of all the material identified was linked to a single, French telecommunications company. The G7 is starting in the UK today. President Macron will be there. I wonder if any journalists will tackle him on this and, if so, what will he say? I expect he will be absolutely horrified because there is no doubt his Administration has been making several moves in the right direction and we expect to see even more.

A dark web problem? Emphatically not

You see all kinds of people rolling their eyes and talking about the dark web, encryption and a variety of subterranean exotica, as if they were therefore already resigned to being powerless to do anything about csam. But the unvarnished truth is 97% of the csam detected by C3P was on the clear web. So far from being intractable, online csam is highly tractable. What has been missing is the will on the part of far too many ESPs.

And a massive number of repeats

Perhaps even more shockingly, 48% of all the take down notices issued by C3P related to images which they had already flagged as illegal to the same provider. That is truly shameful because the technology exists, and is widely available, which would allow any ESP to detect already known images and prevent them ever seeing the light of day again, at least on a publicly accessible web page. Table 7 of the report (p 38) shows “recidivism” rates going up, not down. And clock Table 7.3 for the names of the companies involved.

Why don’t more companies use the technology that would allow them to detect csam in milliseconds? Because they don’t have to. No law requires it and this, maybe more than anything else, reminds us why self-regulation – voluntarism – has had its day.

Too slow, too slow

C3P says that, following the issue of a take down notice, the median removal time for the item concerned is less than 24 hours. Bravo! But in 10% of cases it took more than seven weeks for the illegal material to go.

That is utterly unacceptable and, again, is a product of voluntarism. And by the way, it seems the delays are longest where the images concerned involve older adolescents. This conjures up several unpleasant thoughts about non-expert sysadmins second-guessing child protection experts while leaving a child’s sexually abusive image on view until they conclude their own internal interrogation. Not on.

Change is gonna come

From page 48 onwards, C3P’s recommendations will be instantly recognisable to children’s advocates in the UK, the EU and many other parts of the world.

Big Tech obstructionism created space for too many bad guys to flourish

Any and every major thinker in the internet space has known this moment would arrive – the end of voluntarism – but too many industry leaders were determined to drag things out as long as possible to keep the mountains of cash rolling in. It was this obstructionism by Big Tech and their deliberate delaying tactics which created the space for myriad unsavoury characters to hide in the shadows. Until now. Thank you C3P. Keep it up.

Posted in Child abuse images, Internet governance, Regulation, Self-regulation

Facebook must change its mind

Over 90% of the reports shown in the National Center for Missing and Exploited Children’s graph below came from Facebook-owned properties, principally Facebook Messenger and Instagram Direct. Yet the company intends to avert its eyes from these Apps. It will do so by introducing strong encryption.

Facebook does not claim the act of closing its eyes will have any impact on the level of illegal activity. They acknowledge as much, although, if anything, one might expect the level of illegal behaviour to increase as more people realise they can never be caught.

To excuse or justify their intentions Facebook are saying two main things.

One is that a lot of the publishing of the images which are then reported to NCMEC is not being done by “bad people”. Rather it is being done by people who are expressing their outrage or disgust, or they think it is some kind of joke.

Hmm. These are not defences known to the law, and for the victims it makes zero difference. Whether or not people should be arrested and prosecuted for a lapse in good taste, or for being outraged or disgusted, is a separate question. What never changes is the urgency of getting the images removed and stopping them from being circulated further.

Facebook will not be able to do that if it blinds itself.

The second is that Facebook says it will take other steps to identify miscreants and intervene to kick them off the platform, e.g. through analysing metadata and patterns of behaviour. Shouldn’t they be doing that anyway? It is not an alternative because it does not get around the fundamental point. Whereas now Facebook can see things, in future….

There may also have been a riff in a minor key about the number of duplicate reports, somehow implying the problem isn’t as large as it might otherwise first seem. Wrong. But anyway see the numbers in the graph.

Through their own appalling behaviour Facebook substantially created what they say is people’s new desire for greater privacy. The company has come up with the wrong answer. It is an answer which inevitably will harm children.

They need to find a better one. If they don’t I am afraid I for one will leave Facebook and all its Apps. It will be inconvenient, at least for a while, but I doubt I will be alone and I think it will become very difficult for children’s advocates or groups to continue thinking of or describing Facebook as a “partner”.

As the Victorians used to say, if it goes ahead Facebook will be “putting itself outside the boundaries of polite society” and no amount of largesse, grants or free flights to, and high-class hotels in, exotic locations, no PR front, will be able to cover that up.

I can see Facebook feel unjustly treated. They are uniquely exposed because of their size and because they have been transparent in the past whereas other platforms have escaped similar scrutiny and criticism because they are smaller or because they already deploy strong encryption so no reports or fewer reports reach NCMEC linked to their name. I get that. But it does not alter the basic facts.

Finally, let’s be clear: this decision will be taken by a single person. Mark Zuckerberg. I very much hope he makes the right one.

Posted in Apple, Child abuse images, Facebook, Regulation, Self-regulation

Government in a muddle over porn

On the morning of 11th May the Queen’s Speech was delivered and published. In the afternoon, Caroline Dinenage MP appeared before the Communications and Digital Committee of the House of Lords. Ms Dinenage is the Minister of State responsible for what has now been renamed the “Online Safety Bill”. In response to a question from Lord Lipsey, she said the following (scroll to 15.26.50):

“(the Bill) will protect children by not only capturing the most visited pornography sites but also pornography on social media sites”.

That is simply not true.

As currently drafted, the Online Safety Bill applies only to sites or services which allow user interactivity, that is to say sites or services allowing interactions between users or allowing users to upload content. These are what are commonly understood to be social media sites or services. However, some of the “most visited pornography sites” already do not allow user interactivity, and those that do could easily escape the clutches of legislation written that way simply by disallowing it in the future. That would not affect their core business model in any significant way, if at all.

You could almost hear the champagne corks popping in Pornhub’s offices in Canada.

Now scroll forward to around 12.29.40, where the Minister also says:

“(according to research published by the BBFC in 2020) only 7% of children who accessed pornography did so through dedicated porn sites… even children intentionally seeking out pornography did so predominantly through social media”.

This too is simply untrue, as this table shows.

The above is taken from research conducted for the BBFC by Revealing Reality (and note what it says in the body of the report about children seeing porn online before they had reached the age of 11). Bear in mind the table shows the three key routes to children’s pornography access. They are not exhaustive, nor mutually exclusive of one another. A child could have seen porn on or via a search engine, a social media site and a dedicated porn site. Or they may have seen porn on social media once, but be visiting Pornhub every day.

Other research published the week before the Queen’s Speech looked at the position of 16 and 17 year olds. It found that while 63% said they came across porn on social media, 43% said they had also visited porn web sites.

Part 3 of the Digital Economy Act 2017 principally addressed the “most visited pornography sites.” These are the commercial ones, the likes of Pornhub. In explaining why the Government did not implement Part 3 and now intended to repeal it, I was astonished to hear the Minister say it was down to Part 3 falling victim to the “speed of technological change” as it had not included social media sites.

Does the Minister truly believe the issue of porn on social media sites has only cropped up as a serious matter in the past four years or so? I’m almost tempted to say “if so, I give up”.

When the Digital Economy Bill was going through Parliament, the children’s groups and others lobbied for social media sites to be included, but the Government flatly refused to countenance it. I will not mention that, at the time Part 3 received Royal Assent, Boris Johnson was a Cabinet Minister in the Conservative Government of the day. Nor will I allude to what I believe are the real reasons why the Tories did not want to proceed with any form of restriction on online porn before the Brexit General Election was out of the way.

Secretary of State and Julie Elliott to the rescue

Two days after the Minister of State appeared in the Lords, the DCMS Select Committee of the House of Commons met with Secretary of State Oliver Dowden MP. In her contribution (scroll forward to 15:14.10) Julie Elliott MP got straight to the point and asked Mr Dowden to explain why the Government had chosen to exclude commercial pornography sites from the scope of the Bill.

The Secretary of State said he believed the biggest risk of children “stumbling” over pornography was via social media sites (see above) but whether or not that is true “stumbling” is not the only thing that matters here, particularly for very young children.

He also said he “believed” the “preponderance” of commercial pornography sites do have user-generated content on them and would therefore be in scope. I have never seen any evidence to support that proposition, but see above. A few mouse clicks by the site’s owner could remove interactive elements. Revenues are likely to remain substantially unaffected and, in one bound, the porn merchants would free themselves of the cost and trouble of having to introduce age verification as the only meaningful way of restricting children’s access.

How could this happen?

Were the Minister of State and the Secretary of State poorly briefed, or did they just not grasp the briefs they were given? Whatever the explanation, it is a remarkable state of affairs given how much attention this subject has received in the media and in Parliament over several years.

But the good news was Dowden said if a “commensurate” way could be found to include the kind of sites that were previously covered by Part 3, then he was open to accepting it. He reminded us that such a way might emerge from the joint scrutiny process which will shortly begin.

I am reaching for my commensurate pencil. I keep it in a special drawer.

Bravo Julie Elliott for getting the kind of clarity we all need.

Posted in Age verification, Facebook, Google, Internet governance, Pornography, Privacy, Regulation, Self-regulation

Move slow and leave things broken

On 20th December, 2020, a new EU-wide law took effect. An unintended consequence appeared to call into question the legality of companies continuing to look for child sexual abuse material being exchanged over messaging platforms. It also appeared to call into question the legality of companies identifying images which were likely to be child sexual abuse material, or behaviour suggesting a child might be being groomed for a sexual purpose. Companies had been doing some or all of this on a voluntary basis since 2009, so it was a bit of a bombshell.

Over 90% of all the reports pertaining to these kinds of threats to children that were made to the authorities in the USA originated either on Facebook Messenger or Instagram Direct but the child victims involved came from every corner of the globe. Messenger and Instagram have enormous reach.

Facebook, the company that owns both platforms, has one of the worst reputations among Big Tech for law breaking, or for contesting the law if it does not suit their business model. But not here. At the stroke of midnight on 19th December, 2020, everything ground to a halt. Eventually this led to a 58% drop in reports coming from EU Member States. There was no suggestion any fewer children were being abused or exploited. The only difference now was the police had no possibility of intervening because the reports had dried up.

Other companies that had been in exactly the same boat did not stop looking. These included Google, Microsoft and Snapchat. Not exactly online minnows. Their level of reporting remained unaltered. As Sophie in’t Veld MEP pointed out during a debate in the European Parliament, after 20th December there was never any real legal risk attaching to any company carrying on protecting children as they had done before.

On 29th April the EU announced an “interim derogation” (temporary suspension to you and me) of the potentially troubling law. The expressly stated purpose of the suspension was to restore what everyone had understood to be the status quo ante.

Nearly two weeks on, I enquired whether Facebook had shown the same eager speed and determination to restart the measures they had abandoned so punctiliously.

The answer is not only “no they have not”; it is also the case that, as of now, they have no definite date by which they will. Why?

Several reasons were given to me. None of them are credible. First, they have not yet seen the final, official text. That needs to be endorsed and published in the Official Journal of the EU. However, like the rest of us, they have seen the text that appeared on the EU’s official web site.

Then, seemingly, they need to study the official text to ensure they are meeting its terms.

After that the company’s engineers have to be engaged because, while turning it off could be done by flipping a switch, apparently turning it back on cannot.

Facebook’s unconvincing, ostensible turn to lawfulness, its abundance of caution, comes at a price. Children are paying it.

“Move slow and leave things broken”. How does that sound as a new company motto?

Have they really no shame?

Posted in Child abuse images, Facebook, Google, Internet governance, Regulation, Self-regulation

Great news for children coming out of Brussels

Last night in Brussels it was announced that a political agreement had been reached on the interim derogation. In plain English what that means is, pretty much immediately, we can go back to the position we all thought we were in on 19th December 2020 (the day before the new, bad e-Privacy law kicked in).

The suspension is operative for up to three years, during which time we will all need to roll up our sleeves to formulate a longer-term framework. Watch this space.

Thus, the EU has paved the way to allow companies to recommence scanning messaging platforms for child sexual abuse material, grooming behaviour and the use of “classifiers” to detect images not yet determined to be child sexual abuse material but likely to be.

This is a great outcome. A huge pat on the back is due to everyone who had a hand in it. It reflects the enormous amount of work done by many MEPs, Commissioners, Commission staff, children’s groups and child advocacy organizations across the world.

If I have drawn one major conclusion from this whole unfortunate episode it is this: children’s groups and children’s advocates need to engage more closely with privacy lawyers in particular and privacy activists in general.

I share many of the privacy community’s concerns and worries – I think we all do – but a handful of ideologically motivated individuals with a talent for catching media headlines showed they are not above resorting to outright lies and misinformation to achieve their desired end. In a world bedevilled with often quite intimidating legal and technical language, in a world of zero trust in Silicon Valley and declining trust in Governments, many people, too many people, fell for the scaremongering propaganda.

We cannot let that happen again. We need to find new and better ways to improve public and media understanding of the issues, because from that will flow a more grounded and sustainable understanding by policy makers. Watch this space.

I won’t repeat everything in the Commission’s press release but here’s a first, quick look at the detail of last night’s announcement:

  1. The definition of what constitutes qualifying child sexual abuse material or activities is explicitly aligned with the 2011 Directive.
  2. Companies and organizations need to have an appeals mechanism to cater for potentially erroneous decisions. It would be strange if they didn’t already but hey.
  3. There needs to be “human oversight” of the processing of personal data. Potentially problematic given the scale on which the systems operate on larger platforms but it depends how one defines “human oversight”. Expressly there is no requirement for prior authorization before reports can be made or illegal content is taken down.
  4. The tech used needs to be the least privacy-intrusive. That should already be the case.
  5. Companies and organizations need to consult data protection authorities on how they work in this area and the European Data Protection Board will issue guidelines to assist the data protection authorities. Fine, as long as these guys stop thinking along tram lines and learn how to speak in a language the majority of us can understand. They should embrace a mission to explain and heighten public understanding of privacy, and not allow themselves to be manipulated into a position where they become identified in the public mind as enemies of common sense who provide shelter to criminals who harm children (and others).
  6. A public register will be established of public interest organizations with which online service providers can share personal data. Sounds OK.  Presumably it will be aligned with the GDPR and changes in Europol’s mandate.
  7. Annual transparency and accountability reports will be required. Hugely important but it cannot be left to companies to mark their own homework. Proportionality will be important here, as it is everywhere else, but so is the idea that everyone can have confidence that the transparency and accountability reports speak the relevant truth and nothing but the relevant truth. I am too polite to repeat the story about how you grow mushrooms.
Posted in Child abuse images, Default settings, E-commerce, Facebook, Google, ICANN, Internet governance, Privacy, Regulation, Self-regulation

What can aeroplanes teach us?

The other day I was talking to the CEO of a tech company, expressing my frustration at the way scaremongering misinformation seems to have taken hold in relation to the way various child protection tools operate online.

We are talking about three types of tools:

The first is PhotoDNA and similar tools, which detect known examples of child sex abuse material (csam). Every image in this category is, by definition, illegal and represents an egregious infringement of the right to privacy and human dignity of the child depicted.

The second are so-called classifiers. These flag images which are likely to be csam.

The third type addresses grooming behaviour, that is to say behaviour which is likely to lead to a child being sexually abused.
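For readers who like to see the mechanics, here is a minimal, purely illustrative sketch of the pattern the first type of tool follows. PhotoDNA itself is a proprietary perceptual-hashing system, so the hash function and names below are hypothetical stand-ins, not its real API; the point is only the shape of the logic: fingerprint an image, compare the fingerprint against a vetted list of known csam fingerprints, and do nothing else.

```python
import hashlib

# Hypothetical, vetted list of fingerprints of known, verified csam.
# Real systems such as PhotoDNA use robust perceptual hashes so that
# cropped or re-compressed copies still match; SHA-256 is used here
# purely for illustration.
KNOWN_ILLEGAL_FINGERPRINTS = set()

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """True if the image matches a known-illegal fingerprint.

    Note what this does NOT do: it never reads message text, never
    stores the image, and learns nothing at all about a non-match.
    """
    return fingerprint(image_bytes) in KNOWN_ILLEGAL_FINGERPRINTS
```

A classifier works in the same spirit, except the signal is a probability score rather than an exact fingerprint match, and the grooming tools score patterns of behaviour rather than images. In all three cases the only possible output is “take a closer look” or nothing.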

In essence the misinformation circulating about each of these tools implies or expressly states they “scan” all private communications thereby creating the impression that, duplicitously and hiding behind the name of child protection, the police or the security services, the companies themselves, and goodness knows who else, are reading everything you send or receive as a message. Or they could, if that took their fancy.

The simple truth is if any illegal reading or examination of messages is taking place it has nothing whatsoever to do with any of the child protection tools I have mentioned.

Howsoever or wheresoever it originated the miasma of falsehood enveloping the child protection tools is proving to be astonishingly tenacious. Why?

Like many conspiracy theories and other lies that get read and repeated over the internet, the smokescreen of misinformation has been able to take hold because it exploits an underlying lack of trust in or suspicion of “them”.  In this case “them” are some of the major actors in the drama: Big Tech, Governments, law enforcement and the security services.

But there is another set of actors playing an important role in this tragedy. I am referring to parts (stress parts) of the tech community and privacy activists who think each of the interests I listed is as bad as the others. Noblesse oblige: they alone, therefore, have a self-proclaimed and unique responsibility to look out for the rest of us.

Anyone who objects or takes a different view is pitied, marginalized or completely ignored because they obviously don’t understand the complexities of the issues. It’s a kind of techno evangelical paternalism. “Forgive them for they know not what they do.”

And those aeroplanes?

Back to my CEO. He compared the emergence of the internet with the emergence of international air travel. Aeroplanes were unquestionably a new and revolutionary technology that changed the world. Initially air travel was the preserve of a small, rich elite but as technology advanced and prices fell it became a global industry which in turn fed and helped create a whole number of others, not the least of which was tourism.

Then came a prolonged spate of terrorist hijackings. These destroyed consumer confidence in air safety. Tourism collapsed, planes were empty or did not fly. Relatively rapidly the world got together and agreed international standards and systems to make air travel safer. Did it stop all terrorist hijackings? No. But the new system of checks at airports self-evidently reduced the number of hijackings very substantially, and acted as a major reassurance to people waiting in line to catch a flight. Consumer confidence returned. Planes started going up again.

What was the magic ingredient that did the trick at airports and has now been extended to a great many public and other buildings around the world? Metal detectors.

Can they be fooled? Yes. Do they seem to work well enough? Yes. Does anyone feel their privacy is being invaded by having to pass their body or their bags through a detector arch, or by having a wand passed over them? No. Can the wand or arch operatives see what, if any, underwear you have on? Can they make any other deductions, or infer anything else, from your movement, or your suitcase’s or briefcase’s movement, past or through the detector? No.

Yet that is exactly how the child protection tools work. They look for something which says “metal is present, take a closer look”. Nothing more. Nothing less. If the bleeper bleeps, someone opens the potentially offending item. If no threats to children are found, everything carries on as before and as intended. The idea that you only use a metal detector if a suspect comes into your building or your airport is absurd.
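To labour the point slightly, the metal detector flow maps one-to-one onto the way these tools are wired. The sketch below is a hedged illustration of that flow, not any platform’s actual pipeline; `detects_metal` and `human_inspection` are hypothetical callables standing in for the automated matcher and the trained human reviewer.

```python
from typing import Callable

def screen(item: bytes,
           detects_metal: Callable[[bytes], bool],
           human_inspection: Callable[[bytes], bool]) -> str:
    """The airport pattern: an automated check, a human look on a bleep,
    and nothing retained or inferred when the item is clear."""
    if not detects_metal(item):
        return "cleared"          # no bleep: the item passes untouched
    if human_inspection(item):    # bleep: someone opens the bag
        return "reported"         # a confirmed threat is acted upon
    return "cleared"              # false alarm: carry on as before
```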

The challenge is: how do we convince people that what I have described is the case? The even larger challenge is how we create systems of accountability and transparency which will give all stakeholders – and here we must include parents and children – the confidence that that is all that is happening. Nothing more. Nothing less.

Posted in Uncategorized

Money counts. Children don’t. And 163

From around 2009 various online platforms voluntarily started using smart technical tools to detect, delete and report actual or likely child sex abuse images and detect and address potential paedophile behaviour.

When the European Electronic Communications Code took effect on 20th December 2020, an unknown number of companies stopped doing it. This was an unintended consequence of parts of the Code becoming law.

In July 2020, a few months before the December deadline, having realised what was going to happen, the European Commission announced their intention to propose an “interim derogation” (temporary suspension) of the relevant clauses. In September they published a legislative proposal which would have achieved that.

Had the proposal been accepted, what everyone believed to be the  status quo ante would have been restored, without a blip or a hitch. There was a widespread expectation this would happen, rooted in the equally widespread belief that no substantial interest wanted to overturn or change the existing, longstanding arrangements.

How wrong we were. Nine months later reports of threats to children coming out of EU Member States have fallen by 51%.

Why?

Under the co-decision legislative processes of the EU, all three elements – the Commission, the Council of Ministers and the European Parliament – have to agree a single text. The Council of Ministers substantially supported the Commission’s text. Not the Parliament.

The LIBE Committee had and still has lead responsibility for handling this matter on behalf of the European Parliament.

At a meeting of the LIBE Committee on 4th February, 2021, the Committee’s Rapporteur, Birgit Sippel of the German SPD, acknowledged (at 15:40) there was a procedure which would have allowed the process to be speeded up, but she went on to say it is “normally only used” for more technical matters and, if I understood her correctly, because it would have entailed “giving away all the powers of political groups and individual MEPs” there was no support for it from other political groups on LIBE. Later Sippel spoke vigorously in defence of the “democratic hard work” of MEPs and about “not calling into question the legitimate rights and duty of this house to properly scrutinise proposed legislation.”

We may never know why, at any point in the previous twelve years, Sippel or these same political parties failed to stir themselves sufficiently in relation to the very issues they suddenly said they were so concerned about. This makes the current, lamentable state of affairs look more like an opportunist power grab.

For LIBE the need to restore the status quo ante to preserve a child’s right to safety took second place to the (self-evidently) pick-and-choose rights of political parties and individual MEPs.

Throughout her leadership on the derogation Ms Sippel has been vocally supported in her stance by a member of a German far left party who is also on LIBE (Cornelia Ernst) and by the only member of the German Pirate Party in the entire Parliament (Patrick Breyer). He too is on LIBE. I’ll come back to this. Soon.

The tourism industry fared differently

Last week the Commission produced a proposal to establish a system of “vaccination passports”. It was tabled in the Parliament on Thursday.

Manfred Weber, Chair of the EPP Group, asked for the proposal to be put on a fast track, as did the Commission. They both invoked Rule 163 of the European Parliament’s Rules of Procedure. Sippel spoke against adopting Rule 163, suggesting her Committee should be left to do the job. She assured her colleagues they would complete the work by June. If only children could be so lucky.

However, by more than 2:1 in plenary session Sippel’s objections to using the emergency procedure were ignored. She was defeated. Vaccination passports will be fast-tracked.

Why?

The Governments of places like Greece, Spain, Italy and Portugal moved straight away to impress on all their MEPs how badly their local tourist industries need vaccine passports. This would give them at least some chance of welcoming back visitors in the Summer. Money counts. Children don’t.

But really, another obvious question is why did no one from the Commission, or any of the several qualifying groups, seek to invoke Rule 163 for the interim derogation?

This is what Rule 163 says:

“A request to treat a debate on a proposal submitted to Parliament pursuant to Rule 48(1) as urgent may be made to Parliament by the President, a committee, a political group, Members reaching at least the low threshold, the Commission or the Council. Such requests shall be made in writing and supported by reasons.”

Is it now too late for this to be done, so as to bring this tragi-farce to an end? On a straight vote in the Parliament I am pretty sure I know who would win.

How many times did the word “German” appear above?

It is very striking how three of the most energetic and vocal obstructionists on the child protection agenda  – the ones ensuring children remain in danger – are all from German political parties. I wonder if Sippel’s position is not, therefore, in some way related to internal German politics? Is this the reason she could not get agreement to the fast tracking she referred to on 4th February?

If there is anything in this theory it makes the matter even more disgraceful than it already is. It would mean the whole Parliament  – the whole might of the European Union – has not found a way to exert itself to overcome what is, in effect, an internal argument taking place within a narrow spectrum of the politics of a single country.

And children in the EU are paying the price. Not children in any other part of the world. Only in the EU’s 27 Member States. Shame. Shame.

Posted in Child abuse images, Default settings, Privacy, Regulation, Self-regulation

Time to vaccinate against porn-fuelled violence against women

I am pleased to welcome guest blogger Baroness Tanni Grey-Thompson, who speaks about the threat posed to women by the violent porn which is commonplace on the internet and about the British Government’s failure to address it. This is particularly apposite today because of a vote which will be taken in the House of Lords this afternoon.

It’s time to vaccinate society against the porn-fuelled pandemic of violence against women

We are dealing with another pandemic – one that also spreads in the open air and in the home.  That pandemic is violence by men against women and girls.

We are not as good at sequencing the genome of the causes of this abhorrent behaviour as we have been for the Wuhan, Kent or South African strains of Covid-19, but if we all take a step back it is crystal clear that there is a very short list of influences on the behaviour of some men in our society which lead to assault and, tragically, even murder, and widespread access to extreme, violent pornography is at or near the very top of that list. One grandparent got in touch to tell me how their young grandchild had been exposed to another child talking about incredibly graphic, violent pornography.

We saw on Monday how quickly the Prime Minister promised, rightly, to take action on stalkers, following our vote in the House of Lords to put them on an offenders’ register. Again today, peers have the opportunity to urge the Government to be even quicker in making a practical difference by enforcing a law which is already on the statute books, to deal with what is a well-documented driver of the attitudes of some men towards women, girls and sex: extreme pornography. The government itself published research only a month ago proving that this kind of nasty pornography is associated with domestic violence.

Parliament passed the Digital Economy Act four years ago, to give the British Board of Film Classification the power to block access in the UK to websites which host the sort of extreme pornography the BBFC would never allow to be sold from an adult sex shop, let alone be shown in a cinema with an Unrestricted rating – which is effectively what the internet is.

Two years ago, the government quietly dropped this plan. Had Ministers come back to Parliament and asked us to repeal that legislation, and instead to wait three, four or even five years more for a new law they hope will be a bit more effective by tackling social media as well as porn sites – but which we now know may not even apply to a large proportion of the websites in question because of the way the government plans to draft it – they would have been sent packing.

So the Government did not do that.  It just quietly shelved it, and has now had to come up with arguments for why it did so – but these simply do not stand up to the sort of scrutiny the House of Lords applies.

Ministers have made a technical argument that changes in how we navigate the internet might make blocking websites harder at some time in the future, by encrypting some web traffic. But women want action now, and those changes are still years away. Nor do these changes excuse internet service providers from their responsibilities to help block access to violent pornography. We know that site blocking is possible now and will still be possible in the foreseeable future. And given we accept that this is only an interim measure, to be applied while we wait for a new Online Safety Bill over the next few years, the new law can replace the existing one in plenty of time to deal with technical evolution.

The evidence of how compulsive use of internet pornography can affect the brain and decision-making faculties of a user over time is something we have to take seriously. I know there is no single cause of violence towards women, but there is a short list of variants of this terrible virus, and today we have the opportunity to administer a vaccine which has already been developed in the Digital Economy Act of 2017. As Baroness Benjamin put it so clearly when she proposed today’s amendment, “we have to stop creating a conveyor belt of sexual predators who commit violence against women and girls.”

In time, we may develop a better vaccine, one that is more comprehensive and deals with more variants, as the Government claims its new Online Safety Bill will, but that is not a good reason not to give society a jab now that will help to stop the spread of this deadly disease, be that in the open air in a park or within a family home. That’s why We Can’t Consent To This, CEASE UK and Women’s Aid all support this action.

This vaccine is ready to go now, and could be rolled out within a few months simply by re-designating the BBFC as an interim regulator until Ofcom is ready to take over. It is nothing short of immoral not to use the vaccine we have available today in the hope of a better vaccine which we have yet even to see designed at some point in the future.

If the government truly wished to take some action, rather than generate spurious arguments that it will take 27 months to implement an existing law, it could do so within weeks by re-starting where it left off.

Let’s start our vaccination programme against the virus of violence towards women and girls today by restricting access to extreme pornography right away.

Posted in Age verification, Internet governance, Pornography, Regulation, Self-regulation