Last week (20th January) the UK Parliament’s Home Affairs Select Committee interviewed representatives of Facebook, WhatsApp, Twitter, Google, Snap and TikTok.
The Chair of the Home Affairs Select Committee is Yvette Cooper, an intellectual heavyweight of the first water. You had to feel a modicum of sympathy for the hapless folk the companies fielded. But only a modicum. A mini modicum.
Inevitably, on Inauguration Day in the USA, much of the Committee’s focus was on Trump, Trumpism and the post-truth world that helped create and sustain both. 6th January figured large.
To their credit, none of the company representatives sought to deny or minimise the role social media businesses played in the events leading up to and on 6th January. The air was full of regrets for not acting sooner or differently. Phrases like “we are still learning” and “we know we must do better” peppered the replies to MPs’ questions. All this put me in mind of Professor Sonia Livingstone’s aside to me in correspondence about the importance of
“breaking the cycle of
1. Putting a new product or service into the market
2. Waiting for civil society to spot the problems and families to experience them
3. Taking belated action.”
I might have added
4. Then being ready with self-deprecating comments like “we know we must do better” and “we are still learning”.
The disarming humility and contrition are doubtless genuinely meant at the time by the people speaking for their employers, but humility and contrition butter no parsnips, particularly when similar things keep on keeping on. There is a limit to the price societies can be expected to pay to allow companies the “freedom to innovate”. We are about to find out where that boundary lies. Section 230 is heading for the exit.
Facebook and end-to-end encryption
Yvette Cooper and others also raised questions about Facebook’s plans to introduce end-to-end encryption (E2E). In particular Cooper wanted to know what impact Facebook themselves thought this would have on their own ability to detect child sex abuse images currently being exchanged via Messenger and Instagram Direct.
Monika Bickert’s reply was certainly truthful, in a literal sense, but it was also incomplete to the point of being deceptive. Her answer to Cooper’s question was
“I don’t know but I accept the numbers will go down”
Future hypotheticals
Bickert added that she thought the numbers would probably go down anyway because of other measures the company was taking. In other words, the drop in numbers that is coming if things go ahead as planned may partly be down to Facebook simply becoming more effective at discouraging illegal behaviour which threatens or harms children. Cooper exposed this as self-exculpating baloney.
It turns out this largely hinges on planned educational initiatives designed to help children avoid abusive individuals and situations in the first place. Not exactly mind-blowing or revolutionary. In fact it is the kind of thing they are already doing, and if all Bickert is saying is that they will do more of it, or do it better, then bring it on. It is welcome, even if a tad oblique compared with the straightforward detection, deletion and reporting which was the main thrust of Cooper’s questioning. Cooper was not asking about images that might never be created or exchanged, or paedophiles who might be avoided.
46% decline in 21 days
Cooper referred to numbers published some time ago by NCMEC. These suggested that if Facebook went ahead with E2E there could be a 70% drop, globally, in images being detected, deleted and reported.
What Cooper evidently did not know, but Bickert must have, was that, the day before the Select Committee meeting, NCMEC had published new data showing the known, actual effect of Facebook ceasing to detect child sex abuse in the manner it had hitherto.
Because of the fiasco with the European Electronic Communications Code, on 20th December in all EU Member States Facebook stopped scanning for child sex abuse materials. Stopping scanning has exactly the same effect as introducing E2E.
On 19th January, NCMEC published new numbers showing that in the 21 days immediately following 20th December there had been a 46% drop in reports from EU countries.
Excluding the UK, in the three weeks prior to 20th December NCMEC received 24,205 reports linked to EU Member States. In the three weeks afterwards the figure dropped to 13,107. We will never know which children were in the roughly 11,000 reports that were never made. How many involved new images, never seen before, with all that that entails?
So when Cooper asked, as she did twice, about the likely effect of introducing end-to-end encryption, Bickert was truthful when she said she couldn’t say, but she might at least have mentioned the numbers NCMEC had just published. Then she could have explained why a 46% drop, or worse, as a concrete rather than a hypothetical figure, is a price worth paying.
Facebook blames their customers
Cooper persistently challenged Bickert as to why they were going ahead with E2E at all when they knew it would mean more children being put in harm’s way and more perpetrators going uncaught and unpunished. Bickert’s answer was, er, “surprising”.
Bickert referred to a survey of British adults who, seemingly, listed privacy-related concerns among their “top three”. I am not sure which survey Bickert had in mind, as she didn’t say, but if it was the 2018 Ofcom one she might have read a little further and seen that “the leading area of concern” is the protection of children. But even if that were not the case, and whether or not children were “listed” in the top 50 concerns expressed by adults, teens or stamp collectors for that matter, what was Bickert really saying?
“Don’t blame us. We’re only doing this because it’s what the dudes want and our job is to give it to them.”
An industry standard?
Bickert and her colleague from WhatsApp shifted their ground a little, saying “strong encryption is now the industry standard”, as if this were the key justification for going ahead with, or retaining, E2E. Cooper pointed out that Facebook is a major part of the industry, so that amounted to rather transparent, self-serving circular reasoning. Moreover, in other areas Facebook has repeatedly shown it is willing to strike out alone and not just follow the herd. They cannot now shelter behind the actions of others.
The underlying reasons?
Suggesting something is an “industry standard” is simply a less vulgar, less pointed way of saying “our revenues will likely be badly impacted if we don’t do this”. It’s a variation on the dudes theory expounded earlier. In other words, it is about money.
Secondly, how did we get to a point where the dudes seemingly feel they need to have E2E? Isn’t it because of the previous actions and admitted failures of companies like Facebook?
So first they create the problem and then they come up with the wrong answer to it. Chutzpah on stilts.
Facebook’s “pivot to privacy” is alliteratively admirable but not admirable in any other way. It is about Facebook trying to repair its appalling image in the privacy department, an image built on a history of not respecting or taking sufficient care of its users’ privacy. It is acting now in order to carry on generating gigantic quantities of moolah.
Towards a very dark place
We may never know what role encrypted messaging services played in organizing and orchestrating the events of 6th January but few can doubt that the unchecked growth of strongly encrypted messaging services is taking us towards a very dark place. A place where child abusers as well as fascist insurrectionists feel safe.
In and of itself strong encryption is not a bad thing. Indeed it is now essential in many areas. But in the wrong hands, used for the wrong purposes, it can facilitate a great deal of serious damage. We have to find a way to ensure that does not happen. If companies like Facebook do not find a way of doing that, they will have one thrust upon them. The Silicon Valley experiment has run its course. It will soon look different.