Meet the “cappers” or, rather, don’t

“Cappers” is a term that has been around for a while. Click here for a great explanation from Signy Arnason, Associate Director of the Canadian Centre for Child Protection.

“Capping” is about tricking children into doing something inappropriate, for example while livestreaming. Then, without the child’s knowledge, images or recordings of the inappropriate behaviour are “captured” and subsequently used to extort or sextort the victim. Paedophiles and other sexual predators are ardent cappers, but so are people who have absolutely no sexual interest in children. They are just looking for easy ways to get money or goods.

Typically, but not always, cappers go on the prowl for children within multiplayer video games and chat apps. Someone there will ask the child to go on livestream, perhaps stepping through a progression of requests to build trust, comfort, confidence and familiarity.

Sometimes it is as simple as targeting teen boys by pretending to be a teenage girl, then asking the boy to perform a sexual act on camera. From reports it seems getting girls to do something sexual on camera usually, though not always, requires a greater grooming effort, whereas boys can be more impulsive.

Conversations can begin on one platform, usually a public one, and be moved swiftly to somewhere private. Fake inducements, cash or gifts, might be offered to get the young person to engage.

Capping in the age of lockdown

Now that so many schools have closed as part of a series of measures to contain the Covid-19 virus, millions of children will be at home. Almost certainly a great many of them, to an even greater degree than usual and for longer periods of time, will be glued to screens, playing games and staying in touch with their friends, maybe making new friends through a wide variety of apps.

The Canadian Centre has been monitoring some of the conversations taking place between cappers and with the Centre’s permission I reproduce verbatim an account they picked up just last week:

“With potentially millions of boys around the world being or soon to be forced to stay home from school, potentially unsupervised if parents are working (teens in particular) now is the time for cappers to do their part to assist the quarantine efforts. There is a dire need for enriching, structured activities for all these boys to engage in.”

I have heard an unconfirmed report that some UK ISPs are detecting a 25% increase in “adult camming”. That’s a slightly ambiguous term but it doesn’t sound great and we must hope it does not include a large number of actions by cappers who have targeted kids.

Grooming expected to rise

Last week the UK’s Internet Watch Foundation put out a clarion call warning about the increased risk of grooming attempts during this period of restricted movement. And the police, in the form of the NCA-CEOP Command, and the excellent Parent Zone, also started reminding people and pointing them towards their own advice and guidance on grooming. Further announcements by the police are expected shortly.

The message, I’m afraid, is clear. Some very bad people will try to exploit the current situation. They are being provided with almost the ideal conditions. Harassed parents, bored kids and almost unlimited time stretching into the far distance.

Parental engagement will be key but maybe now is also a great time to start checking out some of the tools and apps that can lend parents a hand in keeping their children safe when they aren’t and can’t be looking.

It might also be a good time for tech players to step forward and show they get it and are busting a gut to do something extra in these uniquely awful, stressful conditions.

I am not going to mention age verification

Obviously no one could have foreseen the present situation when, last October, the Government announced its decision to delay the introduction of the age verification regulations to control children’s access to pornography sites.

For that reason I am not even going to mention it here. However, other less generous spirits might be inclined to point out that had everything been in place, as it could and should have been by now, that would have been one less thing for parents to think or worry about during lockdown.

Is it too late to hope something can be done to speed things up? The porn companies are ready. The age verification companies are ready. It just needs someone to press the button marked “go”.

Online child protection should be part of the Government’s national response

I have been slightly exasperated by some of the synthetic outrage being expressed here and there because nobody in Whitehall or Westminster anticipated x or y. None of us have ever lived through days like these. Mindful of the famous expression by a leading military strategist that “no battle plan survives first contact with the enemy”, to some degree we are all having to make it up as we go along.

So I am not going to criticise the Government for not anticipating what we now know to be true. However, it has become abundantly clear that there needs to be a national alert about the fact that people like “cappers” are out there seeking to exploit the conditions created by the lockdown. This should be linked to a reminder that, wonderful though it is in so many ways, the internet is not always an unalloyed blessing.


Posted in CEOP, Child abuse images, Default settings, Privacy, Regulation, Self-regulation, Uncategorized

More on the risks associated with strong encryption

I hope someone in Facebook reads this and passes a message up the line. I am sure it will only add to the substantial number they have already received on exactly the same point.

Cho Joo-bin lives in South Korea. As he was led out of a police station in Seoul yesterday he thanked law enforcement officers for

“… ending the life of a demon that I couldn’t stop.”

I suppose it is good to hear Cho Joo-bin acknowledges the terrible nature of the things he has done but what exactly did he do and who helped him do it?

Jobs as bait

On the internet he advertised jobs designed to attract women. The jobs were fake. Cho Joo-bin used them as a lure to get the women to make sexually explicit video clips in return for a big payout.

Once he got hold of the compromising images he used them as a blackmailing tool, threatening to release them online or to the women’s friends and relatives unless they supplied increasingly dehumanizing and even violent footage. In some of the videos victims had carved the word “slave” on their bodies.

Cho Joo-bin then sold the video clips and pictures on an encrypted network and received payments using cryptocurrencies. According to the South Korean police his arrest related to 74 women, 16 of whom were minors. They were said by the police to have been held in some form of “sexual slavery”.

Part of Cho Joo-bin’s method was to attract paying customers with “trailer” clips in Telegram (encrypted) chat rooms and charge them when they demanded more sexually explicit or perverted material.

In the course of their investigations South Korean police uncovered a network of over 250,000 individuals using what they call “Nth rooms”, encrypted spaces which provide users with a sense of impunity for criminal behaviour covering a broad spectrum of crimes. Seemingly the cops only got on to Cho Joo-bin through the arrest of a third party.

Imagine this

You have a famous-brand platform. It is huge and has a cuddly, friendly image. The platform’s owner talks about safety a great deal and tells you (“asks you to take on trust”) that his company does a lot to police and keep their virtual spaces safe. Your parents use it and even though it’s not your main app you wander about it from time to time. What could possibly go wrong?

This same famous-brand platform has a messaging service closely linked to it. The messaging service even shares the same name. Could anyone imagine, could a child imagine, there might be any kind of risk associated with moving seamlessly, via a couple of clicks, from one cuddly, friendly place to another, owned by the same cuddly, friendly company? Would they know they had moved from an unencrypted to an encrypted space, and what the implications of that are?

Only one answer

Nobody is asking for strong encryption to be abandoned. Everyone should know that as their messages move across a network or are stored, hackers and spies of any and every kind cannot get at them. But equally everyone should know that with proper legal authority their messages and online activities can be scrutinised.

Companies that cannot deliver a service that meets this standard should declare either that they will abandon it completely or that they intend to move to a new system where the same difficulty will not arise. And any company just thinking about adopting strong encryption should stop until they are sure they will be in the second category from Day 1.

Posted in Child abuse images, Default settings, Pornography, Privacy, Regulation, Self-regulation

Sad, but unfortunately not surprising

Free speech and civil liberties organizations do a hugely important job scrutinizing the activities of state agencies. However, you are forced to wonder if, sometimes, they don’t completely lose touch with common humanity.

The Electronic Frontier Foundation (EFF) has just served up another example of exactly the kind of thing I had in mind. Click and tell me what you are reminded of when you look at the graphics used in their latest campaign.

Yep. Me as well.

But what is the EFF actually complaining about? It’s a measure brought forward in Congress by Senators Graham and Blumenthal to try to get internet businesses to do more to combat the kinds of child sex abuse that are facilitated by the internet.

My generous spirit

I try not to leap to conclusions. I try to give people the benefit of the doubt. When I look at something that is otherwise rather startling I wonder if there is a benign interpretation, some context of which I am unaware, that might cast a new light on the matter in hand.

Driven by such thoughts I wrote to the EFF. Below I reproduce the entire correspondence. I have left out the name of the person at the EFF who sent me the reply for reasons I will happily explain if they are not obvious.

My initial email

Dear EFF,

I have just read your blog opposing EARN IT (the Graham-Blumenthal Bill). Have you published anything suggesting what tech companies can and should do to reduce the distribution of child pornography over the internet? And ditto in respect of grooming behaviour (behaviour designed to persuade under age persons to engage in illegal sexual activity)?

EFF’s reply

Hi John,

Thanks so much for reaching out to us at the EFF. We appreciate you seeking our input on this.

While we do not have any resources of the nature you describe, we know child exploitation online is a real problem. But the EARN IT Act offers no meaningful solutions. It doesn’t help organizations that support victims. It doesn’t equip law enforcement agencies with resources or training to investigate claims. Rather, the bill’s authors have shrewdly used defending children as the pretense for an attack on our free speech and security online. (emphasis added by me)

Thanks again for coming to us about this.



My second email

Dear EFF,

Many thanks for getting back to me… I completely get the importance of supporting victims and equipping law enforcement agencies with resources and training to investigate claims. Has the EFF said anything about the use of technical tools to detect child pornography so it can be identified and investigated and victims located? If more companies used these sorts of tools I guess that would mean politicians would back off. No?


EFF response to my second email

I haven’t had one.

But look again at the EFF reply, in particular the section I put in bold. “Shrewdly”? “Pretense”? Don’t these words suggest deliberate, cynical intent? Directed at what? Disguising the Senators’ real motivation, which is to “attack…free speech and security”.

Trying to protect children is just a ruse. A smokescreen.

So there you go. Additional comment seems pointless. At the moment.

Posted in Child abuse images, Privacy, Regulation, Self-regulation, Uncategorized

More progress in the fight against online child sex abuse

Last week representatives from the Governments of the “Five Eyes” nations (Australia, Canada, New Zealand, the UK and the USA) gathered in Washington DC. They endorsed a set of eleven voluntary principles to combat a range of online threats to children. Alongside the principles an explanatory note was also issued.

The principles did not magically appear out of thin air. They were the product of months of negotiations and discussions between “Five Eyes” and the six companies named in a contemporaneous UK Home Office press release: Facebook, Google, Microsoft, Twitter, Snap and Roblox. There was blood, sweat, tears and lawyers behind every dot and comma.

Each of the companies I just mentioned is a member of the Technology Coalition, along with a further ten, some of them household names. The Coalition issued a statement in which they said they “stand behind” the eleven principles, adding “(we will work) with our members to both spread awareness (of the principles) and redouble…efforts to bring industry together to promote transparency, share expertise and accelerate new technologies to combat online child sexual exploitation and abuse.”

Then in the explanatory note this appears:

“The WePROTECT Global Alliance, which currently comprises 97 governments, 25 technology companies and 30 civil society organisations, will promote and support the adoption of the principles at a global level to drive collective industry action.”

The membership list for WePROTECT is currently being updated so I cannot provide you with a working link to it right now, but I do know the 25 technology companies referred to encompass most of the Technology Coalition’s membership and more besides, including several big names that had chosen not to be members of the Coalition.

The largest-ever collection – I think

The moving spirits behind the eleven principles are to be congratulated. I am pretty sure the document they have published and the support it seems to be attracting represent the largest ever assembly of companies, Governments and civil society organizations rallying behind a set of concrete, directed proposals addressing the position of children in the online environment.

The angels are in the details

Of course, the eleven principles document contains the usual high-level, platitudinous, compulsory elements that are mirrored in a thousand other declarations, communiqués, resolutions and solemn protocols stretching back almost thirty years, but what matters most here is the detailed stuff.

From now on

From now on no one can argue the ideas and proposals listed in that document are unreasonable or not do-able, the ravings of wild-eyed idealists with no knowledge of how tech or online business works.

Unquestionably and unalterably, the eleven principles document therefore establishes a hugely important new global benchmark. One insider emphasised to me that the document is “aspirational” and I understand that. But I doubt any of the six companies will say they put their name to aspirations that were unattainable or undesirable.

But voluntary?

Cynics may say “Enough already with voluntary statements. How many last chances can there be in the last chance saloon? As long as companies have wiggle room they will wiggle.” I cannot argue with that, but with initiatives such as these the circumference of the wiggling space shrinks.

I would have liked the language to have had a more urgent, pressing edge to it but it would be foolish and counter-productive not to recognise the eleven principles as progress. This is a global document not a UK one, and it is as a global document that it represents a new benchmark. A UK-only document would be very different.

Even so, let me pick out just a few really good points that I think are welcome signs of an evolution in thinking.

Terms of service

Five times the principles document refers to taking “appropriate action under their terms of service”. This is very important. For too long companies have said “these are our rules, this is the basis on which you agree to engage with us” and in so doing have created a wholly misleading impression. Why? Because they have made limited or no efforts to enforce their rules relying, instead, on outdated, prehistoric external immunities. It’s almost as if their rules are merely marketing material. This has to end, and that includes being wilfully blind to the presence of persons below the minimum specified age.

New materials

I also liked the appearance, in Principle 2, of the reference to developing tools to “identify and combat the dissemination of new child sex abuse material”. The main focus up to now has been on using tools to identify already known images but really we ought to be able to do better than that and in fact some companies tell us they are doing better than that. We need to know more and the technology needs to be made widely available.

Not illegal but very harmful

What is wholly new in a document of this type is Principle 8. It refers to companies seeking “to take appropriate action, including providing reporting options, on material that may not be illegal on its face, but with appropriate context and confirmation may be connected to child sexual exploitation and abuse”.

Too many companies have been relying on the narrowest interpretation of the law concerning illegal child abuse content. As a result, they are refusing to take down images which on any reasonable understanding, any decent human understanding, are extremely prejudicial to a child’s well-being. That must change and Principle 8 is the harbinger. I imagine a lot of people in Canada and Germany will have felt absolutely delighted when they saw Principle 8. Their niche in the history books is guaranteed.

My one major criticism

If I have one major criticism it is nothing to do with what the document says. It is to do with what it does not say. There is nothing about how to carry forward the momentum. “Five Eyes”, as such, has no machinery with the ability to follow through or monitor progress and anyway it is too narrow a base. The Technology Coalition has led a somnambulant existence since 2006 and seems unlikely to be able to develop the necessary larger reach. The WePROTECT Global Alliance is extremely valuable and important but its structure imposes constraints which may be insurmountable in this particular context.

Then I look at something like the Global Internet Forum to Counter Terrorism (GIFCT), established in 2017, and ask why there has been no equivalent body devoted to the protection of children and the defence of their rights in the online space. Just read what it says about GIFCT’s objectives and structure. Urgency and millions upon millions of dollars have been put behind this. Quite right too. Children deserve something approaching, or at least in the same vicinity as, this level of seriousness.

I look also at the Global Network Initiative, established by the industry in 2008 with a stated aim of defending freedom of expression and privacy rights. It was, originally at least, wholly funded by the industry to act as a buffer against what they considered to be overly intrusive Governments. This is another multi-million-dollar operation which has no equivalent in the world of children’s online rights.

The need for a global observatory

There ought to be a civil society-based global observatory specifically dedicated to advancing the interests of children in the digital environment. Greenpeace is the model I have in mind: respected because it is guided by the science in furtherance of a cause, with a global, mutually supportive, connected activist network monitoring, lobbying and engaging with policy makers and decision takers in practically every jurisdiction and in major international arenas.


Just look what is happening on Capitol Hill right now. On the very day the eleven principles were published a bipartisan measure was introduced in Congress which, essentially, said if you are an internet company and you don’t act to protect children, pretty much in the way the eleven principles suggest, you will go out of business. And that message is remarkably similar to the one adopted by the UK’s Independent Inquiry into Child Sexual Abuse. Their report came out yesterday.

We all want the benefits the internet can deliver but people are saying they don’t believe the downsides are the inevitable price everyone must pay in perpetuity in order to have them. When enough people start saying it their elected representatives have to pay attention.  We are at the “enough point”. I think it’s called democracy.

PS Encryption

And what about encryption? I hear you ask. Thank you, that’s an excellent question. The word does not appear anywhere in the 11 principles document or the explanatory note. Not once. What conclusions do I draw from that? None yet, but several are bubbling away in the old grey matter. However, I note IICSA picked up on it. The cat is out of the bag.

Posted in Age verification, Child abuse images, Facebook, Google, Internet governance, Privacy, Regulation, Self-regulation

Companies behaving like states

Tech companies and Governments between them created the crisis of confidence that is said to be stimulating the demand for greater online privacy protection. Yet they are now lining up on opposite ends of the argument about how to address that crisis.

Certain businesses seem to think the answer to their own past failings is to deploy forms of strong encryption that not only shield content from their own eyes but also make it impossible for it to be seen by others with a legitimate interest, e.g. those concerned with the administration of justice.

No human rights document or charter has ever said privacy is an absolute right, so why do some companies seem intent on making it one? It is hard to think of a clearer example of companies behaving like states. In this case bad states.

We have to find ways of deploying encryption that allow the legal system to carry on pretty much as before. Justice delayed is said to be justice denied. In this instance justice is being adjourned sine die.

Putting children at risk

This is far from being merely a theoretical challenge. We have accepted, even rejoiced in, tools such as PhotoDNA. It came out in 2009 and has allowed businesses to spot child sex abuse material, delete it from their systems and report the distributors.

PhotoDNA was the first. Along with similar tools developed by other companies, it represented a huge step forward in online child protection. However, these tools have been served with a redundancy notice by the spread of strong encryption into major messaging and cloud storage platforms, probably the two areas where they are most needed.
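PhotoDNA itself is proprietary and far more sophisticated, but the general principle these tools share is simple: compute a perceptual hash of each image once, then compare new uploads against a list of hashes of known abuse material, tolerating small differences such as re-compression. Here is a toy sketch of that principle in Python, using a deliberately crude “average hash” of my own invention rather than anything PhotoDNA actually does:

```python
# Toy illustration of hash-based image matching. NOT PhotoDNA: a real robust
# hash survives resizing, cropping and recolouring; this one only survives
# tiny pixel-level changes. Shown purely to make the principle concrete.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int
    with one bit per pixel: 1 if the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(img_hash, blocklist, max_distance=5):
    """Near-duplicate match: small Hamming distance to any known hash."""
    return any(hamming(img_hash, h) <= max_distance for h in blocklist)

# A toy "known image" (top half dark, bottom half light) and a lightly
# altered copy, standing in for re-compression noise.
known = [[0] * 8 if r < 4 else [255] * 8 for r in range(8)]
variant = [row[:] for row in known]
variant[0][0] = 30

blocklist = {average_hash(known)}
assert matches_blocklist(average_hash(variant), blocklist)
```

The point of the Hamming-distance comparison is that a lightly edited copy still matches, which is exactly what an exact cryptographic hash cannot do. It is also why these tools only work where the service can see the image: inside an end-to-end encrypted channel there is nothing to hash.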

Limits of Artificial Intelligence

People speak about the potential benefits AI will soon deliver across multiple headings, for example in relation to detecting grooming. Those benefits will be reduced to nought within encrypted environments.

Obviously that is not a reason not to proceed with AI because an awful lot of activity will continue to take place in environments that are not encrypted. However, we need to be clear AI is not going to be any kind of panacea, particularly in respect of reducing the kind of serious criminal activity commonly found in encrypted spaces.

Displacement and encouragement

Actually, isn’t it likely that as unbreakable strong encryption becomes more widespread, and probably easier to use, criminal behaviour currently taking place in unencrypted environments will shift into it? Encouraged by its impregnability, isn’t it likely to grow? Maybe the arrival of quantum computers will mean there will cease to be such a thing as an unbreakable form of encryption, but right now nobody can rely on that or predict what else might follow. Speaking of quantum computing….

Five bits

The CEO of Post-Quantum has suggested a solution to the encryption conundrum. He thinks companies and organizations that deploy strong encryption should split the relevant decryption keys into five parts. These would be distributed to five trusted bodies, one of which could be the company or organization itself.

Only on production of proper legal authority would the key holders be able to co-operate and provide law enforcement with the means to decrypt or open specific devices, messages or stored content linked to identified entities.

No back doors, no mass or indiscriminate surveillance. It sounds like an avenue worth exploring, although a person who works inside a company that provides encryption services didn’t quite see what would be gained by involving so many players, i.e. five instead of one. If a court gives an order isn’t that enough? Good question. I only mention it to illustrate that people are at least trying to find answers.
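What is being described sounds like threshold secret sharing, of which Shamir’s 1979 scheme is the textbook example: split a key into five shares such that, say, any three key holders acting together can reconstruct it, while any smaller group learns nothing. A minimal sketch in Python, with all parameters chosen purely for the demo; real deployments would use vetted cryptographic libraries, and this says nothing about Post-Quantum’s actual product:

```python
# Toy sketch of Shamir threshold secret sharing over a prime field.
# Illustrative only: demo-sized parameters, not production cryptography.
import random

PRIME = 2**127 - 1  # field modulus; must exceed any secret being shared

def split_secret(secret, shares=5, threshold=3):
    """Split `secret` into `shares` points on a random polynomial whose
    constant term is the secret; any `threshold` points recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, shares + 1)]

def recover_secret(points):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for xi, yi in points:
        num, den = 1, 1
        for xj, _ in points:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789                 # stand-in for a decryption key
parts = split_secret(key)       # five shares for five trusted bodies
assert recover_secret(parts[:3]) == key   # any three co-operating holders suffice
assert recover_secret(parts[2:]) == key
```

A 3-of-5 split would also answer part of the insider’s objection: no single key holder, and no pair of them, can decrypt anything on their own, and the scheme tolerates one or two holders being unavailable or compromised.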

Do you trust democracy?

Ultimately, whatever technical fixes emerge, for those who are anxious about privacy it will come down to whether or not they trust the political and other institutions that make the laws, appoint the judges and supervise the enforcement of the law in their country.

When you look around the world it is not hard to work out why many people don’t have such trust and therefore why they cling on to what they think strong encryption can do for them. However, that is a problem under a different and larger heading. It requires a political fix, not a technical one. Puleeze don’t try to tell me strong encryption will help deliver that political fix. Truly oppressive states have so many other real-world tools at their disposal.

But suppose you do have faith in your own judiciary, your own political institutions and your own means of providing oversight for the justice system and the security services?

Can we still do nothing until every Government in the world is committed to liberal democracy? Must I tell a mother in Portsmouth that until the guys in Pyongyang take out a subscription to The Guardian we cannot use certain tools to protect her child from a sexual predator, or even try to invent new and better tools that could? That is the worst kind of blind utopianism.

Posted in Consent, Internet governance, Regulation, Self-regulation

Good news from Australia

Good news from Australia. A Parliamentary Committee of Enquiry today publishes its findings, coming down in favour of age verification as an important tool to help keep pornography away from the eyes of people it was never meant for, namely children.



Posted in Age verification, Pornography, Privacy, Regulation, Self-regulation

No room for complacency

My last blog was fairly upbeat about the UK Government’s interim response to the consultation on Online Harms. True, the response was light on concrete proposals but much of the language was excellent. The overall tone was tough and purposeful.

Did I speak too soon or too naively? I say that because the day after the response appeared, this article popped up in The Times under the headline “Boris Johnson set to water down curbs on tech giants”.

It had all the hallmarks of an insider briefing, opening with the following:

“The prime minister is preparing to soften plans for sanctions on social media companies amid concerns about a backlash from tech giants.


“There is a very pro-tech lobby in No 10,” a well-placed source said. “They got spooked by some of the coverage around online harms and raised concerns about the reaction of the technology companies. There is a real nervousness about it.”

Lest we forget, every Government in the world is to some degree conflicted. They want the jobs, prosperity and glitz that inward investment by hi-tech companies brings.

Against that is the day-to-day reality. Members of Parliament in the UK and their equivalents in many other countries are constantly being visited by, or receiving emails and letters from, concerned parents, teachers and others about something horrible that has happened to one of their children or some other vulnerable individual. Children themselves have not been silent and their views broadly mirror everyone else’s.

So is the scene set for a titanic struggle? We should assume it is and prepare accordingly because, as I have remarked before, the goodies don’t always win and the baddies don’t always lose.

The Government is going to be in an awkward position. They will not want to be seen as apologists for Silicon Valley. They will not want to say, in effect:

“Chill. We are all going to have to learn to live with these dangers to children or threats to us all from terrorists and scam artists. It is the price we have to pay in perpetuity for the benefits the internet brings. And yes, we’re sorry the guys who own the companies that allow these things to happen have become obscenely rich off the back of your woes, but even so we mustn’t be too harsh on them.”

Yet, post-Brexit, with a Free Trade Agreement with the USA very much in their sights, the pressure on the UK Government to dial it down could become immense. If any real signs of that happening emerge we need to urge Parliament to “take back control” and “get Online Harms done.”

Slogans such as those at least have the advantage of being familiar.

Posted in Age verification, Child abuse images, Default settings, E-commerce, Internet governance, Regulation, Self-regulation, Uncategorized