Apple challenges the Rule of Law

Feb 25, 2025

So Apple has decided to withdraw its Advanced Data Protection tool (ADP) from the UK market. Henceforth no new Apple user in the UK will be able to opt into it, and ADP will eventually be withdrawn from anyone who has already done so.

ADP is a form of strong end-to-end encryption. It is deployed in connection with Apple’s iCloud.

The UK Government asked Apple to reconfigure the way they provided ADP so that, on production of a warrant naming a specific individual or entity, the company could hand over content related to the warrant. Apple didn’t want to do that.

There are issues with strong encryption being used in other environments and I will return to those in a future blog. However, to be clear, I have no doubt strong encryption is absolutely essential in a number of areas. But things are going badly wrong in others.

Apple’s deployment of ADP is one of these others and that will be the focus of the remainder of this blog.

I apologise for its length.

First a few facts

In early 2025 there were 2.2 billion active Apple devices in the world. Estimates vary but seemingly between 120 and 140 million of these are in the USA.

In 2023 in the UK mobile phones and tablets using the Apple operating system accounted for around half of the market.

In 2025 one forecast suggests there will be 60 million smartphones in use in the UK. Presumably quite a few people will have more than one!

Last year 64% of iPhone users in the USA subscribed to paid-for iCloud storage plans.

In the consumer space, seemingly 71% of US customers say they use a cloud facility to store photos. Just over half used a cloud facility to store important documents, music and videos.

It is highly likely similar patterns exist in the UK and are repeated in many other jurisdictions.

In the consumer space (i.e. excluding enterprise level) iCloud is the second largest cloud storage facility in the world (33% of the market), Google Drive being the largest (40%). Microsoft is third (20%).

The bulk of the rest of the device-based market uses Android as the operating system. There is no reason to suppose their usage patterns are markedly different from those of the Appleariat.

What happens in the cloud matters.

Back to the main story.

The big disappearing trick

For the time being and the immediate future (see below) the use of ADP makes everything that has been encased within it completely inaccessible to anyone other than the person or entity with the necessary key to decrypt and see the content.

Apple cannot see or get at it. They therefore cannot hand it over in readable form to any third party. Apple deliberately designed the system that way.
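
To illustrate the principle (and only the principle: Apple’s real design involves per-device keys, key hierarchies and hardware protections far beyond this), here is a minimal Python sketch using the third-party cryptography package. The key is generated and kept on the “device”; the “cloud” stores only ciphertext and, lacking the key, cannot produce readable content for anyone.

```python
# Minimal sketch of the end-to-end principle: the key never leaves the
# "device", so the "cloud" can store the data but never read it.
# Requires the third-party `cryptography` package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Key generated and held only on the user's device.
device_key = AESGCM.generate_key(bit_length=256)

def encrypt_on_device(plaintext: bytes) -> bytes:
    """Encrypt locally; only nonce + ciphertext ever leave the device."""
    nonce = os.urandom(12)
    return nonce + AESGCM(device_key).encrypt(nonce, plaintext, None)

def decrypt_on_device(blob: bytes) -> bytes:
    """Only someone holding device_key can recover the plaintext."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(device_key).decrypt(nonce, ciphertext, None)

# What the provider stores, and all it could hand over under a warrant:
stored_blob = encrypt_on_device(b"private note")
print(decrypt_on_device(stored_blob))  # b'private note' -- device-side only
```

The cipher and key handling here are my own illustrative choices, not a description of Apple’s implementation; the point is simply that whoever holds the ciphertext without the key holds nothing readable.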

Encryption tools have been around for a while

Strong encryption tools like ADP have been around for decades but, historically, they were rather clunky, difficult and time-consuming to use. Geeks and some businesses tended to be the only ones who bothered.

That meant, yes, there were issues with geeky crooks and geeky weirdos, but the relatively low level of take-up meant the problem of inaccessibility could be managed. Too often what that meant in reality was cops and lawyers had to grit their teeth and come to terms with their powerlessness.

In individual cases where serious criminality was suspected, a huge amount of extra effort, time (and expense) might be invested in finding alternative ways of getting whatever evidence was necessary to effect an arrest, prevent an outrage or obtain proof to make a case in court. These efforts did not always succeed and it might, literally, have taken months, maybe years, for everyone involved to recognise and accept they could go no further. They had hit an evidential brick wall.

Justice delayed is justice denied. What about justice that could never be obtained in the first place because someone built a system that meant it couldn’t be?

That’s an insult.

Scale changes everything

Teeth gritting and being resigned to powerlessness are no longer an option. Easy-to-use strong encryption tools have become freely and indiscriminately available at scale. They are being routinely incorporated into mass messaging platforms, making them a major vector for all manner of criminal and other forms of unlawful behaviour. Fraud, terrorism and child sexual abuse are just the first three that spring to mind (PS for a mind-boggling insight into contemporary levels of encryption-dependent fraud you could do worse than read a recent article in The Economist. Strong encryption plays a huge part in helping criminals rip people off on a gigantic scale. Now think how that might translate and be working in other crime areas).

No legal basis

ADP and similar tools have elevated privacy to something it was never meant to be.

No international legal instrument, no national legislature in any country in the world, has ever said privacy is an absolute or unqualified right. Period.

Privacy is a qualified right.

The UN’s Universal Declaration of Human Rights addresses privacy in Article 12

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence… Everyone has the right to the protection of the law against such interference…

Article 17 of the International Covenant on Civil and Political Rights speaks in similar terms

No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home, or correspondence…

Everyone has the right to the protection of the law against such interference…

The key word here is “arbitrary”. Not all interferences with privacy are prohibited, only those that lack legal justification or are disproportionate.

In General Comment 16, the UN Human Rights Committee (which also monitors compliance with the International Covenant) clarified that restrictions on privacy must not only be

defined in law, they must also be legal, necessary, and proportionate

Article 8(1) of the European Convention on Human Rights similarly protects the right to private and family life, home and correspondence.

However, Article 8(2) expressly allows that right to be restricted if it is for national security, public safety, the prevention of crime and disorder, the protection of health or morals or the protection of the rights and freedoms of others (emphasis added by me).

Article 1(2) of the GDPR says it

protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data

However, Article 23 says authorities can restrict GDPR rights if it is necessary and proportionate for reasons such as:

National security and defence, public security

Prevention and investigation of crimes

Protection of judicial independence

Regulatory functions (e.g., financial oversight)

Protection of rights and freedoms of others

In addition, any such restriction must:

be established by law, respect the essence of the right and be necessary and proportionate.

UK legislation mirrors all of these provisions.

Creative reinterpretation cannot change that

No amount of creative reinterpretation is going to change the legal facts.

On its website Apple says privacy is a “fundamental right”. No argument there, but that is not the same as saying privacy trumps any and every other right.

Yet with its ADP system Apple has chosen to make privacy just that. A right that stands above, separate from and superior to all others.

This flatly contradicts both the spirit and the letter of the instruments cited above.

Making our courts mute and impotent

Even if Apple wanted to, it could not comply with any warrant or court order of any kind in respect of anything placed within ADP.

Apple is therefore, unilaterally, directly and intentionally, challenging the very idea of the Rule of Law.

Apple has decided to render our courts mute and impotent. It may be true there is no law which explicitly forbids them from doing what they have done. Is it, though, at least arguable that what Apple has done is unlawful because they deliberately sought to subvert or circumvent established privacy laws?

Apple is preventing states from “protecting the rights and freedoms of others” by, in effect, favouring the position of those who used ADP.

Lawful or not, Apple had no wider moral right to do it.

Apple is doing it for money (see below); others, not-for-profits that provide strong encryption services, may be doing it because they hold a certain view of how the world is or ought to be. They are entitled to that view. But they, and Apple, are not entitled, simply on their own say so, to create and foist on the rest of us huge new spaces which provide criminals with impunity, allowing them to act to the detriment of very large numbers of people.

Compulsory decryption?

In the UK, at least in respect of criminal matters or issues of national security, s.49 of the Regulation of Investigatory Powers Act 2000 can require a suspect to hand over the decryption key.

However, if the likely or potential penalty for refusing to hand over the decryption key is significantly less than the likely or potential penalty which would arise from what would happen following disclosure, the suspect is highly incentivised not to disclose.

What then? Our law is made redundant? That is unacceptable. The process of obtaining a conviction under s.49 is in any case protracted and fraught.

ADP uses XTS-AES-256 for its encryption. Seemingly a “brute force” attack to try to break it would take 3.1 x 10^56 years. I’m told that is “trillions of times longer than the age of the universe”. Bruce Schneier, a famous cryptographer, is also supposed to have calculated that trying to break some strong forms of encryption would require “more energy than the sun produces in billions of years.”
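
As a rough sanity check on those orders of magnitude (the guess rate below is my own illustrative assumption, not a figure from Apple or Schneier), here is the back-of-the-envelope arithmetic for exhausting a 256-bit key space:

```python
# Back-of-the-envelope brute-force estimate for a 256-bit key space.
# The guesses-per-second figure is an illustrative assumption.
keyspace = 2 ** 256               # possible 256-bit keys
guesses_per_second = 10 ** 12     # a (very generous) trillion guesses/sec
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / guesses_per_second / seconds_per_year
print(f"{years:.1e} years to try every key")          # ~3.7e57 years
print(f"{years / 1.38e10:.1e} x the age of the universe")
```

Whatever guess rate you plug in, the answer dwarfs the roughly 1.38 x 10^10 years the universe has existed, which is why nobody tries to brute-force the key.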

Getting the decryption keys is the only way to go.

Thus, one way of thinking about what the UK Government asked Apple to do is restore some kind of status quo ante. To get us back to a world where our Attorney General or Lord Chief Justice is able to say a case can proceed. To say it has not been frustrated by a decision taken by a Board of Directors in Cupertino.

A world of little trust

We live in a world where there is vanishingly little trust. In the past, companies and Government agencies have both disgraced themselves in respect of our privacy. Think Snowden as but one example. Does that mean we must now put all our eggs in the companies’ basket?

Truly, why would anyone in the UK trust Apple more than they would our own courts and judicial processes? Things have changed since Snowden.

In the UK, under the Investigatory Powers Act 2016 we have a robust system which must be followed before the police or security services can obtain a warrant requiring the disclosure of encrypted data. It involves a “double lock”. A Secretary of State must first sign an order which then has to be countersigned by a Judicial Commissioner drawn from the Investigatory Powers Commissioner’s Office. The rules they have to observe are crystal clear.

The warrant must be necessary and proportionate in respect of information which is also necessary for the investigation and necessary because other reasonable means of obtaining it have been exhausted.

I am not entirely clear how things might work in a civil case although I believe the processes can ultimately lead to the same end.

Are these processes and safeguards sufficiently well understood by the general public, politicians, journalists, and civil society leaders? Maybe not, in which case we need to address that.

All reasonable people who think about these things should be able to feel happy about the arrangements for obtaining a warrant to require the production of encrypted data.

“All reasonable people” here means Apple, the cops, you and me. It emphatically does not mean only Apple, the Electronic Frontier Foundation and similar.

Apple have said that if they were to agree to create the kind of “back door” the UK Government asked for, “bad actors” would inevitably find their way to it and through it, and that would ruin it for everyone.

Really?

Reassuringly, these putative bad actors won’t have gained admission via brute force (see above), so how will they have done it? Exactly. Bribery? Corruption? Human error?

Aren’t those design challenges as much as they are anything else?

Isn’t Apple simply indulging in self-serving scaremongering? Again.

Are there no examples from elsewhere in the forest of secure processes that can be adapted for this environment?

Otherwise look at the madness we have created. We have allowed a system to be established which will do substantial harm to millions of people but we can’t address it for fear that, if we do, we will end up with a system that will do substantial harm to millions of people.

You cannot balance anything with zero

Traditionally, in a dispute or investigation where there is an apparent conflict between different parties’ rights, including the right to privacy, the courts (or perhaps, in this instance, the Commissioner), guided by the legal principles outlined above, would be expected to strike a reasonable and proportionate “balance”. In every jurisdiction I know the same broad principle holds true.

But you cannot balance anything with zero. Zero always wins. ADP is zero.

Everything is not quite as it seems

Referring back to my earlier comments, I said ADP and other forms of strong encryption create an impenetrable wall. What I ought to have said really is “impenetrable for now”, meaning until quantum machines upend everything.

Nobody knows when this moment will arrive. It could be next year or the year after, and when it does arrive, if it is in the hands of… let’s call them “the wrong people”, they might not announce the fact. We could be caught unawares for quite some time.

However, once they are here quantum machines will make readable, possibly in the twinkling of an eye, every message we might ever have sent under ADP or any of the other strong encryption regimes currently in use on any messaging platform.

Check out “Harvest Now, Decrypt Later” (HNDL).

Few people who support Apple’s stance like to talk about HNDL because it does rather undermine the case they make about the importance and solidity of the protection ADP and other forms of strong encryption provide today. In other words, for short-term tactical advantage, to try to win an argument, they are propagating a deceit. A potentially dangerous one.

I acknowledge that such a thing as post-quantum cryptography is emerging and it may be effective. If that happens and it can be made to work at scale, anything that is encrypted using post-quantum cryptography might well become and remain uncrackable. But that does not apply to the stuff you have already sent and stored, or stuff you will send and store later today or tomorrow, or indeed at any time in the future until your messaging platform decides to deploy post-quantum cryptography to its customers’ activities.

Remember that. It might save you a great deal of embarrassment, loss or difficulty somewhere down the line. I predict pen and ink and face-to-face conversations will make a big comeback.

Diluting its brand?

It is impossible to believe Apple’s stance in relation to the UK Government’s request is linked in any way whatsoever to any kind of high-minded determination to fight human rights breaches, oppression or any other nasty thing come to that.

Apple sells privacy. Privacy is their product (coupled, admittedly, to beautifully designed and easy-to-use devices).

For Apple this whole thing is about money. Apple thinks complying with the UK Government’s decision would compromise or dilute their strong privacy brand, maybe making it possible for another company or entity to outflank them on that front, over time eroding their market position.

To put that slightly differently, in their dollar-driven calculations, Apple has concluded complying with the UK Government’s request would lose them more money than they would gain or retain by refusing. Since they will not wilfully break the law, the only thing they can do is what they have done: pull the service.

A gigantic asymmetry

The UK may have been the first liberal democracy to try to face down, in an unavoidable way, the challenges posed by the emergence of strong encryption and the way it is being incorporated into mass messaging systems, but other jurisdictions are going to finish up in the same place sooner or later.

Civil society has to step up and engage.

The problem is, on the one hand, there are businesses with a vested financial interest in the outcome of this debate. They have a bottomless pit of money, an endless supply of lawyers, an ability to get their messages out rapidly, capturing the headlines and shaping the political agenda, and huge experience in doing so, linked to their great talent for lobbying and media management, plus a global network of activists who have been working together for years and who, at least in this respect, wholly share their views.

Then, on the other hand, there is us: NGOs relatively new to the space, with barely two brass farthings between us, not accustomed to working together internationally at the same speed or intensity and, at least in the UK’s case, not always aided by a typically slow-moving Government whose ability to communicate about these issues is severely constrained by a range of considerations, not all of which will necessarily always be connected to the immediate matter in hand.

The role of philanthropy

If ever there was a case for philanthropy to step in and help level the playing field a little bit it is here. We will never be able to get a fully level playing field but it doesn’t need to be fully level when your case is as strong as ours. What I am certain of, however, is children’s organisations alone cannot win this argument. We need to help construct and be part of a much broader civil society coalition.

Apple and China – a side note

Apple manufactures 95% of all its hardware in China. In 2024 16% of its revenues (US$15 billion) were derived from customers living in China. Since 2018, for its Chinese customers, Apple has kept and still keeps the decryption keys utilised by their products on servers based in China. Chinese law requires it, so they comply. If the Chinese authorities want access to customer data they only have to go through the Chinese courts. ADP is not available in China.

An ironic twist?

Wouldn’t it be ironic if ADP and similar forms of strong encryption were only available in certain liberal democracies and not available at all in countries governed by “different kinds” of regimes?

What’s that all about?

Answers please on a postcard to the usual address.

PS As a reminder of one of the reasons why this matters to me

Apple and child sexual abuse material

In 2023, 36 million reports of child sexual abuse material were made to the National Center for Missing and Exploited Children in the USA. Around 250,000 of these came from the public; in other words, the vast majority came directly from online platforms. 17 million came from Facebook alone. 267 (not a typo) came from Apple. Why? Because Apple don’t and won’t look for child sexual abuse material, and this despite the fact that, in August 2021, Eric Friedman, then and now a senior executive at Apple, declared his company to be “the greatest platform for distributing child sexual abuse material”.

Recognising Apple had a responsibility to do something about this, the company came up with an elegant solution which would have enabled it to do what so many other platforms have been doing, some since at least 2009, namely identify known child sexual abuse images and then delete them.

Apple’s solution was a form of “client-side scanning”.

In essence it meant child sexual abuse material could be detected and addressed on devices before it entered the encrypted tunnel. Encryption was therefore not touched or compromised in any way.
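
To make the general pattern concrete, here is a deliberately crude sketch of “check on the device, then encrypt”: each file’s digest is compared against a list of known-image digests before it enters the encrypted pipeline. Apple’s actual proposal used perceptual hashing (NeuralHash) and privacy-preserving matching rather than plain SHA-256 lookups, so treat this purely as an illustration of where the check sits, not of how Apple built it.

```python
# Crude illustration of client-side scanning: compare a file's digest
# against known-bad digests BEFORE encryption/upload. Real systems use
# perceptual hashes and privacy-preserving matching; this is a sketch only.
import hashlib

# Hypothetical database of digests of known, verified abuse images.
KNOWN_BAD_DIGESTS = {
    hashlib.sha256(b"stand-in for a known image").hexdigest(),  # placeholder
}

def scan_then_upload(file_bytes: bytes) -> str:
    if hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_DIGESTS:
        return "match: flag for review, do not upload"
    # Only now does the file enter the encrypted pipeline; the scan has
    # happened on the device and the encryption itself is untouched.
    return "no match: encrypt and upload as normal"

print(scan_then_upload(b"stand-in for a known image"))  # match
print(scan_then_upload(b"holiday photo"))               # no match
```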

Their proposal was praised by some highly respected techies but was nevertheless attacked by others. Nothing new there.

At first Apple vigorously defended its solution against criticism. Craig Federighi lauded its “multiple levels of auditability”, even saying it was an “advancement of the state of the art in privacy… enabling a more private world”.

Then they caved. Not something a company like Apple does often on anything. But here we are only talking about child victims of sexual abuse whose pain and humiliation they are helping put on show.

An absolute disgrace. In fact a unique disgrace. I can think of no precedent for a company admitting it has a problem of any magnitude then deliberately doing nothing significant to abate it even though it had shown it could. Such hubris. Several senior Apple staff quit over this unconscionable and arrogant volte-face.

And just think about that for a moment. Apple announced it was “the greatest platform for distributing child sexual abuse material”. Then it announced that, unlike all of its major competitors, it intended to do nothing to try to halt the traffic. What kind of signal is that sending out to child sex abusers and other crooks?

“Apple is your platform of choice. Come to us.”

With support from the Heat Initiative Apple is now being sued in a class action on behalf of known victims of child sexual abuse where the images have been seen, classified and included in the official US database.

Over 1,500 platforms are known to be using that database. Apple isn’t.

Victims cannot understand why any company that could act to end the distribution or storage of child sexual abuse material doesn’t. Neither can I. I very much hope the court sees things the same way.