You might have seen the news today that Apple is announcing a raft of improvements to Macs and iOS devices aimed at improving security and privacy. These include FIDO support, improvements to iMessage key verification, and a much anticipated announcement that the company is abandoning their plans for (involuntary) photo scanning.
While every single one of these is exciting, one announcement stands above the others. This is Apple’s decision to roll out (opt-in) end-to-end encryption for iCloud backups. While this is only one partial step in the right direction, it’s still a huge and decisive step — one that I think will substantially raise the bar for cloud security across the whole industry.
If you’re looking for precise details on all of these features, see Apple’s description here or their platform security guide. Others will no doubt have the time to do deep-dive explanations on each one. (I was given a short presentation by Apple today, along with the opportunity to ask a bunch of questions, which their representative answered thoughtfully. But this is no substitute for a detailed look at the technical specs.)
In the rest of this post I want to zero in on end-to-end encrypted iCloud backup, and why I think this announcement is such a big deal.
Ten years of privacy collapse
If you’re the typical smartphone or tablet user, your devices have become the primary repository for your private papers, photos and communications. Imagine some document that your grandparents would have kept on a shelf or inside a locked drawer in their home. Today the equivalent document probably resides on one of your devices. This data is the most personal stuff in a human life: your private family photos, your mail, your financial records, even a history of the books you read and which pages you found meaningful. Of course, it also includes new types of information that are vastly more valuable and invasive than anything your grandparents could have imagined.
But this is only half the story.
If you’re the typical user, you don’t only keep this data in your device. An exact duplicate exists in a data center hundreds or thousands of miles away from you. Every time you snap a photo, each night while you sleep, this doppelganger is scrupulously synchronized through the tireless efforts of cloud backup software — usually the default software built into your device’s operating system.
It goes without saying that you, dear reader, might not be the typical user. You might be one of the vanishingly small fraction of users who change their devices’ default backup policies. You might be part of the even smaller fraction who back up their phone to a local computer. If you’re one of those people, congratulations: you’ve made good choices. But I would beg you to get over it. You don’t really matter.
The typical user does not make the same choices as you did.
The typical user activates cloud backup because their device urges them to do so at setup time, and it’s just so easy to go along. The typical user sends their most personal photos to Apple or Google, not because they’ve thought deeply about the implications, but because they can’t afford to lose a decade of family memories when their phone or laptop breaks down. The typical user can’t afford to shell out an extra $300 for more storage capacity, so they buy a base-model phone and rely on cloud sync to offload the bulk of their photo library (for a small monthly fee), so their devices can still do useful things.
And because the typical user does these things, our society does these things.
I struggle to find an analogy for how crazy this is. Imagine your country held a national referendum to decide whether most citizens should be compelled to photocopy their private photos and store them in a centralized library — one available to police and motivated criminals alike. Would anyone vote in favor of that, even if there was technically an annoying way to opt out? As ridiculous as this sounds, it’s effectively what we’ve done to ourselves over the past ten years. But of course we didn’t choose any of it. A handful of Silicon Valley executives made the choice for us, in pursuit of adoption metrics and a “magical” user experience.
What’s done is done, and those repositories now exist.
And that should scare you. It terrifies me, because these data repositories are not just a risk to individual privacy; they’re effectively a surveillance super-weapon. However much damage we’ve done to our privacy with search engines and cellphone location data, the private content of our papers is the final frontier in the battle for our privacy. And in less than a decade, we’ve already lost the war.
Apple’s (losing) battle to encrypt your backups
To give credit where it’s due, I think the engineers at Apple and Google were the first to realize what they’d unleashed — maybe before many of us on the outside were even aware of the scale of the issue.
In 2016, Apple began quietly deploying new infrastructure designed to secure user encryption keys in an “end-to-end” fashion: keys would be accessible only to the user who generated them. The system Apple deployed is called the “iCloud Key Vault,” and it consists of hundreds of specialized devices called Hardware Security Modules (HSMs) that live in the company’s data centers. The devices store user encryption keys. Those keys are in turn gated by a user-chosen passcode, which is typically the same passcode you use daily to unlock your device. A user who knows their passcode can ask for a copy of their key. An attacker who can’t guess that passcode (within a small number of attempts) cannot. Most critically: Apple counts themselves in the category of potential attackers. This means they went to some trouble to ensure that even they cannot (be forced to) bypass this system.
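To make the design concrete, here’s a toy sketch of the guess-limited escrow idea. This is hypothetical Python, not Apple’s actual HSM firmware or API; the ten-attempt limit and all of the names below are assumptions chosen for illustration:

```python
import hmac
import hashlib
import os

class ToyKeyVault:
    """Stores a user's backup key and releases it only to someone who
    can prove knowledge of the passcode, within a limited guess budget."""

    MAX_ATTEMPTS = 10  # assumed limit, chosen for illustration

    def __init__(self, passcode: str, user_key: bytes):
        self.salt = os.urandom(16)
        # Derive a verifier from the passcode; the vault never stores
        # the passcode itself. (A real HSM would use a hardened KDF.)
        self.verifier = hashlib.pbkdf2_hmac(
            "sha256", passcode.encode(), self.salt, 600_000)
        self.user_key = user_key
        self.attempts_left = self.MAX_ATTEMPTS

    def recover_key(self, passcode_guess: str) -> bytes:
        if self.attempts_left == 0:
            raise PermissionError("vault permanently locked")
        self.attempts_left -= 1
        guess = hashlib.pbkdf2_hmac(
            "sha256", passcode_guess.encode(), self.salt, 600_000)
        if hmac.compare_digest(guess, self.verifier):
            self.attempts_left = self.MAX_ATTEMPTS  # reset on success
            return self.user_key
        raise PermissionError(
            f"wrong passcode; {self.attempts_left} attempts left")
```

The reason this logic runs inside tamper-resistant HSMs is that nobody, Apple included, should be able to dump the stored keys or reset the attempt counter from outside the device.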
When it comes to encrypted backup there is essentially one major problem: how to store keys. I’m not saying this is literally the only engineering issue; far from it. But once you’ve found a way for users to store their keys securely, every other element of the system can hang from that.
The remaining problems matter, of course. There are reasonable concerns that some users will forget their device passcode and thus lose access to their backups; you need a good recovery strategy for when that happens. But even if solving these problems took some time and experimentation, it should only have been a matter of time until Apple activated end-to-end encryption for at least a portion of its user base. Once broadly deployed, this feature would have sent a clear signal to motivated attackers that cloud backup repositories were no longer a good place to invest resources.
But this is not quite what happened.
What actually happened is unclear, and Apple refuses to talk about it. But the outlines of what we do know tell a story that is somewhere between “meh” and “ugh”. Specifically, reporting from Reuters indicates that Apple came under pressure from government agencies: these agencies wanted Apple to maintain the availability of cleartext backup data, since this is now an important law enforcement priority. Whatever the internal details, the result was not so much a retreat as a rout:
Once the decision was made, the 10 or so experts on the Apple encryption project — variously code-named Plesio and KeyDrop — were told to stop working on the effort, three people familiar with the matter told Reuters.
(For what it’s worth, some have offered alternative explanations. John Gruber wrote a post that purports to push back on this reporting, but on closer inspection the dispute in that post is mainly about whether the FBI killed the plan, or whether fear of angering the FBI caused Apple to kill its own plan. I file the distinction under “🤷”.)
This setback did not completely close the door on end-to-end encrypted backups, of course. Despite Apple’s reticence, other companies — notably Google and Meta’s WhatsApp — have continued to make progress by deploying end-to-end encrypted systems very similar to Apple’s. At present, the coverage is partial: Google’s system may not encrypt everything, and WhatsApp’s backups are opt-in.
A road not taken
As of July 2021, the near-term deployment of end-to-end encrypted backups seemed inevitable to me. Sooner or later, firms would launch the technology and demonstrate that it worked, at least for some users. This would effectively turn us back toward the privacy world of 2010 and give users a clear distinction between private and non-private data. There was another future in which that might not happen, but I thought it unlikely.
One thing I did not foresee was a third possible future: one where firms like Apple rebuilt their encryption so that we could have end-to-end encryption — and governments could have their surveillance too.
In August of last year, Apple proposed such a vision. In a sweeping announcement, the company unveiled a plan to deploy “client-side image scanning” to 1.75 billion iCloud users. The system, billed as part of the company’s “Child Safety” initiative, used perceptual hashing and cryptography to scan users’ private photo libraries for the presence of known child sexual abuse media, or CSAM. This would allow Apple to rapidly identify non-compliant users and, subject to an internal review process, report violators to the police.
Apple’s proposal was not the first system designed to scan cloud-stored photos for such imagery. But it was the first system capable of working harmoniously with end-to-end encrypted backups. This is due to the specific way Apple proposed to conduct the scanning.
In previous content-scanning systems, user files were scanned on a server. This required that content be uploaded in plaintext (i.e., unencrypted form) so that the server could process it. Apple’s system, on the other hand, performed the necessary hashing and scanning on the user’s own device — before the data was uploaded. The technical implication of this design is critical: Apple’s scanning would continue to operate even if Apple eventually flipped the switch to activate end-to-end encryption for your private photos (as they did today).
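To see why the ordering matters, here’s a deliberately simplified sketch of a client-side scanning pipeline. This is hypothetical Python, not Apple’s actual protocol; the hash function and the database are stand-ins. The crucial property is that the match check runs on the device, before encryption:

```python
import hashlib

# Stand-in for a database of hashes of known illegal images.
KNOWN_BAD_HASHES: set[str] = set()

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real system uses a perceptual hash (Apple's was
    # a neural hash) that survives resizing and re-encoding; SHA-256
    # matches only bit-identical files.
    return hashlib.sha256(image_bytes).hexdigest()

def upload_photo(image_bytes: bytes, encrypt, upload) -> None:
    # The match check happens here, on the device, while the photo is
    # still plaintext. A purely server-side scanner would see only
    # ciphertext once end-to-end encryption is switched on.
    matched = perceptual_hash(image_bytes) in KNOWN_BAD_HASHES
    ciphertext = encrypt(image_bytes)  # encrypted under a user-held key
    upload(ciphertext, matched)
```

Apple’s real design was considerably more elaborate: it wrapped match results in cryptographic “safety vouchers” so that neither the device nor Apple would learn about individual matches below a reporting threshold. But the before-encryption ordering above is the essential point.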
And let’s please not be dense about this. While Apple’s system did not yet encrypt cloud-stored photos last year (that’s the new announcement Apple made today), encryption plans were the only conceivable reason one would deploy a client-side scanning system. There was no other reasonable explanation.
Users have a difficult time understanding even simple concepts around encryption. And that’s not their fault! Firms constantly say things like “your files are encrypted” even when they store the decryption keys right next to the encrypted data. Now try explaining the difference between “encryption” and “end-to-end encryption” along with forty-six variants of “end-to-end encryption that has some sort of giant asterisk in which certain types of files can be decrypted by your cloud provider and reported to the police.” Who even knows what privacy guarantees those systems would offer you — and how they would evolve. To me it felt like the adoption of these systems would signal the end of a meaningful concept of user-controlled data.
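If it helps, the core distinction fits in a few lines of hypothetical Python (using the third-party `cryptography` library). In the weak sense of “encrypted,” the provider holds the key right next to your data; in the end-to-end sense, only your devices ever hold it:

```python
from cryptography.fernet import Fernet

# "Encrypted" in the weak sense: the provider generates and keeps the key.
provider_key = Fernet.generate_key()   # sits in the provider's own vault
stored_blob = Fernet(provider_key).encrypt(b"family photo")
# The provider (or anyone who can compel it) can decrypt at will:
assert Fernet(provider_key).decrypt(stored_blob) == b"family photo"

# "End-to-end encrypted": the key is created and held on your devices only.
user_key = Fernet.generate_key()       # in practice, derived on-device
e2e_blob = Fernet(user_key).encrypt(b"family photo")
# The provider stores e2e_blob but holds nothing that can open it.
```

Both blobs are “encrypted.” Only one of them is private from the provider.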
Yet this came very close to happening. It could still happen.
It didn’t, though. And to this day I’m not entirely sure why. Security and privacy researchers told the company exactly how dangerous the idea was. Apple employees reacted negatively to the proposal. But much to my surprise, the real clincher was the public’s negative reaction: as much as people hate CSAM, they really seemed to hate the idea that their private data might be subject to police surveillance. The company delayed the feature and eventually abandoned it; today’s announcement is the end of that saga.
I would love to be a fly on the wall to understand how this went down inside of Apple. I doubt I’ll ever learn what happened. I’m just glad that this is where we wound up.
So what’s next?
I wish I could tell you that Apple’s announcement today is the end of the story, and now all of your private data will be magically protected — from hackers, abusive partners and the government. But that is not how things work.
Apple’s move today is an important step. It hardens certain walls: very important, very powerful walls. It will send a clear message to certain attackers that deeper investment in cloud attacks is probably not worthwhile. Maybe. But there is still a lot of work to do.
For one thing, Apple’s proposal (which rolls out in a future release) is opt-in: users will have to activate “Advanced Data Protection” for their iCloud account. With luck, Apple will learn from this initial rollout and find ways to encourage more users to adopt the feature. But that’s a ways off.
And even if Apple does eventually move most of its users onto end-to-end encrypted cloud backups, there will always be other ways to compromise someone’s data: steal their phone, guess their password, jailbreak a partner’s phone, deploy sophisticated targeted malware. And of course a huge fraction of the world still lives under repressive governments that don’t need to bother breaking into cloud providers.
But none of these attacks will be quite as easy as attacks on non-E2E cloud backup, and none will offer quite the same level of convenience and scale. Today’s announcement makes me optimistic that we are heading — in fits and starts — toward a world where your personal data will belong to you.