PINs for Cryptography with Hardware Secure Elements

I’m a big fan of technologies that enable otherwise impossible security properties and user experiences, like cryptography often can. One such technology is hardware secure elements.

Here’s a thing you can’t do with cryptography: encrypt data securely with a low-entropy secret, like a PIN. If a high-speed brute-force attack is possible, you need a high-entropy passphrase or key. We do have password stretching algorithms like scrypt and Argon2, which make brute-force attacks slower and force them to target one victim at a time (through salting), but if your secret is small enough—like a 6-digit PIN—you’re just not going to space today. Even if you’re ok with decryption on the user’s device taking five seconds, and assuming conservatively that an attacker’s system is only ten times more powerful than the user’s device, a 6-digit PIN will fall in less than a week. But PINs provide such a better UX than long, high-entropy passwords (even if we pretended humans could generate and remember the latter)!
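
To make the arithmetic concrete, here is a back-of-the-envelope sketch of that estimate under the same assumptions (one million possible 6-digit PINs, five seconds per guess on the user’s device, an attacker only ten times faster):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const pinSpace = 1_000_000      // every possible 6-digit PIN
	userGuess := 5 * time.Second    // stretched KDF on the user's device
	attackerGuess := userGuess / 10 // attacker assumed 10x more powerful

	worstCase := time.Duration(pinSpace) * attackerGuess
	fmt.Printf("exhausting the PIN space takes %.1f days\n", worstCase.Hours()/24)
	// Prints ~5.8 days: "less than a week", and on average it falls in half that.
}
```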

This is a conundrum you can solve with hardware secure elements. You put a high-entropy, randomly generated cryptographic key inside some secure, tamper-resistant computer, and then you program it to only ever give out the key if presented with the right PIN. Since you have arbitrary software running on that computer, you can implement things like counters that cap incorrect retries, exponential retry cooldowns, self-destruction… You’re not bound by the capabilities of cryptography and information theory anymore. This is easier said than done (the key might be extracted through side channels, the secure computer might get hacked, and so on), but it’s also not at all a new idea. This is what smart cards and Hardware Security Modules have always been about.[1]
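
As a toy illustration of that policy (not real secure element firmware, and with made-up types and limits), the logic inside the element boils down to something like this:

```go
// Toy model of the policy a secure element enforces around a stored key.
// Purely illustrative: real firmware also has to worry about side channels,
// power loss between writes, secure provisioning, and much more.
package toyelement

import (
	"crypto/subtle"
	"errors"
)

const maxRetries = 6 // e.g. a YubiKey-style retry limit

var (
	ErrLocked   = errors.New("element permanently locked")
	ErrWrongPIN = errors.New("wrong PIN")
)

// Element holds a high-entropy key that was generated inside the device
// and can never be read out directly.
type Element struct {
	pin     []byte // set at provisioning time
	key     []byte // high-entropy key, only released via ReleaseKey
	retries int    // remaining incorrect attempts
}

// ReleaseKey hands out the key only for the right PIN. Brute force is
// bounded by the retry policy, not by the entropy of the PIN.
func (e *Element) ReleaseKey(pin []byte) ([]byte, error) {
	if e.retries <= 0 || e.key == nil {
		return nil, ErrLocked
	}
	if subtle.ConstantTimeCompare(pin, e.pin) != 1 {
		e.retries--
		if e.retries == 0 {
			e.key = nil // self-destruct: the wrapped data is gone for good
		}
		return nil, ErrWrongPIN
	}
	e.retries = maxRetries // reset the counter on success
	return e.key, nil
}
```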

This is, for example, how I use my YubiKey, a smart card in a USB form factor: I encrypt my passwords with passage and age-plugin-yubikey, which stores the cryptographic key PIN-protected on the YubiKey. My PIN would be laughable as an encryption secret, but you only get six tries to figure it out before the YubiKey permanently locks.[2]

Although smart cards have been around forever, the technology was heavily popularized by the introduction of secure elements in mobile devices. Every iPhone has a hardware secure element called the Secure Enclave that stores the encryption key for the internal flash storage, and only produces it when provided with the PIN. It’s why locked iPhones are so hard to unlock even if the PIN is just six digits. You can’t just pull out the storage chip and brute-force the PIN.[3] This technology took some time to get to laptops, because there, users kinda do expect to be able to pull out the drive and recover data from it, and because secure passphrases are easier to type on a physical keyboard, but by now the latest versions of BitLocker and FileVault also work like this.

The security UX step forward is massive: with just an easily remembered, easy-to-type PIN, users have cryptographically robust device encryption that makes a misplaced or stolen phone or laptop a security non-event! We were nowhere close to that 10 years ago.

Recently, a few services successfully ported this technology and UX progress beyond local device encryption and to cloud storage. Here’s the problem statement. Imagine you have some user data that you’d like to back up or sync to the cloud, but which is too sensitive to keep in plaintext on your servers. Arguably, all user data you don’t need to process server-side qualifies. Some easy examples are end-to-end encrypted message backups, or password manager vaults. A simple solution is to encrypt it client-side with a password only known to the user. However, the security UX of that is poor: if the user forgets the password, they are locked out; if they pick a simple one, it can be brute-forced. Do you see where this is going?

In 2016, Apple got a bunch of HSMs and programmed them to only decrypt iCloud Keychain data when provided with the PIN of a linked iOS device. This way users don’t need to remember any high-entropy secret, but their data can’t be brute-forced server-side. They have since expanded this protection to other classes of data, including finally iCloud backups for users who opt in to “Advanced Data Protection”, closing the final major end-to-end encryption loophole. (There are a lot of details in the Platform Security whitepaper.)

WhatsApp recently deployed a similar system for backups, which they aptly call end-to-end encrypted backups. The user is given the option of writing down a randomly generated high-entropy key or of providing a PIN that will wrap that key in an HSM-based system. The whitepaper is a good read.
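
A minimal sketch of the client-side half of this pattern, assuming a hypothetical PIN-gated key store standing in for the HSM fleet (this is not WhatsApp’s or Apple’s actual protocol, just the general shape: the data is encrypted under a fresh high-entropy key, and only that key is escrowed behind the PIN):

```go
// Sketch of client-side envelope encryption with PIN-gated key escrow.
// PINGatedStore is a hypothetical stand-in for the server-side HSM system;
// nothing here reflects a specific vendor's protocol.
package backupsketch

import (
	"crypto/rand"

	"golang.org/x/crypto/chacha20poly1305"
)

// PINGatedStore stores the key and only returns it to someone who presents
// the right PIN, with retry limits enforced by secure hardware.
type PINGatedStore interface {
	StoreKey(key, pin []byte) error
}

// EncryptBackup encrypts plaintext under a freshly generated 256-bit key,
// escrows that key behind the PIN, and returns the nonce-prefixed
// ciphertext plus the raw key (which the user may also write down as a
// recovery code instead of, or in addition to, the PIN).
func EncryptBackup(store PINGatedStore, plaintext, pin []byte) (ciphertext, key []byte, err error) {
	key = make([]byte, chacha20poly1305.KeySize)
	if _, err := rand.Read(key); err != nil {
		return nil, nil, err
	}

	aead, err := chacha20poly1305.NewX(key)
	if err != nil {
		return nil, nil, err
	}
	nonce := make([]byte, aead.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, nil, err
	}
	ciphertext = aead.Seal(nonce, nonce, plaintext, nil)

	// The low-entropy PIN never touches the data directly: it only gates
	// access to the high-entropy key inside the tamper-resistant store.
	if err := store.StoreKey(key, pin); err != nil {
		return nil, nil, err
	}
	return ciphertext, key, nil
}
```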

Signal also has a similar system, based on CPU-integrated enclaves[4], for storing account metadata backups, and Celo has one based on some fancy cryptography and a consensus of nodes. The idea is always the same: introducing policies (like retry timers and counters) that make low-entropy PINs suitable for strong encryption by hosting secure hardware for the user.

An interesting consequence, which goes to show how fundamental this UX progress is, is that we need to amend Matthew Green’s excellent “puddle test”. The original goes a bit like this (paraphrased):

If you can recover your data after you drop your phone in a mud puddle without having to remember any high-entropy secret, then the data is not encrypted.

With these systems, we can scratch “high-entropy”: if the recovery process requires even just a PIN, it’s possible that it actually makes the data unavailable to the service provider!

I hope to see this design applied more broadly. There is no reason to relegate it to end-to-end encrypted message backups and password vaults! Anything that doesn’t need to be in plaintext on a server should be protected like this. Along with passkeys solving authentication, I think we have the tools to move on from passwords and passphrases forever.

If you got this far, you might want to follow me on Bluesky (now open for registration!) at @filippo.abyssdomain.expert or on Mastodon at @[email protected].

The picture

One cold evening I sort of stumbled into the Botanical Garden of Rome while it was all lit up for Plots of Light. It was pretty magical. If you stepped just off the allowed path, there was a bench in the darkness from where you could see St. Peter's.

On the left, a bamboo forest is lit in blue and red lights. On the right, the dark outline of some trees frames the city skyline lit in contrasting soft yellow lights. In the middle, the Dome of St. Peter's rises above the horizon.

My awesome clients—Sigsum, Latacora, Interchain, Smallstep, Ava Labs, Teleport, and Tailscale—are funding all my work for the community and through our retainer contracts they get face time and unlimited access to advice on Go and cryptography.

Here are a few words from some of them!

Latacora — Latacora bootstraps security practices for startups. Instead of wasting your time trying to hire a security person who is good at everything from Android security to AWS IAM strategies to SOC2 and apparently has the time to answer all your security questionnaires plus never gets sick or takes a day off, you hire us. We provide a crack team of professionals prepped with processes and power tools, coupling individual security capabilities with strategic program management and tactical project management.

Teleport — For the past five years, attacks and compromises have been shifting from traditional malware and security breaches to identifying and compromising valid user accounts and credentials with social engineering, credential theft, or phishing. Teleport Identity Governance & Security is designed to eliminate weak access patterns through access monitoring, minimize attack surface with access requests, and purge unused permissions via mandatory access reviews.

Ava Labs — We at Ava Labs, maintainer of AvalancheGo (the most widely used client for interacting with the Avalanche Network), believe the sustainable maintenance and development of open source cryptographic protocols is critical to the broad adoption of blockchain technology. We are proud to support this necessary and impactful work through our ongoing sponsorship of Filippo and his team.


  1. Hardware secure elements come in a million form factors, especially as the technology has seen wider adoption in the past few years. A taxonomy would probably be a whole project of its own, but an interesting axis is how programmable they are. Smart cards can in theory run arbitrary “applets” but in practice they can only be provisioned by the factory or by the manufacturer, to prevent key extraction, so you are limited to protocols like PIV, OpenPGP card, and FIDO2. Sometimes you can leverage a smaller, limited hardware module into a bigger more flexible one: for example many computers have a TPM, which can be carefully combined with Secure Boot/Measured Boot to make it so a cryptographic key will only be available if the machine is running the right code; that code can then effectively implement arbitrary policies for how to release access to the key. There are products that provide that as an all-in-one stack in a convenient form factor, like the USB Armory or the Tillitis key. Another critical distinction is whether they have an independent UI, so that they can take input and show output to the user without going through an untrusted machine. This is important for example for cryptocurrency wallets that can show where you are sending your money before you authorize a transaction. Finally, they vary in how tamper-resistant they are, too: Intel tried to put an arbitrarily programmable secure module in every CPU, SGX, and that did not work out. A useful concept in exploring the space is that of trusted computing base. [Edited on 2024-02-15 following feedback to remove an incorrect claim on the security track record of the Apple Secure Enclave, which I was confusing with the Secure Element. My apologies for the mistake.] ↩︎

  2. Low-entropy PINs are not the only thing you can get out of a hardware module. The other reason I use a YubiKey is recoverability: even if you hack my computer and keylog the PIN, you can’t exfiltrate the key that encrypts my passwords. You have to decrypt them one by one with the YubiKey, and each decryption requires a physical touch, so you are heavily rate-limited in how much damage you can do, hopefully giving me time to detect and remediate the compromise. ↩︎

  3. There’s a bit of nuance involved here, because there are various levels of “locked” an iPhone can be in: for example, after a reboot the key is definitely only available to the Secure Enclave, while after the first unlock it probably lingers somewhere in main memory. There’s also more than one key, since things like Apple Pay credentials are more locked down than main storage, and iPhones also unlock with biometrics, though I don’t remember how tightly that is tied to the Secure Enclave. This is really just an implementation detail, though, if you look at the whole iPhone as a large hardware secure element that only unlocks with a PIN or face. ↩︎

  4. The original Secure Value Recovery relied exclusively on Intel’s SGX, which is arguably an HSM in an integrated form factor, but in practice has a poorer security track record than discrete systems. A new version of Signal's system uses MPC to distribute computation across multiple CPU enclaves, presumably including AMD's SEV and/or ARM's TrustZone. [Edited on 2024-02-15 following feedback to add details of SVR2.] ↩︎


Source: https://words.filippo.io/dispatches/secure-elements/