Apple, the FBI and crypto: what’s at stake?

My goal here is to attempt to describe the technical details of the situation.  I’ll discuss my personal position (I favor Apple) on my other blog, and add a link here.

Let me be clear at the outset.  I do not have any inside information on this.  What I know comes from public news reports.  I have been attempting to understand the issues based on that.  It is entirely possible that I have some of the details wrong.

Background

This is about an iPhone used by one of the San Bernardino terrorists.  The FBI wants access to that iPhone to help in its investigation.  It is entirely reasonable for the FBI to want access.

The iPhone itself is encrypted.  Recently, to protect the privacy of its users, Apple began providing full encryption on its iPhones.  The encryption key comes from the owner: it is derived from a passcode that the owner chooses.  When the phone leaves the factory, presumably Apple has full access.  But once the owner sets up the crypto with his own passcode, Apple no longer has access to that phone.  Or, more accurately, it no longer has full direct access.

In particular, Apple does not have a copy of the encryption key.
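To make that concrete, here is a minimal sketch in Python of the general idea: the key is derived from a passcode that only the owner knows.  This is purely illustrative; Apple's actual scheme also entangles a device-specific hardware key, and the function and parameter choices below are my own.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    """Derive an encryption key from the owner's passcode (illustration only)."""
    # PBKDF2 makes each derivation deliberately slow, so guessing is expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)

salt = os.urandom(16)            # stored on the device; not secret
key = derive_key("4951", salt)   # only someone who knows the passcode can re-derive this
print(key.hex())
```

Since the passcode never goes to Apple, Apple cannot re-derive the key either.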

Apparently, the only way to get in through the encryption is trial-and-error guessing of the passcode.  The FBI seems to be willing to do that.  But there's a difficulty.  The phone is designed so that, after too many failed attempts (i.e. attempts with the wrong passcode), it will self-destruct, in the sense that it will automatically erase all of its data.
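In rough terms, the behavior the FBI is up against looks like the toy model below.  This is my own sketch, not Apple's code; the limit of ten failed attempts matches the commonly reported iOS auto-erase setting, but the names and structure are made up for illustration.

```python
MAX_FAILED_ATTEMPTS = 10   # commonly reported iOS limit; hypothetical in this sketch

class PhoneModel:
    """Toy model of a phone that erases its data after too many bad guesses."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.erased = False

    def try_unlock(self, guess: str) -> bool:
        if self.erased:
            return False                       # nothing left to unlock
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_FAILED_ATTEMPTS:
            self.erased = True                 # "self-destruct": wipe all user data
        return False

phone = PhoneModel("4951")
for guess in (f"{n:04d}" for n in range(20)):  # brute force quickly hits the limit
    if phone.try_unlock(guess):
        break
print(phone.erased)   # True: the data is gone long before the right guess
```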

What Apple is refusing

The FBI wants Apple to turn off this self-destruct mechanism.  Apple could do that by installing new firmware on the phone, firmware without the self-destruct behavior.

There’s another protection mechanism on the phone.  It will not install new firmware unless that firmware carries a valid digital signature from Apple.  So Apple would have to sign the replacement firmware in order for it to do what the FBI wants.
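Here is a minimal sketch of that kind of check, again in Python (using the third-party `cryptography` package).  It is an illustration under my own assumptions: on a real iPhone the verification happens in the boot chain against Apple's public key built into the device, and the private key never leaves Apple.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Hypothetical vendor key pair.  On a real phone only the public half is
# present; the private half stays with the vendor and is used for signing.
vendor_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
vendor_public_key = vendor_private_key.public_key()

def sign_firmware(image: bytes) -> bytes:
    return vendor_private_key.sign(image, padding.PKCS1v15(), hashes.SHA256())

def install_firmware(image: bytes, signature: bytes) -> bool:
    """Install only if the signature verifies against the vendor's public key."""
    try:
        vendor_public_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
    except InvalidSignature:
        return False           # refuse unsigned or tampered firmware
    # ...flash the image here...
    return True

official = b"official firmware image"
print(install_firmware(official, sign_firmware(official)))              # True: accepted
print(install_firmware(b"modified firmware", sign_firmware(official)))  # False: rejected
```

The point of this design is that nobody, the FBI included, can get the phone to accept replacement firmware without a signature that only Apple can produce.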

So here’s what it amounts to.  The FBI wants Apple to design a hacker tool (malicious software/firmware) and to digitally sign it.  Designing the software isn’t the big issue.  Presumably the FBI could do that itself, though Apple is better equipped for the job.  The real issue is signing that replacement firmware.  Apple signs firmware with its private signing key, and it expects users to trust Apple to use that key wisely.  By using it to sign a hacker tool, Apple would be violating that user trust.

And that, apparently, is what Apple is refusing to do.

