Sunday, September 28, 2014

Perspective on smartphone security

Apple's iOS 8, the latest release of its iPhone/iPad operating system, will encrypt much of the device's contents if you use a passcode or password to access the device. This has sent law enforcement into a tizzy.
“Apple will become the phone of choice for the pedophile,” said John J. Escalante, chief of detectives for Chicago’s police department. “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”
Even F.B.I. director James Comey got into the act. From the New York Times article:
At a news conference on Thursday devoted largely to combating terror threats from the Islamic State, Mr. Comey said, “What concerns me about this is companies marketing something expressly to allow people to hold themselves beyond the law.”

He cited kidnapping cases, in which exploiting the contents of a seized phone could lead to finding a victim, and predicted there would be moments when parents would come to him “with tears in their eyes, look at me and say, ‘What do you mean you can’t’ ” decode the contents of a phone.

“The notion that someone would market a closet that could never be opened — even if it involves a case involving a child kidnapper and a court order — to me does not make any sense.”

Director Comey should stop watching 24 reruns. His tone-deaf and frankly idiotic remarks merely fuel the substantial mistrust of government that exists among even law-abiding and generally patriotic citizens like myself. (Note, though, that there's a compelling argument that Comey's bluster is just that, bluster, a theatrical performance to hide the fact that the N.S.A. actually would have no difficulty breaking Apple's encryption. I think that's at least as plausible as the other leading hypothesis for Comey's remarks, i.e., that he's an incompetent moron.)

Comey's argument rests on the assumption that it's not merely normal, but proper, for law enforcement to be able to access the data in your personal computing device with no impediment other than gaining physical possession (and a warrant, at least in some jurisdictions).

That assumption is wrong.

I repeat: that assumption is wrong.

Moreover, the real reason for the encryption has nothing — nothing — to do with thwarting law enforcement. I'll get to that shortly. First, though, let's think through the principles here, rather than getting caught up in the technology.

Consider a different law enforcement need: access to your home. If law enforcement needs to get inside your home, officers or agents obtain a search warrant, present it to you and you reluctantly permit them entry. If you refuse, they can legally break the door down.

Your personal computing device must be subject to the same protections. It is no less personal a domain than your home, even if it is as easily taken from you as your wallet. (Incidentally, rifling through your wallet shouldn't automatically be legal for police, either. I don't know what the current law is on that.)

What ticks off law enforcement is that there is currently no widely available battering ram for the average personal computing device. (On the other hand, many people are dumb enough not to protect their device with any kind of passcode, which is the equivalent of leaving the front door unlocked.) For ease of access, law enforcement has relied on the indifference of device and software manufacturers, who have not made widespread encryption the default behavior on their devices. Encryption has been available for some time, but it has been used only in limited contexts, such as protecting stored passwords. Encrypting your own data, especially on iOS devices, has not been terribly easy. With iOS 8, it will happen with minimal effort.
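To see why tying encryption to the passcode matters, here is a minimal sketch of the general idea: stretching a short passcode into an encryption key with a slow key-derivation function. This is an illustration only, not Apple's actual scheme (Apple also entangles the passcode with a key embedded in the device's hardware, so the derivation can't even be attempted off the device); the function names and parameters are my own.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a short passcode into a 256-bit key using PBKDF2-HMAC-SHA256.

    The iteration count makes each guess expensive, slowing brute-force
    attacks on short passcodes.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)          # random salt, stored alongside the encrypted data
key = derive_key("1234", salt)
# Without the passcode, the key (and everything encrypted under it)
# cannot be reconstructed except by guessing passcodes one at a time.
```

The salt prevents precomputed-table attacks; the hardware entanglement in the real system additionally forces every guess to run on the stolen device itself.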

Law enforcement is obviously disgruntled that its job has been made that much harder. But does that justify demanding, or at least petulantly whining, that the front door to your personal computing devices be incapable of being locked?

That's what this really comes down to: law enforcement wants your digital front door to be not merely unlocked, but incapable of locking.

Put so baldly, that's quite a startling position, isn't it?

Would we accept police demands that our front doors not be capable of locking? Of course not. Do we accept that locked doors protect criminals and terrorists as well as you and me? Yes. We may not like it, but we accept it.

Why do we accept that tradeoff? Because we need locked doors. Even if you live in a low-crime area and you typically leave your door unlocked, you like to know that you can lock it if need be. Most of the time it's not the police trying to break into your home.

And that brings me to the real reason for the new encryption feature: our personal computing devices can be stolen. We need to be confident that a thief can't gain access to our personal data: names, addresses, phone numbers, birthdays, voicemail and text messages, maybe even confidential information like credit card numbers. If the device is a phone, it might serve as a token for near-field-communication payment systems such as the new Apple Pay. A phone can also serve as the second factor in a two-factor authentication scheme for logins. The loss of a smartphone these days can be a disaster.
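The second authentication factor mentioned above is typically a short one-time code generated on the phone. As a rough illustration of how that works, here is a sketch of the RFC 6238 TOTP scheme used by common authenticator apps; this is the generic standard, not any particular vendor's system.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, for_time=None, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant).

    The phone and the server share the secret; both compute the same
    code from the current time, so possession of the phone proves
    possession of the secret.
    """
    key = base64.b32decode(secret_b32)
    counter = int(time.time() if for_time is None else for_time) // period
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

With the RFC's own test secret, `totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59)` yields `"287082"`, matching the published test vectors. An attacker who steals only your password still lacks the shared secret on the phone.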

Every security measure is a tradeoff. For the device-encryption question, we could probably do extensive studies to determine whose interests are a higher priority to society. Or we could default to preferring the rights of law-abiding citizens over the limited number of high-stakes cases that depend on unfettered access to personal computing devices.

You may be tempted to believe that the consequences of a lost or stolen phone are purely financial, and therefore the possibility of saving innocent lives that Comey holds out should unconditionally take precedence. At first blush, that's a compelling argument (if you buy into the distinctly TV-show-ish premise, which I find difficult to do). But if we truly lived according to that principle, we'd give up our privacy altogether. After all, a crime like kidnapping requires that the kidnapper be able to operate invisibly. We could eliminate the possibility of kidnapping simply by making it impossible for any of us to live our lives without scrutiny.

That, of course, is not how most of us want to live. So we do our best to reduce the possibility of kidnapping without shredding our right to live our lives freely. Is it a good tradeoff to let the kidnapper keep his secrets on his phone, if at the same time hundreds or thousands of law-abiding citizens can breathe a little easier because their stolen phone won't result in their lives being open to the thief? Kidnapping is a rare event and it's hard to imagine that the key to cracking the case will lie solely in the putative kidnapper's phone. I therefore say that wholesale encryption on personal computing devices is, on balance, a good thing even if it makes law enforcement harder. After all, law enforcement would be easier with unlocked doors, too.

Matthew Green has a slightly different take on why Apple introduced the encryption feature. He points out that any back-door access maintained for law enforcement can't be guaranteed to remain accessible solely to law enforcement, so customer data could be opened up to criminals through that back door. Apple therefore understandably prefers not to be the arbiter of such access, and so has made it technically impossible to violate its customers' privacy. Green's is a good argument. I still think, though, that some of the impetus is customer demand. The San Francisco Bay Area is a hotbed of smartphone theft and Apple's employees themselves likely have been victims, or they know people who have been victims. The issue has also received extensive coverage from local media outlets. All this would have influenced both engineers and managers to make this feature a priority.

Whatever the impetus, the new encryption feature is a good thing.

We have been conditioned since 11 September 2001 to make national security a priority. This has resulted in our law enforcement authorities having a warped perspective on how our lives should be lived. It's long past time we pushed back. The new encryption feature in iOS 8 is one way to do so. Heaven knows that if we permit gun sales to be as lax as they are, there's no good argument in favor of restricting encryption on our personal computing devices. Don't let Comey or anyone else distract you from the real point. It's not about national security. It's about personal security.
