Oct 05 2014
I understand enough about encryption to get myself in trouble, but not much more. I can talk about it intelligently in most cases, but when we get down to the nitty-gritty, bit-by-bit discussion of how encryption works, I want someone who’s really an expert to explain it to me. Which is why I’m glad that Matthew Green sat down to explain, in great detail, Apple’s claim that its new encryption can’t be opened for law enforcement.
The Too Long; Didn’t Read version (I often forget what tl;dr means) is that there is a unique ID hidden deep in the hardware encryption chips on your phone that software doesn’t have access to. This UID is folded into your encryption key through complex algorithms; it can’t be pulled out locally or remotely, which makes for a strong key protecting your encrypted data. Do keep in mind that not all of the interesting data on your phone is encrypted; there are still nooks and crannies that someone with physical access to the phone can look at. And some of the most interesting stuff about your phone isn’t what’s on it in many cases: it’s the list of who you’ve called, where you’ve been and the like, which can be obtained from the carrier. That metadata is often at least as important as what’s on your phone, and much easier to get without anyone ever having to see your phone.
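The general idea of entangling a hardware UID with the user’s passcode can be sketched roughly as follows. This is a simplified illustration, not Apple’s actual algorithm: the UID value, iteration count, and the use of PBKDF2 here are stand-ins, and on a real device the UID lives inside the crypto hardware where software can never read it out like this.

```python
import hashlib

# Hypothetical UID for illustration only; on a real device this value
# is fused into the secure hardware and is never readable by software.
HARDWARE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_key(passcode: str, uid: bytes = HARDWARE_UID) -> bytes:
    """Entangle the passcode with the device UID so the resulting key
    can only be derived on this physical device. PBKDF2 is a stand-in
    for the hardware-bound derivation the device actually performs."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

key = derive_key("1234")

# An attacker who copies the encrypted data off the phone but lacks the
# UID derives an unrelated key, even knowing the correct passcode:
attacker_key = hashlib.pbkdf2_hmac("sha256", b"1234", bytes(16), 100_000)
assert key != attacker_key
```

The point of the sketch is the salt: because the UID never leaves the chip, a brute-force attempt against the passcode has to run on the device itself, at the rate the hardware allows, rather than on a rack of off-device machines.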
I’m personally very glad that Apple (and Android as well) have begun encrypting phones by default. Yes, police need the ability to get into phones and see what people have been doing on them, but the last two years have shown that this ability has been abused for quite some time. Various government officials in the US have decried the move, saying they need the ability to catch pedophiles and terrorists. Yet so far the count of cases where the information needed to catch anyone from either of those categories couldn’t be gotten by other means is still in the single digits. At the same time, the number of lawsuits against US police for abusing their ability to get into phones numbers in the hundreds. Do the math and decide for yourself whether easy law enforcement access is worth it.
We’ll be seeing more organizations of all types moving to encryption, partly to protect users and partly to defend themselves from the negative publicity that being open to the police brings. There will be a number of missteps, poor encryption methodology, and cases where people realize they can’t just get their backup from the cloud because they used serious encryption and lost the key. There will be growing pains, and there will be examples of guilty people escaping because law enforcement doesn’t have easy access to phone data. But we also need strong encryption to protect the privacy of average citizens who’ve done nothing more than catch the attention of the wrong person at the wrong time. Our privacy is much more delicate, and more deserving of protection, than many in power believe it is.