On February 16, 2016, the CEO of Apple wrote in a public letter that encryption is “the only way to keep [their customers’] information safe.” The information in question is all the data and digital possessions on your iPhone. That means your photos, text messages, contacts, location data, medical records, passwords, credit cards, fingerprints—everything that you trust to that sliver of a computer in your pocket.
Apple knows that a passcode alone isn’t secure enough to keep all that information safe, so the digital information—the code—is all encrypted. The typical encryption process essentially shuffles the data so that it can’t be read unless you have the code to reverse it (called a key). In the case of the iPhone, the key to decrypt is only available by inputting your passcode. That means that unless you put in your passcode, nobody—not even Apple—can access the data encrypted on your phone. Sounds good, right?
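As a toy illustration of that shuffle-and-reverse idea, here is a symmetric cipher in Python where the same key both scrambles and restores the data. This is emphatically not the AES-based scheme the iPhone actually uses; it's just a sketch of the concept, built from a hash-based keystream and XOR:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    # XOR the data with the keystream; XOR is its own inverse,
    # so running this again with the same key decrypts.
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

message = b"my private photos"
ciphertext = toy_encrypt(b"passcode-derived-key", message)
recovered = toy_encrypt(b"passcode-derived-key", ciphertext)  # same key reverses it
```

Without the key, the ciphertext is just noise; with it, decryption is a single pass. That asymmetry of knowledge is all "encryption" means here.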
The aforementioned letter was in response to an FBI request that Apple create a backdoor—a way to access encrypted data without knowing the passcode. The phone the FBI so desperately wants to break into was used by one of the perpetrators of the 2015 San Bernardino mass shooting, the deadliest shooting in the US since Sandy Hook. The data on that iPhone may give insight into the attack, offer intel on other perpetrators, and bring justice to the families of the 14 people killed and 22 injured.
But Apple said no.
“The government is asking Apple to hack our own users,” writes Apple CEO Tim Cook. “[Creating a backdoor] would be the equivalent of a master key, capable of opening hundreds of millions of locks—from restaurants and banks to stores and homes.”
Apple says that it has provided all of the information and data it has access to, but refuses to comply with a court order to “bypass or erase the auto-erase function” that wipes the phone’s data after 10 failed passcode attempts. A cryptography expert told the Washington Post that without a backdoor “it might take 10 years to crack a strong password on the phone.”
“[If] a way to bypass the code is revealed,” Cook writes, “the encryption can be defeated by anyone with that knowledge.” A different cryptography expert told Engadget that “the mere knowledge that it was done will make it easier for others to find out how.”
Since Apple’s letter, Google CEO Sundar Pichai tweeted his support, saying the court order “could be a troubling precedent.” In a less direct show of support, Microsoft CEO Satya Nadella retweeted a statement against backdoors by the Reform Government Surveillance group. Facebook’s statement mirrored Google’s, calling the judge’s demands “a chilling precedent.”
“But that’s wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent.”
— Sundar Pichai (@sundarpichai), February 17, 2016
And that’s what the issue comes down to: it’s not just a concern that encryption will be weakened. All the FBI claims to want is a way to brute force the passcode; they want the ability to input as many guesses as it takes, as quickly as possible, without the phone erasing itself as a security measure. Not only that, but Apple may already have the ability to install firmware on your device if it’s in their possession. It’s completely plausible that they could install a version of iOS that lacks the auto-erase function onto this one, single device, and allow the FBI to brute force the passcode. The crux is that an iPhone will only accept firmware signed by Apple; otherwise the FBI would just do it themselves. As cybersecurity expert Dan Guido told Wired, the government is trying to turn Apple into the very threat its security is meant to protect against.
The real concern is about precedent. If Apple creates a bypass this one time, it and other companies around the world may be forced to comply again under different circumstances, or, in the worst case, companies may be forced to prevent you from ever locking the government out of your device: a device that holds nearly every detail about you, including everywhere you’ve been and much of what you’ve said.
So what does this all mean for you?
More than anything, you should be aware of what the American government is trying to do, and know what you can do to secure your digital information and keep it private. If you’re interested in full security on your desktop, you should look up FileVault (Mac) or BitLocker (Windows), how to encrypt your email with PGP, and what the Tor browser is. If you’re looking for some simpler solutions, namely ones for your smartphone, here are some tips and tools.
If you own an iPhone and use a passcode, your data is already encrypted, but make sure to keep iOS up to date, as every update includes security patches. Just last week, Apple explained Error 53: if an unofficial repair shop makes any alterations to Touch ID, the fingerprint sensor, the sensor is disabled as a security precaution until the phone is brought to Apple to be fixed. Moreover, if you’re on iOS 7 or below, Apple can extract your data without your passcode; it’s only with iOS 8 and later that Apple tied the encryption key to your passcode for the sake of security.
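The reason that iOS 8 change matters can be sketched as a key-derivation step. On real devices the passcode is entangled with a unique per-device secret inside the phone's crypto hardware; the sketch below stands in for that with a constant and PBKDF2 (the names, values, and iteration count are illustrative assumptions, not Apple's actual implementation):

```python
import hashlib

# Illustrative stand-in for the per-device secret fused into the hardware;
# on a real iPhone this value never leaves the chip.
HARDWARE_UID = b"\xde\xad\xbe\xef" * 8

def derive_key(passcode: str) -> bytes:
    # Entangle the passcode with the device-bound secret: the resulting
    # encryption key can only be recomputed on this device, with this passcode.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), HARDWARE_UID, 100_000)

key = derive_key("123456")
```

Because the key depends on both inputs, neither Apple nor anyone else can derive it off-device from the passcode alone, and the slow derivation also makes each brute-force guess more expensive.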
If you’re on an Android device, the process is a little more complex. If your phone shipped with Android 6.0, your device is encrypted by default. Otherwise, you’ll need to set a PIN for your lock screen and device boot-up, then encrypt the device manually; on Android 4.4 or lower, the option is under Settings > Personal > Security > Encrypt phone.
For added security, you can use a password manager for your various accounts. 1Password is an app for smartphones and desktops that generates strong random passwords, stores them in an encrypted vault, and fills them in with a single touch or click for ease of use.
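If you'd rather roll a strong password yourself, most languages ship a cryptographically secure random source; in Python that's the `secrets` module. A minimal sketch:

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 20) -> str:
    # secrets.choice draws from the OS's secure RNG,
    # unlike random.choice, which is predictable.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(generate_password())
```

A 20-character password drawn from this 94-symbol alphabet is far beyond any brute-force budget, which is why managers that generate and remember such passwords beat anything you'd memorize.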
Lots of people use Messenger, WhatsApp, or Snapchat, and while each of those services provides encryption on outgoing messages, the messages can be intercepted and decrypted by the provider: Facebook for both Messenger and WhatsApp, or Snapchat for its own app. If you’re on iOS, the native iMessage feature is encrypted end to end and cannot be intercepted or decrypted by Apple. If you don’t have friends with iPhones, you can use Signal by Open Whisper Systems for end-to-end message, media, and call encryption on iOS or Android. For a full list of messaging apps and their encryption policies, and more tools to keep yourself digitally secure, check out eff.org.