Following the San Bernardino shooting, the FBI has requested, and a judge has ordered, that Apple provide a backdoor into the iPhone 5C of Syed Farook. Farook and his wife, Tashfeen Malik, carried out the shooting in San Bernardino, California, which left 14 people dead. The two were later killed by local police.
While the situation is gruesome, Apple continues to stand by its decision not to give law enforcement a way into iPhones. In a letter to customers that Tim Cook published on Apple's website today, the company stresses the importance of user privacy and security.
"They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."
In the note to customers, Cook writes that “the implications of the government’s demands are chilling” but that “opposing this order is not something we take lightly.” Standing up to law enforcement in this way sets a tone for the company’s stance on handing over user data and, one hopes, an example for smaller tech companies to follow.
In this case, the iPhone’s owner is not alive. But even if he were, the government would have a tough time getting into the phone: courts have generally treated a passcode as testimonial, meaning its owner can’t be compelled to hand it over, even when ordered to do so by officials.
Interestingly, the same can’t be said for those who unlock their phones with a fingerprint, which courts have tended to treat as physical evidence rather than testimony. Either way, Farook’s iPhone was a 5C, a model that lacks a fingerprint sensor.
It’s in Apple’s best interest to protect user data. Many iPhone buyers trust Apple to keep their information private: the fingerprint data stored by Touch ID, for example, or the payment details handled by Apple Pay.
But that trust isn’t just assumed. Apple sings from the mountaintops about the steps it takes to keep user information private. Touch ID, for example, keeps fingerprint data encrypted in the Secure Enclave on the iPhone’s processor, where it never touches the cloud. Apple Pay generates a unique transaction code for each purchase and requires your fingerprint to authorize it.
But, maybe above all else, Apple is happy to tell you what it does for privacy because it knows how heavily its rivals rely on collecting information on users. Google, specifically. Google collects data to serve ads you’ll find relevant and to surface information before you know you need it, but keeping user data secure is in Google’s vested interest as well: should that information leak, the resulting distrust could drive people away from its services and ultimately ruin the search company’s business.
The security of our devices, in relation to both the government and hackers, has come into question now more than ever. In the wake of the Celebgate photo-leaking scandal and Edward Snowden’s revelations about the NSA’s massive online and telephone surveillance operations, many smartphone and PC users remain wary of what information the government can access through these devices. Yet many others continue to use their gadgets as usual without giving it a second thought.
Apple’s decision to side with user privacy, no matter the circumstance, sets an interesting tone for court cases to come. We may never know what goes on behind closed doors at Apple or any large tech company. But at the very least, Apple is making the importance of user privacy known, even when the stakes are high and the position is unpopular with law enforcement, government officials, and anyone else hoping for a quick window into the devices used by hundreds of millions of people around the world, from ordinary, peaceful citizens to criminals and domestic terrorists.