Apple has publicly responded to a court order, sought by the FBI and the US government, demanding that it purposefully break into one of its own devices. There’s been a lot written on this subject today, so I’ve rounded up what I think are the must-reads after the jump.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
A primer on the entire situation, from Vox:
The FBI captured the iPhone of dead San Bernardino terrorism suspect Syed Rizwan Farook back in December, but encryption technology prevents them from accessing its contents. On Tuesday, a federal magistrate judge in California ordered Apple to write a custom version of the iPhone software that disables key security features and install it on Farook’s iPhone in order to foil the encryption.
Ben Thompson, writing at Stratechery:
This solution is, frankly, unacceptable, and it’s not simply an issue of privacy: it’s one of security. A master key, contrary to conventional wisdom, is not guessable, but it can be stolen; worse, if it is stolen, no one would ever know. It would be a silent failure allowing whoever captured it to break into any device secured by the algorithm in question without those relying on it knowing anything was amiss. I can’t stress enough what a problem this is: World War II, especially in the Pacific, turned on this sort of silent cryptographic failure. And, given the sheer number of law enforcement officials that would want their hands on this key, it landing in the wrong hands would be a matter of when, not if.
Rich Mogull, writing for MacWorld:
Make no mistake: This is unprecedented, and the situation was deliberately engineered by the FBI and Department of Justice to force a showdown that could define the limits of our civil rights for generations to come. This is an issue with far-reaching implications well beyond a single phone, a single case, or even Apple itself.
As a career security professional, this case has chilling implications. […]
Apple does not have the existing capability to assist the FBI. The FBI engineered a case where the perpetrators are already dead, but emotions are charged. And the law cited is under active legal debate within the federal courts.
The crux of the issue is: should companies be required to build security-circumvention technologies to expose their own customers? Not “assist law enforcement with existing tools,” but “build new tools.”
Dan Guido, explaining how Apple could comply with the FBI court order:
Again in plain English, the FBI wants Apple to create a special version of iOS that only works on the one iPhone they have recovered. This customized version of iOS (ahem FBiOS) will ignore passcode entry delays, will not erase the device after any number of incorrect attempts, and will allow the FBI to hook up an external device to facilitate guessing the passcode. The FBI will send Apple the recovered iPhone so that this customized version of iOS never physically leaves the Apple campus.
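To put some rough numbers on why the FBI wants those passcode delays and the auto-erase feature disabled: with the software protections gone, a numeric passcode falls to simple exhaustive guessing. The sketch below assumes a per-guess cost of about 80 ms, which is the hardware-bound key-derivation time Apple has published for its devices; the exact figure for the iPhone 5c in this case is my assumption, not something from the articles quoted here.

```python
# Back-of-the-envelope brute-force timing, assuming the escalating
# delays and 10-attempt erase are disabled (as the court order demands)
# and each guess costs ~80 ms of hardware-bound key derivation.
# The 80 ms figure is an assumption based on Apple's published docs.

SECONDS_PER_GUESS = 0.08

def worst_case_hours(digits: int) -> float:
    """Worst-case hours to try every numeric passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_GUESS / 3600

print(f"4-digit: {worst_case_hours(4):.2f} hours")  # ~0.22 hours (about 13 minutes)
print(f"6-digit: {worst_case_hours(6):.2f} hours")  # ~22 hours
```

The takeaway: the encryption itself isn’t being broken. Once the guess-rate limits are stripped away, a short numeric passcode simply doesn’t hold up, which is exactly why the FBI framed its request around removing them.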
Matthew Panzarino, writing at TechCrunch, on the very real slippery slope that exists:
And herein lies the rub. There has been some chatter about whether these kinds of changes would even be possible with Apple’s newer devices. Those devices come equipped with Apple’s proprietary Secure Enclave, a portion of the core processing chip where private encryption keys are stored and used to secure data and to enable features like TouchID. Apple says that the things that the FBI is asking for are also possible on newer devices with the Secure Enclave. The technical solutions to the asks would be different (no specifics were provided) than they are on the iPhone 5c (and other older iPhones), but not impossible.
The point is that the FBI is asking Apple to crack its own safe; it doesn’t matter how good the locks are if you modify them to be weak after installing them. And once the precedent is set, the opportunity is there for similar requests to be made of all billion or so active iOS devices. Hence the importance of this fight for Apple.
Alex Abdo, from the ACLU:
This is an unprecedented, unwise, and unlawful move by the government. The Constitution does not permit the government to force companies to hack into their customers’ devices. Apple is free to offer a phone that stores information securely, and it must remain so if consumers are to retain any control over their private data.
The government’s request also risks setting a dangerous precedent. If the FBI can force Apple to hack into its customers’ devices, then so too can every repressive regime in the rest of the world. Apple deserves praise for standing up for its right to offer secure devices to all of its customers.
Andrew Crocker, writing for the EFF:
What’s more, such an order would be unconstitutional. Code is speech, and forcing Apple to push backdoored updates would constitute “compelled speech” in violation of the First Amendment. It would raise Fourth and Fifth Amendment issues as well. Most important, Apple’s choice to offer device encryption controlled entirely by the user is both entirely legal and in line with the expert consensus on security best practices. It would be extremely wrong-headed for Congress to require third-party access to encrypted devices, but unless it does, Apple can’t be forced to do so under the All Writs Act.
My stance is simple: I believe in the importance of privacy.
Update: I’ve posted a second round-up full of more reading and information about this case.