A court has ordered that Apple find a way to help law enforcement break into a locked iPhone that had belonged to one of the shooters accused of killing 14 co-workers in December. Apple CEO Tim Cook called the order "an overreach by the U.S. government."
The smartphone in question belonged to Syed Farook, who with his wife, Tashfeen Malik, shot and killed 14 co-workers on Dec. 2 in San Bernardino, Calif. The couple died in a gun battle with police soon after.
During its investigation into the attack, the FBI has been trying to unlock Farook's iPhone, but the device has a PIN lock on it. Apple devices running iOS 8 or higher encrypt all phone data by default and require a PIN to be unlocked. The FBI can't use a brute force attack to access the device, because after a few wrong guesses, the operating system puts a time delay on further attempts. After the 10th incorrect attempt, the device either locks completely, requiring an iTunes restore, or wipes all data on the device, depending on the user's settings.
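The attempt-limiting behavior described above can be sketched roughly as follows. This is an illustrative model, not Apple's actual implementation; the specific delay values and the `PasscodeLock` class are hypothetical.

```python
# Illustrative sketch (NOT Apple's code) of the behavior described above:
# escalating delays after a few wrong guesses, then a permanent lock or a
# data wipe after the 10th failure, depending on the user's settings.
# Delay values below are hypothetical.

DELAYS = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}  # seconds (hypothetical)
MAX_ATTEMPTS = 10


class PasscodeLock:
    def __init__(self, passcode: str, erase_data: bool = False):
        self._passcode = passcode
        self._erase_data = erase_data  # mirrors the user's "Erase Data" setting
        self._failures = 0
        self.wiped = False
        self.locked = False

    def try_unlock(self, guess: str) -> bool:
        if self.locked or self.wiped:
            return False  # no further attempts are accepted
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            if self._erase_data:
                self.wiped = True   # all data on the device is destroyed
            else:
                self.locked = True  # device must be restored via iTunes
        return False

    def delay_before_next_attempt(self) -> int:
        """Seconds the OS forces the user to wait before the next guess."""
        return DELAYS.get(self._failures, 0)
```

With `erase_data=True`, ten wrong guesses leave the device wiped, so even the correct PIN no longer recovers the data.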
U.S. Magistrate Judge Sheri Pym ordered Apple to supply software capable of bypassing the phone's security measures to allow the FBI to gain access to the device, effectively creating an iPhone backdoor. Apple has been given until Feb. 21 to file an appeal.
The order specified that Apple would need to provide the FBI with custom-signed iPhone software that would "bypass or disable the auto-erase function," enable the FBI to enter passcodes electronically and remove any delays when inputting incorrect passcodes, which would allow for a brute force attack.
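To see why removing those protections matters, consider how small the search space is once delays and auto-erase are gone. The sketch below is a toy illustration; `check_pin` stands in for the electronic passcode entry the order describes and is a hypothetical callback, not a real API.

```python
# Toy illustration: with the auto-erase and delay protections disabled,
# a short numeric PIN falls to exhaustive search. A 4-digit PIN has only
# 10,000 possible values. `check_pin` is a hypothetical callback standing
# in for the electronic passcode entry described in the court order.

from typing import Callable, Optional


def brute_force_pin(check_pin: Callable[[str], bool],
                    digits: int = 4) -> Optional[str]:
    """Try every candidate PIN of the given length until one succeeds."""
    for n in range(10 ** digits):
        candidate = f"{n:0{digits}d}"  # zero-padded: 0000, 0001, ..., 9999
        if check_pin(candidate):
            return candidate
    return None  # exhausted the space without a match
```

A 4-digit space is 10,000 candidates and a 6-digit space is 1,000,000; without forced delays between guesses, either can be exhausted quickly by machine.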
Cook wrote an open letter taking a firm stance against the court order, and said whatever the FBI may call it, "make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor."
"Up to this point, we have done everything that is both within our power and within the law to help them," Cook wrote. "But now, the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone."
The court order makes it clear that Apple is only required to place the custom iOS software on the device in question, but Cook argued that "there is no way to guarantee such control."
"The government suggests this tool could only be used once, on one phone. But that's simply not true," Cook wrote. "Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks --from restaurants and banks to stores and homes. No reasonable person would find that acceptable."
Jonathan Zdziarski, a forensic scientist and author, pointed out that unlocking the iPhone doesn't even guarantee the FBI will have access to the data on the device, because more encryption backdoors may be needed.
"FBI is going to be thrilled when they get into this iPhone and realize all the data they want is encrypted in third-party apps," Zdziarski wrote.
Cook added it would be ironic for the same engineers who built strong encryption into the iPhone to protect users to "be ordered to weaken those protections and make our users less safe" with an iPhone backdoor.
"We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption," Cook wrote. "Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them."
Support and dangerous precedent
Many have come out in support of Cook's opposition, including Dino Dai Zovi, mobile security lead at Square Inc., based in San Francisco, who said on Twitter, "The most interesting question, in my opinion, on the FBI vs. Apple case is whether this is an (international) precedent that we want to set."
Cook agreed this is not a precedent that should be set, and the "implications of the government's demands are chilling."
"If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone's device to capture their data," Cook wrote. "The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge."
Amit Sethi, senior principal consultant at Cigital Inc., based in Dulles, Va., said in this specific case, the risks of further damage from an iPhone backdoor could be minimized, because the device in question is an iPhone 5C, which does not have the Secure Enclave that's part of newer iPhones.
"With the newer generation of Apple devices, installing a modified version of iOS would not help, because the brute force protections are implemented in the Secure Enclave itself," Sethi said. "This is good, because if an iOS device is stolen, the attacker cannot modify the operating system on the device to brute force the user's PIN or password."
However, Sethi said the more dangerous aspect of this ruling would be the precedent it could set.
"Will the U.S. government require Apple to build a backdoor into all Apple devices that takes away this protection and makes all users' devices less secure? When it comes across a complex password that it cannot brute force, will it require Apple to limit PIN or password strength? When it decrypts data on a device and finds a messaging application that encrypts stored data using a separate password, will it require that application's author to build in a backdoor as well?" Sethi asked. "If so, all of these limitations and backdoors will make all users' data significantly less secure."
Google CEO Sundar Pichai also came out in support of Cook's message and reiterated the dangerous precedent such a ruling could represent.
"Forcing companies to enable hacking could compromise user privacy," Pichai wrote on Twitter. "We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism. We build secure products to keep your information safe, and we give law enforcement access to data based on valid legal orders. But that's wholly different than requiring companies to enable hacking of customer devices and data. Could be a troubling precedent. Looking forward to a thoughtful and open discussion on this important issue."