
Apple court filing challenges iPhone backdoor as rhetoric heats up

Rhetoric from Apple and the FBI over the proposed iPhone backdoor has intensified as Apple challenged the FBI in court, calling the government's motion unconstitutional.

The court of public opinion has heard increasingly forceful arguments from both sides of the debate between Apple and the FBI over a potential iPhone backdoor, and Apple has now filed a brief calling the FBI's request unconstitutional.

Last week, prosecutors for the government filed a new motion in the ongoing effort to get Apple to comply with a court order to create an iPhone backdoor to allow the FBI to gain access to the phone of Syed Rizwan Farook, who killed 14 people in San Bernardino, Calif.

"The government and the community need to know what is on the terrorist's phone, and the government needs Apple's assistance to find out," prosecutors argued in the filing. "Apple's rhetoric is not only false, but also corrosive of the very institutions that are best able to safeguard our liberty and our rights."

In a brief filed with a California District Court earlier this week, Apple lawyers claimed that the government's use of the All Writs Act of 1789 as justification for compelling the company to build an iPhone backdoor was unconstitutional.

Apple said the government wants the law to act as "an all-powerful magic wand" that could force the company to assist law enforcement, and that the government's interpretation of the Act is far too broad. Apple went on to say that "according to the government, short of kidnapping or breaking an express law, the courts can order private parties to do virtually anything the Justice Department and FBI can dream up. The Founders would be appalled."

Most recently, Apple CEO Tim Cook was interviewed by Time Magazine and spoke at length about the iPhone backdoor case. Cook revealed that he and Apple employees had long discussions before deciding to stand up to the court order. Cook also mentioned the dangers of extending the All Writs Act too far.

"The way that we simply see this is, if this All Writs Act can be used to force us to do something that would make millions of people vulnerable, then you can begin to ask yourself, if that can happen, what else can happen?" Cook asked. "In the next Senate you might say, 'Well, maybe it should be a surveillance OS. Maybe law enforcement would like the ability to turn on the camera on your Mac.'"

Cook also directly countered a common argument by the FBI and law enforcement that increasing encryption will directly lead to lives lost from terrorist attacks.

"I think it's very simplistic and incorrect," Cook said. "Because the reality is, let's say you just pulled encryption. Let's you and I ban it tomorrow. And so we sit in Congress and we say, 'Thou shalt not have encryption.' What happens then? Well, I would argue that the bad guys will use encryption from non-American companies, because they're pretty smart, and Apple doesn't own encryption ... The Internet doesn't have boundaries. You can wind up getting an app from Eastern Europe or Russia or wherever, it doesn't matter which country, just outside the United States. And that app would give you end-to-end encryption."

iCloud encryption and Apple employees

A new report said that Apple is looking into ways to increase iCloud security without inconveniencing customers. Currently, Apple holds the encryption keys for iCloud backups and has cooperated in handing over that data to law enforcement when asked. The report claimed Apple wants to strengthen that encryption so even Apple will not hold the keys. However, the company doesn't want users to lose data because of a lost password, which would be a possibility if Apple can't access iCloud backups.
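The tradeoff described in the report can be made concrete with a small sketch. This is purely illustrative and not Apple's actual design: the toy XOR "cipher" stands in for real authenticated encryption, and all names and parameters are hypothetical. It contrasts a provider-held key, which lets the provider decrypt backups (and hand data to law enforcement, but also rescue a user who forgets a password), with a key derived only from the user's password, which the provider cannot recover.

```python
# Illustrative sketch only -- NOT Apple's actual iCloud design.
import hashlib
import os

def derive_user_key(password: str, salt: bytes) -> bytes:
    # Key derived from the password alone: the provider never holds it,
    # so a forgotten password means the backup is unrecoverable.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR cipher (encrypt and decrypt are the same operation);
    # a real system would use authenticated encryption such as AES-GCM.
    keystream = hashlib.sha256(key).digest()
    return bytes(b ^ keystream[i % len(keystream)] for i, b in enumerate(data))

backup = b"contacts, photos, messages"

# Model 1: provider-held key. The provider can decrypt the backup on demand.
provider_key = os.urandom(32)
escrowed = xor_encrypt(provider_key, backup)
assert xor_encrypt(provider_key, escrowed) == backup

# Model 2: user-derived key. Only someone who knows the password can decrypt.
salt = os.urandom(16)
user_key = derive_user_key("correct horse battery staple", salt)
sealed = xor_encrypt(user_key, backup)
assert xor_encrypt(user_key, sealed) == backup
# A forgotten password yields a different key, and the data stays sealed.
assert xor_encrypt(derive_user_key("forgotten password", salt), sealed) != backup
```

In the second model, the design choice the report attributes to Apple becomes clear: removing the provider-held key closes the legal-request avenue, but it also removes the safety net for users who lose their passwords.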

Even Apple employees are reportedly gearing up for battle. The New York Times interviewed more than a half-dozen current and former Apple employees, including engineers involved with developing mobile products and security. Of those interviewed, some employees said they would refuse to create the proposed iPhone backdoor, and others said they would quit before complying with the court order.

Heated rhetoric

Last month, Apple CEO Tim Cook called the proposed iPhone backdoor that a court ordered the company to build for the FBI the "software equivalent of cancer." Since then, the rhetoric on both sides has continued to escalate.

San Bernardino District Attorney Michael Ramos claimed in an amicus brief that the iPhone in question could contain evidence "that it was used as a weapon to introduce a lying dormant cyber pathogen that endangers San Bernardino County's infrastructure." Ramos has not clarified those statements, but the amicus brief referenced California Penal Code §502, a provision concerning a possible "computer contaminant," described as a "set of computer instructions that are designed to modify, damage, destroy, record, or transmit information within a computer, computer system, or computer network without the intent or permission of the owner of the information."

Soon after, New York Police Department counterterrorism chief John Miller accused Apple of "providing aid to the kidnappers, robbers and murderers who have actually been recorded on the telephones in Riker's Island telling their compatriots on the outside, 'You gotta get iOS 8. It's a gift from God, because the cops can't crack it.'"

On the same day, Craig Federighi, senior vice president of software engineering at Apple, published an op-ed in The Washington Post noting that "hackers have repeatedly breached the defenses of retail chains, banks and even the federal government" and lauded the encryption technology of the iPhone as "the best data security available to consumers."

"That's why it's so disappointing that the FBI, Justice Department and others in law enforcement are pressing us to turn back the clock to a less-secure time and less-secure technologies," Federighi wrote. "They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013. But the security of iOS 7, while cutting-edge at the time, has since been breached by hackers."

UN High Commissioner for Human Rights Zeid Ra'ad Al Hussein also publicly backed Apple and wrote that setting the precedent of creating an iPhone backdoor in this one case could unlock "a Pandora's Box that could have extremely damaging implications for the human rights of many millions of people."

"It is potentially a gift to authoritarian regimes, as well as to criminal hackers. There have already been a number of concerted efforts by authorities in other States to force IT and communications companies such as Google and Blackberry to expose their customers to mass surveillance," Zeid wrote. "Encryption and anonymity are needed as enablers of both freedom of expression and opinion, and the right to privacy. It is neither fanciful nor an exaggeration to say that, without encryption tools, lives may be endangered. In the worst cases, a Government's ability to break into its citizens' phones may lead to the persecution of individuals who are simply exercising their fundamental human rights."

On the fence

Not all comments on the subject have been quite so heated. Hillary Clinton, former Secretary of State and presidential hopeful, was careful not to explicitly support either side.

"There has got to be some way to protect the privacy of data information," Clinton said. "There has got to be some way to avoid breaking data encryption and opening the door to a lot of bad actors. But there has to be some way to follow up on criminal activity and prevent crimes and terrorism."

Similarly, President Barack Obama cautioned against taking an "absolutist" stance on encryption while speaking at the South by Southwest Interactive conference in Austin, Texas.

"If your argument is strong encryption no matter what, and we can and should create black boxes, that I think does not strike the kind of balance that we have lived with for two hundred, three hundred years," Obama said, "and it's fetishizing our phones above every other value. And that can't be the right answer."

He also warned that although companies like Apple and privacy advocates have been gaining popular support, that support could disappear if there were another major terrorist attack or crime.

"What you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed and it will go through Congress in ways that have not been thought through," Obama said. "And then you really will have dangers to our civil liberties because ... the people who understand this best and who care most about privacy and civil liberties have sort of disengaged and taken a position that is not sustainable for the general public as a whole over time."

Lastly, it has been rumored that Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.), the chair and vice chair of the Senate Select Committee on Intelligence, could soon propose legislation that would impose civil penalties on companies that refuse to comply with court orders to help investigators access encrypted data. Both senators have been linked to potential legislation in the past that would require companies to comply with court orders, even if it meant creating encryption backdoors.



Join the conversation



What is your stance in the argument over an iPhone backdoor?
I come down solidly on both sides.... Sigh.

Of course we all want our data secured. No back door and only one front door key. And in the best of all possible worlds that's exactly the right approach. Unfortunately, Pangloss is long gone and in his rosy place we have the violence of war, plotted behind those same walls you've built to protect your data.

Yes, yes, of course you're right. Bad guys (and gals) want to steal your data. So you do your damnedest to keep it safe. And then comes a pack of far worse guys (and fewer gals) who are planning to throw a bomb in the middle of your nicely firewalled party. 

Your call. Do you keep your data secret and let a lot of innocents die? Or do you find the bombs before they detonate? So the real question is how many lives is your data worth...?
Nuclear bombs cannot be launched in retaliation to a nuclear attack on the nation without a specific protocol and unlocking keys guarded by multiple of the highest authorities in the nation (the President, Congress, the military). In the same vein, mobile encryption should be able to be deciphered when multiple authorities concur to do it by using their own secret opening keys. If the country's judicial courts, the FBI and the police are all given complementary deciphering keys, all of them necessary at the same time to be input in order to read the encrypted mobile data, the issue is solved.

Under a crime investigation you can't hide anything from the law: your phone can be monitored by legal decree, your safes can be opened by force even if you throw away the keys. Why should a mobile harboring crime evidence be exempt from the power of the law to investigate odious crimes like this one in focus here? To protect the silly Apple encryption? Are you kidding me?

Apple's encryption scheme is just plain stupid if they claim that decoding the content of a given mobile implies breaking the encryption of all others! Who the hell ever came up with that?! Go back to your drawing boards and do it the right way: when you have an authority's order to obtain personal information to solve a crime, you have to support that back door.
They could easily create a firmware update solely to this phone, to access the data, and patch it back up.
Globalization of the update does not need to occur.
This is a stupid excuse.
My stance is that Apple probably wants to instill confidence in consumers. A feeling of safety and protected privacy has a price, and it might be a deciding factor even over functionality.

But what is the difference between perceived safety and privacy (largely instilled by advertisement, anyway) and factual safety and privacy?