The US government has never been allowed to create a “backdoor” to encrypted devices. Now, it’s trying to force Apple to build one.
A protester holds up an iPhone outside the Apple store on 5th Avenue on Feb. 23, 2016 in New York City. (Bryan Thomas/Getty Images)
The Federal Bureau of Investigation's much-discussed request to Apple can seem innocuous: Help us extract six weeks of encrypted data from the locked iPhone of Syed Farook, an employee of San Bernardino’s health department who spearheaded an attack that killed 14 people. Most people believe Apple should comply.
But the FBI is demanding a lot more than the data on a single phone. It has obtained a court order requiring Apple to build custom surveillance software for the FBI – which computer security expert Dan Guido cleverly dubs an FBiOS.
Once that software exists, it is inevitable that other law enforcement agencies will approach Apple seeking to use the FBiOS to unlock iPhones in their own investigations. Apple says it has already received US court orders, under the same legal authority, seeking to unlock 12 other devices.
In effect, the FBI is asking Apple to write software that would provide something the government has sought without success for more than a decade: a “backdoor” that cracks the increasingly sophisticated encryption on consumers’ phones.
Golden key
The US government has previously attempted to create its own “golden key” that could unlock every device. That effort collapsed in the face of fierce objections across the political spectrum. Now, the government is pushing a private company – Apple – to create one.
What’s at stake in this clash of titans, therefore, is a much larger question: How far should tech companies go to help the government conduct surveillance of their users?
The court has ordered Apple to build special software that would disable the security protections on the device, and to install that software on the target iPhone as an update. Once the phone is updated with the new software, the FBI will be able to break into it.
Last year, a White House working group examined just this approach to creating a backdoor into encrypted devices – which it described in typically dense bureaucratic language as “provider-enabled remote access to encrypted devices through current update procedures”.
Translated into plain English, the working group was considering using the routine updates that every phone receives as a way for law enforcement to plant spyware that could track everything on a device, from its user’s whereabouts to text messages and emails.
The panel saw a potentially fatal flaw in this approach, noting it “could call into question the trustworthiness of established software update channels”.
This is no small thing: software updates are central to cybersecurity. Updates are issued regularly to patch the inevitable flaws discovered in today’s complex software, and failing to install them leaves users vulnerable to hackers. The lack of timely software updates, in fact, has forced the US military to turn off certain features of its non-battlefield smartphones.
If Apple gives its stamp of approval to the FBiOS and the technique becomes common, phone users may start to wonder whether the updates they receive contain spyware.
In addition, Apple says the FBiOS would “be relentlessly attacked by hackers and cybercriminals” hoping to obtain a copy of the golden key. The government counters that Apple can install the software on the device at Apple’s physical premises. Apple will “retain control over it entirely,” and is free to destroy it afterwards, the government states.
It’s also not at all clear that the US government will prevail in its court fight with Apple.
Albert Gidari, a leading surveillance lawyer who has represented Google and other companies and is now director of privacy at Stanford University’s Center for Internet and Society, argues that the government is over-reaching in its request. He points to a 1994 telecommunications law that says the government does not have the power to require companies to implement “any specific design of equipment, facilities, services, features, or system configurations” for surveillance purposes.
The US government argues that the 1994 law is irrelevant to its case and instead relies on a 1789 law, the All Writs Act, which allows courts to issue “all writs necessary and appropriate” to conduct their business.
However, Orin Kerr, a former federal prosecutor and a professor at George Washington University Law School, argues that the 1789 law may not support the government’s position. In 1977, the Supreme Court ruled that the law could require a telephone company to help law enforcement set up surveillance equipment on certain lines, but cautioned that “the power of federal courts to impose duties upon third parties is not without limits; unreasonable burdens may not be imposed.”
In a court filing, Apple argues that the US government’s request is “burdensome” and requires “involved engineering”.
Court battle
It is not a coincidence that the FBI has taken its battle to the courts. For the past two years, the FBI has been campaigning to win a so-called “backdoor” into encrypted devices. In 2014, FBI director James Comey called for a “regulatory or legislative fix” that would allow the agency to access devices with a court order.
But late last year, the Obama Administration decided not to pursue legislation.
With the US Congress out of the picture, the debate between tech and law enforcement will play out in the US District Court in the Central District of California.
The FBI says that the debate is narrower than it has been portrayed. “The relief we seek is limited and its value increasingly obsolete because the technology continues to evolve,” FBI director Comey wrote in a blog post.
Microsoft founder Bill Gates told the Financial Times that he supports the FBI: “They are not asking for some general thing, they are asking for a particular case.”
But several companies, including Google and Twitter – all of which could face similar surveillance requests – have weighed in to support Apple.
“We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders,” tweeted Google CEO Sundar Pichai. “But that’s wholly different than requiring companies to enable hacking of customer devices & data.”
This article was originally published on ProPublica.