Apple is fighting a federal judge’s order compelling it to help the FBI unlock a terrorism suspect’s iPhone, but if it loses that court battle, technical experts say, it will likely be able to offer the help that the bureau wants.
Magistrate Judge Sheri Pym’s order requires Apple to design a custom software package that disables the auto-erase feature on San Bernardino shooting suspect Syed Rizwan Farook’s iPhone 5c. The software, which will run in the phone’s RAM to preserve the data stored in the regular operating system, must let the FBI “submit passcodes … for testing electronically via the physical [USB] device port, Bluetooth, Wi-Fi, or other protocol available” on the phone.
Apple is not technically being told to break its own encryption, an issue that forms another part of the long-running “crypto wars.” Rather, Pym ordered the company to essentially design and digitally sign software that bypasses a security feature currently preventing the FBI from flooding the phone with password guesses until it picks the correct one, a technique known as “brute-forcing.”
iPhones can be configured to erase themselves after 10 failed login attempts, and authorities aren’t sure whether Farook enabled that feature. The government wants to search his phone for evidence that he or his wife communicated with other extremists in the lead-up to their Dec. 2, 2015, rampage at a San Bernardino facility for people with disabilities, in which they killed 14 people.
“I don’t think it’s going to be very hard,” Dan Guido, the CEO of information-security company Trail of Bits, told the Daily Dot.
The fact that the custom software must run in RAM only makes things “a little bit” harder for Apple, Guido said. “It’s not impossible, but it complicates things.”
The main reason it should be possible to comply, Guido said, is that the iPhone 5c lacks a key security feature of modern Apple smartphones that puts up roadblocks to brute-forcing passcodes. That feature, called the Secure Enclave, slows down repeated login attempts, stops responding for an hour after nine failures, and prevents anyone from entering guesses at supercomputer speed.
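For a rough sense of what that roadblock looks like in practice, here is an illustrative sketch. The escalating-delay schedule is an assumption drawn from Apple’s published iOS security documentation rather than from the court filings; only the one-hour pause after nine failures is described above.

```python
# Illustrative sketch only: an escalating-delay schedule of the kind the
# Secure Enclave enforces on newer iPhones. The exact values are assumptions
# based on Apple's published iOS security documentation, not the court order.
DELAY_AFTER_FAILURES = {
    1: 0, 2: 0, 3: 0, 4: 0,   # no delay for the first few misses
    5: 60,                    # 1 minute
    6: 5 * 60,                # 5 minutes
    7: 15 * 60, 8: 15 * 60,   # 15 minutes
    9: 60 * 60,               # 1 hour, as described above
}

def seconds_before_next_guess(failed_attempts: int) -> int:
    """Seconds an attacker has to wait before trying another passcode."""
    return DELAY_AFTER_FAILURES.get(failed_attempts, 60 * 60)
```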
The feature debuted in September 2013 on the iPhone 5s, which uses Apple’s A7 processing chip. Farook’s iPhone 5c, which Apple released at the same time, uses the older, less powerful A6 chip.
In addition to bypassing the auto-erase feature, the court order also directs Apple to eliminate any software-based delays between passcode attempts, so that the FBI can cycle through possible passcodes as quickly as technologically possible. But the way iOS unlocks devices, by combining the user’s passcode with a built-in hardware security key, takes 80 milliseconds each time.
“It’s an inherent feature of the design,” Steven Bellovin, a computer-science professor at Columbia University, told the Daily Dot. “The PIN and the device key are combined in a form that’s intentionally, inherently slow. The 80ms-per-guess time is how long it takes to actually calculate the encryption key. And there’s no way to speed that up, certainly not on this hardware, possibly not any place, depending on just what design they chose.”
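That 80-millisecond floor matters mostly as arithmetic. Here is a back-of-the-envelope sketch; the passcode lengths and character sets are assumptions for illustration, since the filings do not say what kind of passcode is on the phone.

```python
# Back-of-the-envelope sketch: how long exhaustive guessing takes if each
# guess costs 80 ms of key derivation on the device. Passcode lengths and
# character sets below are assumptions, not facts from the case.
SECONDS_PER_GUESS = 0.080

def worst_case(keyspace: int) -> str:
    seconds = keyspace * SECONDS_PER_GUESS
    return f"{keyspace:>13,} guesses -> {seconds / 3600:9.1f} hours"

print(worst_case(10**4))   # 4-digit PIN: about 13 minutes
print(worst_case(10**6))   # 6-digit PIN: roughly 22 hours
print(worst_case(36**6))   # 6-character alphanumeric: tens of thousands of hours
```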
There is a catch. Using Apple’s custom software, the FBI will be able to send login attempts to the iPhone from a computer, eliminating the need to tap each guess into the device’s touchscreen. But if it could extract the hardware security key, it could use a supercomputer not just to send passcodes to the phone but to run the guesses off the device entirely, bypassing the 80ms delay.
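To see why extracting the key would pay off, compare the on-device rate with a hypothetical offline attack. The numbers below are arbitrary assumptions for illustration; the point is that, with the key in hand, guesses no longer have to run one at a time on the phone, even if each individual key derivation still took 80ms.

```python
# Illustrative only: with the hardware key extracted, guesses could be farmed
# out across many machines instead of running one at a time on the phone.
# Both the keyspace and the degree of parallelism are arbitrary assumptions.
SECONDS_PER_GUESS = 0.080   # per key derivation, as described above
KEYSPACE = 36**6            # assumed 6-character alphanumeric passcode
PARALLEL_WORKERS = 10_000   # assumed cluster size

on_device_hours = KEYSPACE * SECONDS_PER_GUESS / 3600
offline_hours = on_device_hours / PARALLEL_WORKERS
print(f"on the phone: {on_device_hours:,.0f} hours; offline: {offline_hours:.1f} hours")
```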
Directly reading the hardware security key would require bouncing X-rays off the chip, and it’s unclear whether even the FBI is capable of doing that. Guido said it would take “extreme effort” and “huge amounts of money and time.”
While Apple is thought to be capable of writing the necessary code, that process will not be without its risks. Bellovin warned of “unforeseen interactions” between the iPhone’s built-in operating system—which stores Farook’s data and thus must be preserved—and the new code that Apple would be required to write. He offered a hypothetical scenario that will likely disturb federal investigators.
“Flash memory has a limited number of write cycles,” he said. “It just wears out. This is just inherent in the physics of how flash memory is built. If, for some reason, there’s some code someplace in iOS that writes to flash every time you try to enter a PIN and fail, and they don’t remember to disable that [in the new code] … you might actually destroy the flash, and hence destroy the data, just trying to do these guesses.”
It all depends, Bellovin said, on how much of the necessary code Apple already has sitting around. “The more code they have to develop,” he said, “the riskier it is.”
The former involves engineering effort and debugging, which could be quite non-trivial and risky if they don’t already have such a version.
— matt blaze (@mattblaze) February 17, 2016
Apple CEO Tim Cook appeared to suggest that the company would have to start from scratch, saying in his fiery response to the court order that “this software … does not exist today.”
“The U.S. government has asked us for something we simply do not have,” he wrote, “and something we consider too dangerous to create.”
Judge Pym based her order on the All Writs Act, an 18th-century statute empowering courts to require private parties to assist law enforcement as long as doing so does not impose an “undue burden.” In a separate case involving a locked iPhone and the All Writs Act, Apple has urged a judge to accept its argument that unlocking the phone would create an “undue burden” by turning the company into an agent of the government.
If Apple is eventually forced to comply with the California order, privacy advocates said, the decision could have lasting effects on what governments—including repressive regimes—can force tech companies to do to their products.
“This is another salvo in the war,” Bellovin said. “The implications of where this could lead are fairly dangerous.”
Photo via William Warby/Flickr (CC BY 2.0) | Remix by Jason Reed