Top security experts say senators behind anti-encryption bill are ‘woefully ignorant’

‘It’s like apples and sporks.’

Eric Geller

The senators behind a controversial encryption bill defended their work in an op-ed on Wednesday night, but security experts pounced on their reasoning and said it was evidence of their technological illiteracy.

In a Wall Street Journal op-ed titled “Encryption Without Tears,” Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) pushed back on widespread condemnation of their Compliance with Court Orders Act, which would require tech companies to provide authorities with user data in an “intelligible” format if served with a warrant.

Silicon Valley companies, technologists, and civil-society groups have blasted the bill because it would effectively outlaw end-to-end encryption, which shields users’ communications so that even the companies cannot read them.

The legislation is part of a decades-long battle between law-enforcement officials who say tech companies should only use breakable encryption and security experts who say unbreakable encryption is a vital tool for digital safety.

Police and intelligence officials worry that criminals and terrorists are masking their planning through encrypted messaging, but technologists argue that encryption designed to be bypassed would necessarily contain holes—which they call “backdoors”—that hackers and foreign spies could penetrate. Unbreakable encryption, they say, is both an individual and national security necessity.

This fight broke into the mainstream when the Justice Department demanded Apple’s help bypassing the security features on a dead terrorist’s iPhone. Although the government eventually backed down when a third party presented it with a way to access the phone, the court battle raised the profile of a long-simmering debate about how far tech companies should go to facilitate criminal and terrorist investigations.

In their response to their critics, Burr and Feinstein attempted to draw an analogy between the way companies already access their customers’ data and the way the bill would require them to do so.

“Critics in the industry suggest that providing access to encrypted data will weaken their systems,” they wrote. “But these same companies, for business purposes, already maintain and have access to vast amounts of encrypted personal information, such as credit-card numbers, bank-account information and purchase histories.”

But this claim, which is central to the senators’ argument that their bill would not dramatically undermine digital security, earned them the scorn of cryptographers and security researchers.

“It’s like apples and sporks,” Jonathan Zdziarski, a forensic scientist and the author of several security publications for O’Reilly Media, said in an email. “The data on my phone isn’t already accessed for business purposes; I can choose to keep my personal information private.”

“There may be some data that some companies collect, implicitly with the user’s consent,” he said, but “that is only a small slice of the data that would be affected by this encryption bill.”

Matthew Green, an assistant professor of computer science at Johns Hopkins University, observed that the very data Burr and Feinstein are describing as accessible to companies has been leaked countless times when hackers have breached corporate servers.

“On a weekly basis we see gigabytes of that information dumped to the Internet,” Green, who recently led a team that exposed a flaw in Apple’s iMessage encryption, said in an email. “This is the whole problem that encryption is intended to solve.”

“You can’t hold out the current flaws in the Internet as a justification for why the Internet shouldn’t be made secure,” he added.

Cryptography expert Bruce Schneier was even more blunt.

“The information [stored on servers without the best encryption] is at risk, which is why we don’t give those companies access to everything,” Schneier said in an email. “Are the senators happy to trust the security of their personal communications to these companies, or do they rely on government-built security systems without such a backdoor?”

These criticisms of Burr and Feinstein’s analogy emphasize an important point about digital security: The differences between the levels of encryption protecting certain types of data—purchase records on Amazon’s servers versus photos on an iPhone, for example—lead to different levels of risk.

Apple’s activities provide a microcosm of this differentiation—and a possible signal of changes to come.

While Apple added full-disk encryption to iOS 8 in September 2014—effectively cutting off its ability to extract data directly from new phones—it continues to give police complete customer iCloud backups, because iCloud does not feature end-to-end encryption. 

But that may be changing. Apple, angered by how the government handled the fight over the terrorist’s iPhone, is reportedly considering adding end-to-end encryption to iCloud. The company’s top lawyer told a House committee on April 20 that “no decision has been made,” but the security upgrade may be inevitable.

“It matters how data is encrypted and who has the key or keys,” Lee Tien, a senior staff attorney at the Electronic Frontier Foundation and a veteran of earlier cryptography battles, said in an email. “The individual device is an issue because the provider doesn’t have the key.”

Zdziarski noted that Burr and Feinstein’s analogy missed the mark by pointing to data that companies hold on their own servers. Law enforcement, he said, can already demand that data and, as Apple’s iCloud situation illustrates, regularly does.

“Our existing laws on the books already allow for the type of data that a business collects to be provided to law enforcement under an order,” he said. “It is the data that companies have not collected that this legislation is targeting—private information under the control of the user.”

Tien noted that this divide between server-hosted, accessible customer data and locally stored, encrypted customer data highlighted a broader point: People who care about their data’s security feel more comfortable when they can control how it is secured.

“In [server-hosted data] situations, the user has little or no role in the security; the company may control the crypto,” he said. If Burr and Feinstein want to shift even more control of data security to the tech companies, he added, “they miss my point entirely—they ignore that the user wants or may want protection distinct from what the companies want.”
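To make the distinction Tien draws a bit more concrete, here is a rough sketch, not any company’s actual code, using Python’s third-party cryptography package. When the service holds the key, it can decrypt stored data on demand, including for a court order; when only the user’s device holds the key, the ciphertext on the server cannot be rendered “intelligible” by the provider at all.

```python
# Illustrative only: contrasts provider-held keys with user-held keys
# using the "cryptography" package's Fernet symmetric cipher.
from cryptography.fernet import Fernet, InvalidToken

message = b"meet at 6pm"

# Scenario 1: provider-side encryption. The service generates and keeps
# the key, so it can decrypt the data for itself or for law enforcement.
provider_key = Fernet.generate_key()
stored_on_server = Fernet(provider_key).encrypt(message)
print(Fernet(provider_key).decrypt(stored_on_server))  # provider can read it

# Scenario 2: end-to-end encryption. The key never leaves the user's
# device; the server stores ciphertext it has no way to decrypt.
user_key = Fernet.generate_key()          # held only by the user
stored_on_server = Fernet(user_key).encrypt(message)

some_other_key = Fernet.generate_key()    # anything the provider might hold
try:
    Fernet(some_other_key).decrypt(stored_on_server)
except InvalidToken:
    print("provider cannot produce an 'intelligible' copy")
```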

The backlash to the Burr–Feinstein bill reflects the seeming intractability of the two sides in this argument.

Lawmakers backing police believe that technologists are throwing up their hands because they don’t like government oversight of technology, despite the potential availability of a solution. Technologists, meanwhile, believe that members of Congress don’t understand the rigid mathematical reality of encryption technologies or the availability of alternative approaches.

“If the senators honestly believe what they’re saying,” Zdziarski wrote, “then they’re woefully ignorant of not only the technology they’re legislating on, but also on existing legislation and case law.”
