
The rise of the new Crypto War

The U.S. government wants to stop terrorists and criminals from ‘going dark.’ But at what cost?

Eric Geller

James B. Comey, Jr., the seventh director of the Federal Bureau of Investigation, is afraid of the dark.

“The law hasn’t kept pace with technology, and this disconnect has created a significant public safety problem,” Comey said in an Oct. 16, 2014, speech at the Brookings Institution, an influential Washington, D.C., think tank. He called the problem “going dark.”

As more and more criminals presumably “go dark” by encrypting their phones and email accounts, federal agents are finding it increasingly difficult to intercept their communications. The spread of easy-to-use encryption software and the eagerness with which tech companies promote it have deeply troubled the FBI. But on that unusually warm October day, Comey also wanted to vent about another frustration: He felt that the bureau’s proposed solution was being distorted.

“The path to hell starts at the backdoor. You should not ask for backdoors.”

“There is a misconception that building a lawful intercept solution into a system requires a so-called ‘backdoor,’ one that foreign adversaries and hackers may try to exploit,” Comey said. “But that isn’t true. We aren’t seeking a backdoor approach. We want to use the front door, with clarity and transparency, and with clear guidance provided by law.”

He only used the word twice, but by strenuously denying that he wanted one, Comey had set off a fierce debate about the secret law-enforcement data-access portals known as backdoors. In the months that followed, Comey, his deputies at the FBI, and his counterparts at other agencies would face relentless questioning and criticism from skeptical members of Congress, exasperated security researchers, and outraged privacy groups. Despite Comey’s protestations, many feared that the agency once known for its disturbing reach and systemic abuses of power in the era of J. Edgar Hoover was seeking a return to that fearsome omniscience in the digital age.

The debate over backdoors has pitted Comey and other national-security officials against America’s biggest tech companies, which have fired off letter after letter warning the government not to undermine encryption and the increasingly powerful security tools built into their products. It has strained relations between an obscure but important government technical body and the security industry that used to consider it a trusted partner. And it has infuriated the cryptography experts and civil-liberties activists who have spent decades beating back government efforts to weaken the encryption that is now vital to all aspects of online life.

The Crypto Wars

A technological backdoor is a secret portal giving someone access to a secure product, be it a smartphone app, a computer program, or a Web connection. Pure software backdoors let the government directly access systems like Gmail, Facebook, or WhatsApp, and read unencrypted communications. A more complex form of backdoor access involves the government using special keys to decipher encrypted data that it gathered through conventional interception.

Backdoors that rely on encryption keys can either involve a master key for all data flowing across a particular product or keys for individual users that can be plugged into a law-enforcement system to wiretap those people. When a company sets up its system to generate keys for law enforcement—whether for its entire product or for individual users—it holds onto those keys until it is compelled to produce them. This is called key escrow. Here, there is no portal for direct access. Instead, the software code that is written to create the encryption is designed to be able to spit out keys for the government.
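To make the mechanism concrete, here is a minimal sketch of key escrow in Python. It is an illustration, not any company’s actual design: the names are hypothetical, and a real deployment would involve hardware security modules, access controls, and audit logs rather than an in-memory table.

```python
import secrets

# Hypothetical escrow table: the key material the vendor retains, per user.
escrow_db = {}

def provision_user(user_id: str) -> bytes:
    """Generate a per-user encryption key and escrow a copy of it."""
    key = secrets.token_bytes(32)  # a 256-bit key
    escrow_db[user_id] = key       # the escrow step: the vendor keeps a copy
    return key                     # the same key also goes to the user's device

def comply_with_court_order(user_id: str) -> bytes:
    """Produce the escrowed key when the vendor is legally compelled to."""
    return escrow_db[user_id]
```

Every user’s security then depends on that one store of keys never leaking, which is exactly the concentration of risk that alarms security experts.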

FBI Director James Comey (Jason Reed)

To begin to understand the many problems with backdoors and the reasons why security experts are so troubled by their resurgence, consider just a few of the challenges of designing a key-escrow system.

Master keys for entire communications systems are very dangerous; a single screw-up or weak point compromises the whole system. One way to mitigate the threat they pose is to break them up into pieces and disperse them, with each piece stored in a device called a hardware security module. When the government wants to decrypt communications, it brings together all the modules and combines them to form a working key. But that’s logistically complicated and could take too long for the real-time wiretaps that many investigations require—think ticking bombs or kidnapped children. Another option is to break apart the key and send it to “trusted stores” over the Internet. But that only complicates things further.
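The splitting itself is well-understood mathematics. Below is a minimal sketch of one approach, an all-shares-required XOR scheme in Python; a real deployment would more likely use a threshold scheme such as Shamir’s secret sharing and would keep each share locked inside a hardware security module.

```python
import secrets
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(master: bytes, n: int = 3) -> list[bytes]:
    """Split a key into n shares; all n must be combined to recover it.
    Any n-1 shares together reveal nothing about the key."""
    shares = [secrets.token_bytes(len(master)) for _ in range(n - 1)]
    shares.append(reduce(xor, shares, master))  # final share completes the XOR
    return shares

def combine(shares: list[bytes]) -> bytes:
    """Rebuild the master key by XOR-ing every share together."""
    return reduce(xor, shares)

master = secrets.token_bytes(32)
assert combine(split_key(master)) == master
```

The arithmetic is the easy part. The logistics of storing, guarding, and quickly reassembling the shares are where the trouble begins.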

Creating encryption keys is a normal part of designing a system with encryption. Users exchange those keys, many times without realizing it, anytime they communicate on a secure platform. But sending keys to “trusted stores,” where they remain ready for law-enforcement use, introduces a whole host of problems.

The debate over backdoors that James Comey reignited in October 2014 dates back at least two decades, to a battle that privacy advocates and security experts won, then lost, and then partially won again. 

“Normally a key generation is very simple,” said Joseph Hall, chief technologist at the Center for Democracy & Technology (CDT), a civil-liberties group. “It’s just a cryptographically secure pseudorandom number generator … to generate a very, very long random number.” But to add a backdoor, the company designing the encryption will need to send the keys to various locations for redundancy and safety purposes. “You’ll have to add a step that communicates that key externally to some place that can keep those things,” Hall said. “How do you know you’re communicating with the right place? You have all the problems that we have in secure communications with that.”
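In code, the difference Hall describes is stark. Ordinary key generation is a single line; escrow adds a network transmission that has all the usual secure-channel problems. A sketch, with the escrow endpoint left hypothetical:

```python
import secrets

# Normal key generation, as Hall describes it: draw a very long random
# number from a cryptographically secure source.
key = secrets.token_bytes(32)

# The escrow variant adds the step Hall warns about: shipping the key to
# an external "trusted store." That transmission must itself be
# authenticated and encrypted, or the escrow pipeline becomes the leak.
# send_to_escrow("https://escrow.example.gov/keys", key)  # hypothetical
```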

“The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption” is a landmark paper on the subject written by 11 of the leading researchers in the field of cryptography. Among its conclusions: “The massive deployment of key-recovery-based infrastructures to meet law enforcement’s specifications will require significant sacrifices in security and convenience and substantially increased costs to all users of encryption.” The authors warned that building a system that even approached safe and reliable operation was “beyond the experience and current competency of the field, and may well introduce ultimately unacceptable risks and costs.”

But that paper isn’t new. It’s not even recent. It was published on May 27, 1997, at the height of what the security community calls the “Crypto Wars.” The debate over backdoors that James Comey reignited in October 2014 dates back at least two decades, to a battle that privacy advocates and security experts won, then lost, and then partially won again. The first phase of the Crypto Wars was the fight to shape, and reshape, a landmark law called CALEA.

CALEA

The Communications Assistance for Law Enforcement Act (CALEA) is the basis for the government’s ability to place electronic wiretaps that give it access to secure communications. It was passed by Congress in early October 1994 and signed into law on Oct. 25, 1994.

“The government was very interested in laws that would prohibit the use of encryption or criminalize the use of encryption … or require backdoors for encryption,” said Lee Tien, senior staff attorney at the Electronic Frontier Foundation (EFF). “CALEA was probably the second or third round of that in Congress.”

The FBI and other agencies were worried that encryption would lock them out of criminals’ systems, but they were also worried about the ability of telephone providers to respond to wiretap requests. As phone networks switched from analog to digital systems, the government began to doubt the phone companies’ capacity to execute wiretap requests. “Fiber is harder to tap than copper,” Tien said. “IP networks are different from circuit-switched networks.” So the government went looking for new legal authority to shape how phone companies could build and configure their networks.

The resulting legislation required phone companies to be able to respond to a certain number of wiretap requests at a certain speed, thus ensuring that the government would have the evidence it needed on its own schedule. “There were things like switch capacities: ‘You must be able to handle this many simultaneous wiretaps in a given geographic area,’” Tien said. 

“There was effectively this order to the industry,” said Seth Schoen, senior staff technologist at the EFF. “Come up with technical standards that are acceptable to the government for an interface between the government and the phone companies around these wiretaps, and then deploy this capability so that you’ll be able to perform wiretaps in a standardized way on every phone facility in the U.S.”

Prior to CALEA, companies were required to respond to law-enforcement officials who brought them warrants to conduct wiretaps, but they weren’t required to prepare in advance for those requests or design their networks around them.

The EFF and other leading privacy groups, including the American Civil Liberties Union (ACLU), resisted the government’s attempts to mandate certain network arrangements, believing that it was foolhardy for policymakers to second-guess network engineers. “There is a general feeling in the tech world against tech mandates,” Tien said. “Don’t tell tech companies how to design their equipment … [T]hat stuff is a bad place for government to exert control.”

“A lot of people,” Tien continued, “viewed the requirements of CALEA as being tech mandates on the entire telecommunications industry to support this principle of eternal wiretap ability, which they saw as not necessarily great for civil liberties, not great for business, not great for privacy.”

Civil-liberties groups recognized that they weren’t going to stop CALEA, so they focused instead on limiting its reach. Their accomplishments are evident in the “Limitations” section of 47 U.S.C. § 1002, where CALEA is codified in U.S. law. The first part of the limitations section says that the government can’t require or prohibit specific features or designs in commercial technology. The second part exempts “information services” and facilities that connect one carrier to another.

The “information services” exemption remains one of the most important Internet-privacy provisions in all of U.S. law. When it originally appeared in CALEA, it was understood to exempt all Internet-related businesses, especially broadband Internet service providers and providers of voice-over-Internet-protocol (VoIP) services. But it also covers any company that sets up its operations on the Internet. Every email provider, social network, and instant-messaging client to which you have given your personal information is considered an information service. Those companies are exempt from CALEA’s wiretapping mandates because of the compromise that was struck in the mid-1990s.

The second limitation in CALEA essentially restricted its wiretapping mandates to the phone companies. The Internet was completely exempted. It was a major blow to the government at a time when the Internet was barely a few years away from taking off.

The third part of CALEA’s limitations is so important to the modern backdoor debate that it bears reprinting here in full:

A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.

“Essentially, each part of [the limitations section] provides a reason why the current CALEA couldn’t be applied to make Apple change its software,” Schoen said. “Because it doesn’t apply to Apple [as an information service], because it can’t be used to dictate features, and because companies aren’t responsible for decrypting unless they have the keys.” (Apple’s proudly self-proclaimed inability to access its users’ encryption keys would prove to be the inspiration for Comey’s “going dark” speech.)

“It’s not just the fact that CALEA currently applies to only phone companies and phone conversations,” said Neema Singh Guliani, a legislative counsel in the ACLU’s Washington office. “It’s also that there’s an affirmative clause in CALEA protecting the ability for companies to encrypt.”

The limitations in CALEA protect the encrypted-calling apps that now predominate in smartphone app stores but not the wireless carriers themselves. Verizon could offer customers encryption, but it would be legally required to undermine it if the FBI came calling. “The idea that you could pick up your phone … and have a call encrypted for you by your carrier … that was what CALEA lost for us,” Tien said. “It does not prevent me from running a crypto phone app on my iPhone and using it to encrypt my phone calls or other kinds of communications.”

CALEA hindered—but did not fatally wound—the development and spread of consumer encryption. Yes, it would be easier for everyone if carriers could promise wiretap-free “end-to-end” encryption at no cost of time, effort, or money to their customers. But ultimately, encryption proliferated on a user-by-user basis anyway. It was less widespread, but it was still secure.

“From a convenience and usability standpoint,” Tien said of having carriers encrypt calls, “that’s the way you would like to have crypto work. But it does require that somebody else be in charge of the crypto and know the secret.”

“Opponents of CALEA who reached this compromise really had a lot of foresight about this kind of issue,” Schoen said of the final deal. “A lot of folks in the government now have been saying, ‘Oh, we couldn’t have anticipated any of these issues back in the ’90s.’ But in fact, the government was talking about a lot of these issues back in the ’90s very openly. … If you look at this paragraph and say, ‘Wait, how did this paragraph about encryption get there?’ Well, it got there because there were people in law enforcement who were saying the same things in 1994 that they’re saying now.”

CALEA, with its exemptions for Internet companies, easily passed both houses of Congress and received President Bill Clinton’s signature. The government could now exert substantial pressure on phone companies to comply with wiretap requests.

“At the end of the day, it seemed politically impossible to stop CALEA,” Tien said. “The best that the defenders of the Internet were able to do was take advantage of public pronouncements by the FBI. They ended up making a deal that would subject the public-switched telephone network to CALEA requirements—tapability, limitations on crypto—but the idea was to leave the Internet alone.”

Having won substantial carve-outs designed to shield the emerging Internet from government surveillance, the EFF agreed to publicly support CALEA. That decision “led to a giant schism inside of EFF,” said Tien, who was not there at the time but heard about it from colleagues. “It led to EFF being abandoned and criticized by many of its former supporters for having sold out by supporting the FBI on CALEA, which many people thought was really an abomination.”

“We know how to send people up to the International Space Station, but what we’re talking about here is the equivalent of colonizing Mars.”

Within a year, EFF broke apart. One of its factions became the Center for Democracy & Technology. The remaining EFF employees relocated to California, where, Tien said, “we sat for a while until we got active again.”

The time to get active again came less than a decade later, when, after all of that pain and compromise, the CALEA fight erupted anew. The George W. Bush administration went to the Federal Communications Commission, which interprets CALEA, and asked it to apply the law to Internet service providers like Comcast and AT&T. The FCC obliged. In an Aug. 5, 2005 order, then-Chairman Kevin Martin, a Bush appointee, wrote, “We conclude that the Communications Assistance for Law Enforcement Act (CALEA) applies to facilities-based broadband Internet access providers and providers of interconnected voice over Internet Protocol (VoIP) service.”

CALEA’s opponents were stunned.

“We and a lot of other folks fought back against that and said, ‘Wait a minute, the whole idea here is that this doesn’t apply to the Internet,’” Schoen said. “That was the deal, that was the understanding, that was the way it was supposed to work; it was only about phones.”

The EFF and other privacy groups immediately requested that the FCC stay its order. The FCC declined to do so. The groups then requested that the U.S. Court of Appeals for the D.C. Circuit stay the FCC’s decision pending review. The appeals court declined as well, ruling that the FCC had wide discretion in its interpretation of CALEA and of the Communications Act of 1934, the statute that created the commission. From then on, CALEA’s wiretap mandates applied to both phone companies and Internet service providers.

“It’s hard to convey how disappointing that was to people who had been involved in the original controversy around CALEA,” Schoen said. “If you have some controversy, and you think you’ve reached some kind of compromise or some kind of deal, and then a decade later, someone comes back and says, ‘Actually, we’re reinterpreting it so that everything that you thought you won and everything that you thought you protected is lost.’ It was very frustrating and very disappointing to people.”

Even so, the limitations section of CALEA remained mostly intact. The reinterpretation only applied to the companies that provided Internet access—think Comcast or Time Warner Cable—not the companies that built products and services on the Internet. The government wouldn’t be able to tell Google how to design the encryption at the heart of Gmail. Apple wouldn’t have to comply with wiretap requests if it lacked the technical ability to hand over encryption keys. CALEA was a disappointment and its reinterpretation an even bigger blow, but the fight was not a total loss.

More importantly, the EFF and other groups thought they were done with the encryption fight. When the Sept. 11, 2001, terrorist attacks prompted new surveillance legislation, the privacy community closely tracked the USA Patriot Act for signs of new wiretap mandates designed to undermine encryption. They found none. At around the same time, following legal challenges, the government ended its restrictions on the kinds of encryption that could be published online and sold overseas. The Crypto Wars, it seemed, were over.

The return of the Crypto Wars

James Comey had been the director of the FBI for slightly more than a year when he delivered his “going dark” speech at the Brookings Institution.

“Those charged with protecting our people aren’t always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority,” Comey said. “We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technical ability to do so.”

Later in the speech, Comey delivered a chilling warning: “If the challenges of real-time interception threaten to leave us in the dark, encryption threatens to lead all of us to a very dark place.”

Then Comey got specific. Apple had recently announced that it would encrypt devices by default in a way that even it couldn’t break. If it couldn’t break the encryption, it couldn’t hand the government whatever lay behind it. Comey said that such encryption schemes would “have very serious consequences for law enforcement and national security agencies at all levels.”

“Sophisticated criminals will come to count on these means of evading detection,” Comey said. “It’s the equivalent of a closet that can’t be opened. A safe that can’t be cracked. And my question is, at what cost?”

The Crypto Wars were back.

“If the challenges of real-time interception threaten to leave us in the dark, encryption threatens to lead all of us to a very dark place.”
   —James Comey

Civil-liberties groups and security experts were stunned to see the issue of encryption reemerge as a government boogeyman. “Everybody is annoyed at having to deal with this issue again,” said Nadia Heninger, assistant professor of computer and information science at the University of Pennsylvania. “A lot of people spent a lot of effort and a lot of time demonstrating that backdoors were both bad technologically and bad for commercial business, and they won that argument in public. It seems like now we’re dealing with this argument again. It’s the same issues being raised by a different set of people with seemingly no memory about prior history.”

“It’s déjà vu all over again for them,” Tien said of the attitude among security researchers. “Folks continue to scratch their heads [about] what is, in a way, a lack of common sense.”

Comey wasn’t alone in banging the drums about encryption. National Security Agency Director Adm. Mike Rogers proposed a split-key form of key escrow in an April speech at Princeton University. At a House Homeland Security Committee hearing in early June, Michael Steinbach, the FBI’s assistant director for counterterrorism, said that tech companies needed to “build technological solutions to prevent encryption above all else.” (Steinbach added that the FBI was “not looking at going through a backdoor or being nefarious.”) But Comey has undeniably led the fight for encryption backdoors.

If universal encryption became the norm, the FBI director said in a May 20 Q&A session at Georgetown Law School, “all of our lives, including the lives of criminals and terrorists and spies, will be in a place that is utterly unavailable to the court-ordered process.”

It was in this climate of heated rhetoric and government-versus-industry sparring that the House Committee on Oversight and Government Reform convened a hearing on April 29 called “Encryption Technology and Potential U.S. Policy Responses.” Amy Hess, executive assistant director of the FBI’s Science and Technology Branch, represented the bureau. Joining her in calling for backdoors was Daniel Conley, the district attorney for Suffolk County, Massachusetts.

Hess repeatedly invoked the specter of terrorism when she described the dangers of strong encryption. She described the warrant process for encrypted electronic devices as “an exercise in futility,” adding, “Terrorists and other criminals know this and will increasingly count on these means of evading detection.”

“We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technical ability to do so.”

The FBI declined the Daily Dot’s request to interview Ms. Hess or any other senior bureau official. A bureau spokesman also repeatedly declined to answer specific questions about the FBI’s role in the backdoor debate, including how it read CALEA’s limitations provision. “The public can be assured law enforcement is not asking for new authorities,” the spokesman said. “Rather, law enforcement is advocating that industry be able to comply with court orders issued pursuant to existing legal authorities.”

Conley evidently didn’t get the memo about sticking to existing legal authorities. At the House hearing, he asked Congress to update CALEA to cover smartphones. He also attacked tech companies for improving their encryption. “Whatever goodwill or support they believe they will earn with these dangerous operating systems will erode rapidly,” he said, “as victims of physical and economic predation find their paths to justice blocked while those who hurt and exploit them are protected.”

A spokesman for the Suffolk County District Attorney’s office declined the Daily Dot’s requests for an interview with Conley. The spokesman, Jake Wark, said that Conley wasn’t avoiding interviews, just that the D.A.’s schedule was always full.

As Hess and Conley made the case for backdoors at the House Oversight Committee hearing, Rep. Jason Chaffetz (R-Utah), the committee chairman, listened incredulously. Chaffetz has been a thorn in the side of the Obama administration ever since he took over the gavel for the Oversight Committee in January. He is one of the White House’s most persistent congressional critics on the subject of the Sept. 11, 2012, attack on the U.S. consulate in Benghazi, Libya. He has relentlessly questioned the administration’s accounts of everything from former Secretary of State Hillary Clinton’s private email server to the Internal Revenue Service’s investigations of political groups.

Just as he had sparred with State Department lawyers and IRS officials, Chaffetz battled Hess over what exactly the FBI collected and how it justified doing so. He asked her what the bureau considered geolocation data: Was it just GPS records? Did it include cell-tower triangulations and connections to Wi-Fi hotspots? Hess declined to get into specifics, promising that she could brief the committee in a classified session if Chaffetz wanted more detailed answers.

Hess’s non-answers were symptomatic of the FBI’s overall position on backdoors. Although Comey has repeatedly called for special government access, no one at the FBI has offered a specific technical scheme that they would like to see implemented. Comey and other FBI officials speak in general terms about wanting to have a conversation about security and privacy, but they have refused to provide details about what backdoor systems they believe they need.

“Deflection is the word that comes to mind,” Chaffetz said in an interview with the Daily Dot, when asked how he would describe Hess’s answers. Referring to the government’s simultaneous demands and evasions, he said, “I think they want to have their cake and eat it too.”

“The Department of Justice,” he said, “has been notoriously hiding the ball so that we don’t see all the information.”

Rep. Will Hurd (R-Texas), a freshman congressman who chairs the House Oversight Subcommittee on Information Technology and attended the encryption hearing, shared Chaffetz’s frustrations.

“It just increases vulnerabilities,” he told the Daily Dot, referring to a backdoor in cryptography. “If law enforcement can take advantage of a vulnerability, then the bad guys can take advantage of the vulnerability.”

Hurd is one of a precious few members of Congress with a computer-science degree, and he worked for the CIA in the Middle East before coming to Washington, uniquely positioning him to understand both sides of this debate.

“I recognize the difficulty that law enforcement is facing, with their duty to protect the country, and that they’re having to adapt their techniques and procedures,” Hurd said. “But we need to ensure that we’re always thinking about our civil liberties as well as our security.”

Hurd said he wanted to meet with the FBI’s “technical people” to learn more about the specific harms attributable to encryption. He didn’t hear much in the way of concrete evidence from Hess or Conley at the hearing.

“Some of the examples that were being used as theoretical future problems,” he said, “were not good examples for their case for backdoors.”

Hurd took issue with comments by NSA Director Rogers about the feasibility of backdoors for good guys only. When confronted at an industry conference with the fact that virtually the entire public cryptography community opposes backdoors, Rogers said, “I’ve got a lot of world-class cryptographers at the National Security Agency.”

“I don’t know what he’s talking about,” Hurd said. “All the folks that I’ve talked to who are experts in this field say that it’s impossible.”

The encryption hearing attracted scant attention on Capitol Hill—certainly nowhere near as much press as the Republican Party’s endless Benghazi hearings. While many representatives lambasted Hess and Conley for their dubious arguments, the issue failed to break out into the mainstream. Encryption is not a sexy issue, even if it has huge ramifications for privacy and civil liberties. Higher-profile, more partisan fights are consuming Washington right now; lawmakers would rather attend hearings and deliver speeches about those issues. That’s the way to rile up voters, score endorsements, and secure donations.

“There’s an educational component here,” Chaffetz said. “I would suspect most members have no idea what we’re talking about.”

Hurd aims to change that. “I’m trying to use my background experience to educate people on some of the use cases, the worst-case scenario, best-case scenarios,” he said.

As the chairman of the House subcommittee tasked with overseeing cybersecurity, privacy, and new technology, Hurd has a powerful platform from which to shape the debate over encryption policy. When asked how he planned to use that platform, Hurd said he was “definitely going to be doing more on topic” but declined to get into specifics. One possibility that he floated was bringing in technical experts to debunk myths about backdoors.

“The first step is making sure everybody in Congress understands this issue and the perils of backdoors to encryption,” he said, “so that if any attempt is made to try to do something, everyone’s prepared.”

Chaffetz promised that his committee would take Hess up on her offer of a classified briefing, but he also expressed frustration at senior officials’ repeated evasions on the specifics of encryption. “I’ve written letters asking this question,” he said, referring to the FBI’s lack of specifics on backdoors. “I don’t want the 30-second answer. I want to see what they’re doing or not doing.”

Hurd, too, has written letters. On June 1, he and Rep. Ted Lieu (D-Calif.), another privacy-minded congressman with a computer-science degree, sent Comey a letter warning him to forget about backdoors. “There is a difference between private companies assisting law enforcement and the government compelling companies to weaken their products to make investigations easier,” they wrote.

“If you have probable cause or even articulable suspicion, then I think you do have the right to go get this information,” Chaffetz said. “But to blanketly say, ‘We’re going to look at everybody all the time, just in case,’ I don’t buy it.”

Universally derided

On Tuesday, May 19, a comprehensive coalition of cryptographers, civil-liberties advocates, and tech companies sent President Obama a letter arguing that the FBI’s push for backdoors represented a grave threat to Internet security.

“Strong encryption is the cornerstone of the modern information economy’s security,” the letter read. “Whether you call them ‘front doors’ or ‘back doors,’ introducing intentional vulnerabilities into secure products for the government’s use will make those products less secure against other attackers. Every computer security expert that has spoken publicly on this issue agrees on this point, including the government’s own experts.”

In its story about the letter, the New York Times cited two signatories who epitomized the group’s deep expertise and diverse backgrounds: “Whitfield Diffie, one of the co-inventors of the public key cryptography commonly used on the Internet today, and the former White House counterterrorism czar Richard A. Clarke, who was one of a handful of experts the White House asked to review its security policies after the revelations by Edward J. Snowden.”

A week later, the United Nations Office of the High Commissioner for Human Rights released a report extolling the benefits of encryption. “States should avoid all measures that weaken the security that individuals may enjoy online,” the report said, “such as backdoors, weak encryption standards, and key escrows.”

On July 7, the day before Comey testified at two Senate hearings on encryption, a group of security experts—many of whom had worked on the 1997 key-escrow paper—published another paper excoriating Comey’s proposals.

Special government access to encrypted products would “open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend,” the researchers wrote. “The costs would be substantial, the damage to innovation severe, and the consequences to economic growth difficult to predict.”

“There is a misconception that building a lawful intercept solution into a system requires a so-called ‘backdoor,’ one that foreign adversaries and hackers may try to exploit.”

The Daily Dot spoke to numerous professors, security researchers, and cryptography experts who pointed out serious problems with the idea of backdooring an encrypted system. Their concerns ranged from the impossibility of actually building a working backdoor to the near-certainty that a determined adversary could penetrate it.

The starting point for any analysis of backdoor security is the indisputable fact that a backdoor is a new entry point into a secured system. Adding a backdoor, Heninger said, “increases the attack surface of the system.” Foreign governments and cybercriminals are constantly studying the encrypted systems of banks and email providers, looking for any weak point. By virtue of its very existence, a backdoor increases their options.

What worries researchers and tech companies the most about a backdoor is the fact that it adds a vulnerability to a system that the system’s operator cannot fully manage. When companies implement their own encryption, they scrutinize every aspect of it to ensure that it functions properly. They monitor attempted breaches and respond accordingly. They can fully evaluate their encryption because it is theirs and theirs alone.

A backdoor robs companies of that total control and awareness. It is not just that a backdoor is another way into a system; it is that a backdoor is a way into a system that cannot be guarded by the system’s operators.

“You have this backdoor out there that’s run by other people who aren’t telling you what kind of security measures they’re taking, what kinds of protections they have,” said Schoen. “If you think that there’s some precaution that they ought to be using, you have no ability to get them to take that precaution.”

Imagine that you own a home with locks on all your doors made by a trusted, respected company. Then the police ask you to add a new door to your house, protected by a lock whose key you don’t have. The police say they are the only ones with the key, but they won’t tell you how they’re guarding that key, and they have a history of being hacked and losing sensitive data. They won’t even promise to tell you if someone steals their key. Would you agree to add that door?

“You have this whole set of security risks about the people who administer the backdoor, and their security and their security measures and their defenses against attacks,” Schoen said.

Backdoor opponents love pointing out that, in the words of Sen. Ron Wyden (D-Ore.), “There’s no such thing as a magic door that can only be used by the good people for worthwhile reasons.” Nearly every security expert interviewed for this story stressed the fact that backdoors have no way of distinguishing between lawful and unlawful uses of their secret access.

“A vulnerability is a vulnerability,” Tien said. “It doesn’t know whether you’re the FBI or China.”

China, a major state sponsor of cyberattacks, allegedly exploited a law-enforcement backdoor in Gmail to hack the email provider in January 2010. Confirmed backdoor exploits extend beyond state actors. In a case detailed in 2007, lawful-intercept functionality in the Greek wireless carrier Vodafone-Panafon allowed hackers to eavesdrop on Athens’ mayor and more than 100 Greek and international officials. And in 2006, the Italian government began investigating a “spy ring” hidden inside Telecom Italia that “taped the phone conversations of politicians, industrialists, and even footballers.” Hall described the Greek and Italian incidents as cases where “dormant wiretapping functionality that was essentially a backdoor was activated.”

Security researchers repeatedly pointed to the technical lessons learned from the most famous hardware backdoor ever proposed: the so-called “Clipper chip.” The NSA developed the chip, which used an encryption scheme called Skipjack, to be a one-size-fits-all backdoor module that could be inserted into computers, phones, and other devices. Each Clipper-chipped device would carry a unique encryption key that the government could access with a warrant. But in 1994, a year after the NSA proposed the chip, a cryptography expert named Matt Blaze published a paper laying bare serious flaws in the chip’s key-escrow design: the field that gave law enforcement its access was guarded by a 16-bit checksum that could be brute-forced, letting users defeat the escrow entirely. The government abandoned the chip two years later, and its design now serves as a textbook example of the dangers posed by poorly configured backdoors.
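Blaze’s attack came down to simple arithmetic: a random candidate passes a fixed 16-bit check with probability 1 in 65,536, so a user could feed the system bogus values until one happened to validate. A toy illustration in Python; the checksum function is a hypothetical stand-in (the real one was classified), but any 16-bit check has the same weakness:

```python
import secrets

def leaf_checksum(leaf: bytes) -> int:
    """Hypothetical stand-in for the Clipper chip's 16-bit LEAF checksum."""
    h = 0
    for b in leaf:
        h = (h * 31 + b) & 0xFFFF  # keep only 16 bits
    return h

# Forging an acceptable-looking LEAF takes about 2**16 attempts --
# trivial work even on 1994 hardware.
tries = 0
while True:
    tries += 1
    candidate = secrets.token_bytes(16)
    if leaf_checksum(candidate) == 0x1234:  # whatever value the device expects
        break
print(f"bogus LEAF accepted after {tries:,} attempts")
```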

Jake Laperruque, a fellow on privacy, surveillance, and security at CDT, said that a government-only backdoor was simply “not technologically feasible.” He offered a timely pop-culture analogy: The Avengers: Age of Ultron. “The FBI, what they’re basically asking for is a Thor’s hammer that only a good guy can pick up.”

By setting up a system in which it can access a backdoor, the government turns itself into a huge target for foreign governments and other malicious actors. Backdoors would be concerning enough from a civil-liberties perspective if they truly were limited to lawful use by the government. But the government’s own security vulnerabilities, laid bare by years of cyberattacks and leaks, show that even a well-intentioned FBI couldn’t prevent a backdoor from being exploited.

“All an attacker has to do is to obtain the master key and they can now compromise everything,” Nicolas Christin, assistant research professor in electrical and computer engineering at Carnegie Mellon University, told the Daily Dot via email. “It doesn’t matter if you split the master key in half—determined attackers will look for both halves.”

Because key-escrow systems are triggered by the authorities inserting their piece of a key into a backdoor, exploitation would be as simple as acquiring that key. As Heninger put it, referring to the Snowden leaks, “We have some examples of how the government hasn’t been doing such a good job of keeping that information secret.”

“A single master key that can decrypt every single message sent by anybody in America to anybody else in America … becomes an incredible target, both for theft and also for abuse,” said Matthew Green, assistant research professor at the Johns Hopkins Information Security Institute.

On March 17, 2011, RSA Data Security, one of the oldest and most trusted security companies, announced that it had been the victim of “an extremely sophisticated cyberattack.” Hackers had stolen the master keys to the company’s SecurID authentication devices, which the world’s largest companies used to add a second layer of security to employee logins. A few months later, Lockheed Martin, a major U.S. defense contractor, announced that hackers had stolen military secrets from it by exploiting the SecurID system.

“Even sophisticated security companies who have been building systems to protect military secrets have not managed to keep their keys from getting hacked,” Green said of the RSA hack. “And [a backdoor master] key would be a million times more sensitive than that.”

More worrisome, backdoor exploitation by a stolen master key might not even be observable. “If somebody were to, for example, hack into the government database and get a copy of that key and then use it to intercept data going across [a communications network],” Heninger said, “that would be completely passive, and it would not be detectable by anybody.”

These may seem like worst-case scenarios, but security experts point out that the sheer human scope of a backdoor system makes such an intrusion almost inevitable. Hundreds of U.S. judges can approve wiretap requests. Thousands of local police departments would want to be part of the backdoor search request process. “Somehow, we’re supposed to build a system that is secure with all these different parties,” said Daniel Weitzner, a lecturer in the computer-science department at the Massachusetts Institute of Technology.

Weitzner, who also directs part of MIT’s Computer Science and Artificial Intelligence Laboratory, participated in the January 1997 meeting that led to the landmark “Risks of Key Recovery” report. He recalled the group’s extreme pessimism about the idea of a working, secure backdoor. “Just the scale of the system was so big that it just would have inevitable security problems,” he said.

Every person with even partial access to a law-enforcement backdoor is a ripe target for the simplest and most troubling form of exploitation: social engineering—the practice of duping someone into providing sensitive information, like a password, by pretending to be a trusted party like a bank, spouse, or colleague. “The credentials, the keys, other tools that you need to break into a communication are going to be spread out very widely,” Weitzner said, “and all someone has to do is steal those or impersonate the person who is authorized to use the backdoor.”

A few weeks after the RSA hack that exposed the SecurID master keys, an RSA executive revealed that the hackers had used social engineering to breach the company’s servers.

Exploitation isn’t the only thing that worries security researchers. They also remain unconvinced that a backdoor with the other necessary qualities can even be built.

The FBI often cites situations like child abductions or active terrorist plots when it calls for backdoors. But such ongoing threats would require real-time access to the backdoor mechanism. That real-time access, experts said, is not realistic given how backdoors are built and maintained.

“You’re going to need to essentially perform very sensitive security operations that are going to require a bunch of human steps really, really fast,” said Hall. “A lot of these are going to be in airgapped facilities [where servers are deliberately cut off from the Internet], and you’re going to have to jump the airgaps to get the keying material together in one place that’s probably also an airgapped facility.”

Daniel Weitzner argued that there was simply no way to reconcile a backdoor’s dual requirements of security and accessibility. If you physically disperse keys across the country to make them easier for law enforcement to reach, you add more venues for exploitation, he said. If you put one hardware security module in the FBI’s heavily guarded Washington headquarters, you prevent disparate law-enforcement groups from quickly accessing it to launch real-time monitoring operations.

“I’m not even sure we’re good at doing that, keeping keys like that technically secure,” Green said. “I’m not sure we have any hardware that’s ever been put to that test.”

Another technical concern surrounds backdoors in mobile operating systems. On April 28, a Stanford University graduate student named Jonathan Mayer posted a detailed analysis of what it would take to backdoor something like Google’s Android OS, which encrypts users’ devices by default. He described an increasingly unrealistic set of technical requirements that would cascade from Google adding a backdoor to its core encryption.

Google couldn’t just backdoor its disk encryption, Mayer wrote. It would have to backdoor its library of code that many developers used to implement Android-compatible cryptography in their apps. But some apps might not use that library. So Google would have to remove apps that used other cryptographic libraries from its app store. But how would it be able to tell which ones used which libraries? And even if it could do that, how could it police third-party app stores? And what about Web apps that weren’t installed at all?

Mayer’s conclusion is stark: “In order to prevent secure data storage and end-to-end secure messaging, the government would have to block these Web apps. The United States would have to engage in Internet censorship.” (Asked what he thought of Mayer’s post, Christin said he “would concur with his views.”)

Then comes the issue of transparency. If the government ever mandated a backdoor, the question of whether to publicly disclose it would raise competing political and technical concerns.

The technical argument for disclosure is strong. The notion that encryption schemes should be public knowledge so that experts can dissect and test them has been understood since the late 19th century, when the Dutch cryptographer Auguste Kerckhoffs introduced the idea in what became known as Kerckhoffs’s principle. “Not fully disclosing a system design to ensure its security is not a sound practice,” said Christin, channeling Kerckhoffs.

But the political argument against disclosure is also strong. The announcement of a U.S. government backdoor would be tantamount to inviting other countries to make the same demands of U.S. tech companies. Alex Stamos, then the chief information security officer at Yahoo, raised this concern when he confronted Mike Rogers, the NSA director, at a conference in late February.

“If we’re going to build defects/backdoors or golden master keys for the U.S. government,” Stamos asked Rogers, “do you believe we should do so … for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government?”

Rogers sidestepped the question, saying he understood that there were “international implications” but adding, “I think we can work our way through this.”

Security experts aren’t so confident. What the intelligence agencies want, they say, is as unrealistic as it is dangerous.

“We know how to send people up to the International Space Station, but what we’re talking about here is the equivalent of colonizing Mars,” Green said. “All the techniques are known, but could we actually do it at scale? I don’t know.”

Divided government

As Comey, Rogers, and other national-security officials campaign for backdoors, one important voice has been largely absent from the debate.

“I lean probably further in the direction of strong encryption than some do inside of law enforcement,” President Barack Obama told Recode’s Kara Swisher on Feb. 15, shortly after he spoke at the White House summit on cybersecurity and consumer protection in Silicon Valley.

President Barack Obama (Jason Reed)

Obama’s interview with Swisher marked a rare entrance for the president into the backdoor debate, which has pitted his law-enforcement professionals against the civil libertarians who were encouraged by his historic 2008 election and disappointed by his subsequent embrace of the surveillance status quo.

If the president felt strongly enough about strong encryption, he could swat down FBI and NSA backdoor requests any time he wanted. White House advisers and outside experts have offered him plenty of policy cover for doing so. The President’s Review Group on Intelligence and Communications Technologies, which Obama convened in the wake of the Snowden disclosures, explicitly discouraged backdoors in its final report.

“A vulnerability is a vulnerability. It doesn’t know whether you’re the FBI or China.”

The review group recommended “fully supporting and not undermining efforts to create encryption standards,” … “making clear that [the government] will not in any way subvert, undermine, weaken, or make vulnerable generally available commercial encryption,” and “supporting efforts to encourage the greater use of encryption technology for data in transit, at rest, in the cloud, and in storage.”

The report also warned of “serious economic repercussions for American businesses” resulting from “a growing distrust of their capacity to guarantee the privacy of their international users.” It was a general warning about the use of electronic surveillance, but it nevertheless applies to the potential fallout from a backdoor mandate.

The White House’s own reports on cybersecurity and consumer privacy suggest that the president generally supports the use of encryption. “To the extent that you’ve heard anything from the White House and from the president, it’s in favor of making sure that we have strong encryption and that we’re building secure, trustworthy systems,” said Weitzner, who advised Obama as U.S. deputy chief technology officer for Internet policy from 2011 to 2012.

Weitzner pointed out that the president had subtly quashed a push for backdoors by the previous FBI director, Robert Mueller.

Mueller “hoped that the administration would end up supporting a very substantial [Internet-focused] expansion of CALEA,” Weitzner said. “That didn’t happen, and … despite the fact that you had the FBI director come out very strongly saying [criminals] were going dark, the administration never took a position as a whole in support of that kind of statutory change. You can read between the lines.”

Obama’s reluctance to directly confront his FBI chief reflects the bureau’s long history of autonomy in debates over law-enforcement powers, said a former Obama administration official.

“It’s pretty well understood that the FBI has a certain amount of independence when they’re out in the public-policy debate advocating for whatever they think is important,” the former official said.

The White House is reviewing “the technical, geopolitical, legal, and economic implications” of various encryption proposals, including the possibility of legislation, administration officials told the Washington Post this week. Weitzner said that Obama may also be waiting until the latest round of the Crypto Wars has progressed further.

“The White House tends to get involved in debates once they’ve matured,” he said. “If you jump in on everything right up front, the volume can become unmanageable. I know that there’s a lot of attention being paid, and I think that’s the right thing to do at this point.”

The president’s noncommittal stance has earned him criticism from pro-encryption lawmakers who say that their fight would be much easier if the commander-in-chief weighed in. “The best way to put all this to bed,” Hurd said, “would be for the president to be very clear saying that he is not interested in pursuing backdoors to encryption and believes that this is the wrong path to go, in order to squash the debate once and for all.”

If Obama ever formally came out against backdoors, it would represent a significant shift away from decades of anti-encryption government policies, including undermining industry-standard security tools and attacking tech companies through public bullying and private hacking.

The Daily Dot requested an interview with Michael Daniel, the Obama administration’s cybersecurity coordinator. A White House spokesman said he would look into it but did not arrange an interview in time for this story. A spokesman for the Department of Justice, which houses the FBI, did not respond to questions about whether Attorney General Loretta Lynch supported Comey’s call for backdoors. A spokesman for Director of National Intelligence James Clapper—who oversees the FBI, NSA, CIA, and 14 other intelligence agencies—declined to comment. He referred the Daily Dot to NSA Director Rogers’ comments.

Eroding trust

No story better illustrates the government’s attack on commercial encryption than the drama surrounding a secret backdoor in an NSA algorithm approved by the National Institute of Standards and Technology.

NIST, as its name implies, is the federal agency responsible for promoting and coordinating the development of technical standards. In 2006, the NSA asked NIST to approve Dual_EC_DRBG, an algorithm that generated pseudorandom numbers. Random-number generators lie at the heart of encryption technology: they produce the unpredictable numbers from which secret keys are derived, so any cryptography built on top of them is only as strong as their output. But the NSA had designed its algorithm in such a way that it alone could predict that output and undo the resulting encryption. It had, in effect, created a backdoor in the code.
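A simplified sketch shows how such a design can conceal a trapdoor. This glosses over the generator’s output truncation, which costs an attacker only a small brute-force search over the missing bits, and follows the analysis outside researchers would later publish:

```latex
% Dual_EC_DRBG, simplified: secret state s_i, public curve points P and Q.
% Each step updates the state and emits an output (x() is the x-coordinate):
s_{i+1} = x(s_i \cdot P), \qquad \mathrm{out}_i = x(s_{i+1} \cdot Q)

% If the constants were chosen so that P = d \cdot Q, for a d known only to
% their designer, then anyone holding d who recovers the full point
% R = s_{i+1} \cdot Q from one output can compute the generator's next state:
x(d \cdot R) = x(s_{i+1} \cdot d \cdot Q) = x(s_{i+1} \cdot P) = s_{i+2}
```

Whoever knows d can predict every number the generator will ever produce, and with them every key derived from those numbers. To everyone else, the output looks perfectly random.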

RSA, the SecurID maker, quickly adopted Dual_EC and implemented it in its BSAFE software. Years later, Reuters would reveal that this was because the NSA had paid RSA $10 million. The NSA wanted to give its algorithm an early boost to improve the chances of NIST approval. The gambit worked. After RSA adopted Dual_EC, various government agencies followed suit. The NSA “then cited the early use of [Dual_EC] inside the government to argue successfully for NIST approval,” Reuters reported.

Less than a year later, security expert Bruce Schneier, citing an August 2007 presentation by two other researchers, announced that Dual_EC contained “a weakness that can only be described as a backdoor.”

The warnings took years to stick. Only in September 2013, after documents leaked by Edward Snowden pointed to a deliberately weakened standard, did RSA tell its customers to stop using Dual_EC. NIST announced that it was investigating the NSA’s work on the algorithm and promised to put in place measures to prevent approvals of flawed code in the future. Heninger called the entire episode “a tragedy.” NIST, which had long been a trusted partner of the security community, was as much a victim of the NSA’s deception as RSA, which denied knowing of Dual_EC’s weakness, and its customers.

“The biggest impact that you see from [Dual_EC] is that people don’t look at the government as a trusted arbiter of these things anymore,” said Green.

A NIST spokesman said he would try to find an expert to comment on the agency’s work with the NSA, but he did not arrange an interview by this story’s deadline. The NSA did not respond to multiple requests for comment about Dual_EC and its continuing cryptography work.

On June 26, NIST officially abandoned Dual_EC, eliminating it from the latest version of its recommendations for random-number generators.

For years, the NSA worked with the security community to harden systems, building on its expertise as a wartime codebreaker. But as commercial cryptography spread in the decades after World War II, the NSA adopted the view that weakening encryption was better for national security than strengthening it, because weak encryption would let the agency snoop on foreign users of commercial technology.

Encryption was considered as dangerous an export as weapons of war.

Heninger said that “this underhanded activity, where companies can’t trust the NSA as a contractor [and] NIST can no longer trust the NSA,” was a tremendous loss for the security industry. Some experts have suggested abandoning other encryption standards with known NSA input, even though those standards are not known to be backdoored. “That’s actually causing a decrease in security,” she said.

The NIST–NSA drama still reverberates in Washington today. On June 11, the House passed a surveillance-reform amendment to a bill funding, among other things, the Department of Commerce, which houses NIST. That amendment, from Reps. Zoe Lofgren (D-Calif.) and Thomas Massie (R-Ky.), would prevent NIST from approving any cryptographic standards that weakened encryption.

Heartbleed as a harbinger

Government efforts to weaken encryption go far beyond NIST. One of the earliest clashes between government and industry over commercial encryption involved laws restricting the export of strong cryptography. For decades, the government had severely limited the kinds of encryption that could be discussed, published, and exported. The intelligence agencies didn’t want U.S. tech companies making it easier for their overseas enemies to protect themselves. Cryptography appeared alongside tanks, bombs, and missiles on the United States Munitions List. Encryption was considered as dangerous an export as weapons of war.

The government also defined “export” to mean any process of making something available to a foreign person, essentially banning cryptographers from publishing their work on the nascent Internet unless it met certain “export crypto” restrictions. It was no coincidence that these restrictions watered down the strength of commercial encryption to levels that the NSA could crack.
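
The scale of that weakening is easy to quantify. The export rules of the early 1990s generally capped exportable symmetric keys at 40 bits, a limit chosen so that a determined, well-funded agency could search the entire keyspace. A back-of-envelope sketch (the guess rate is an assumption for illustration, not a measured capability):

```python
# Why 40-bit "export grade" keys were weak by design: the keyspace is small
# enough to search exhaustively. The guess rate is an illustrative assumption.
keyspace = 2 ** 40               # ~1.1 trillion possible 40-bit keys
guesses_per_second = 10 ** 9     # assumed rate for dedicated cracking hardware

hours = keyspace / guesses_per_second / 3600
print(f"{hours:.2f} hours to try every key")   # ~0.31 hours
```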


The export-crypto restrictions suffered a fatal blow on May 6, 1999, when the U.S. Court of Appeals for the Ninth Circuit issued a ruling in Bernstein v. United States Department of Justice. The court said that computer code was speech and that the government’s restrictions were tantamount to censorship. “Because the [export-crypto system] applies directly to scientific expression, vests boundless discretion in government officials, and lacks adequate procedural safeguards,” the court said, “we hold that it constitutes an impermissible prior restraint on speech.”

The government could no longer require cryptographers to check with it before publishing code or ban publication of that code if it didn’t like what it saw.


Cindy Cohn, the lawyer who represented encryption developer Daniel Bernstein in the case, is now the executive director of the EFF. She said that the arguments she faced from the Justice Department in court were “remarkably similar” to what the government is saying now about backdoors.

“The government’s position was that bad guys could use crypto and so therefore they needed to regulate it,” Cohn said, “and our position was good guys need crypto in order to protect us from bad guys.”

Bernstein forced the government to substantially scale back the export-crypto restrictions, and the rules eventually disappeared altogether. But the legacy of that era, in which only weak cryptography was permitted to spread around the world, continues to plague businesses and consumers today.


A team of researchers that included Green and Heninger has connected two major Internet security vulnerabilities called FREAK and Logjam to flaws in weak cryptography left over from the 1990s. The attacks, which target the export-grade cryptography that was developed to pass government muster, sent a shockwave through the security world and forced major U.S. tech companies to scramble for a fix.

“This was not known to be broken or breakable in the ’90s,” Heninger said, “but in the past couple of months there have been several really serious security vulnerabilities due to the fact that these cipher suites are still around and being used unintentionally.”
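
For operators who could not immediately upgrade, the blunt mitigation was to strike the leftover suites from the negotiable set. Here is a minimal client-side sketch, assuming Python’s standard-library ssl module (a wrapper around OpenSSL) and using example.com as a placeholder host; current OpenSSL builds have removed the export ciphers outright, so the extra exclusions are harmless there, but older stacks needed them spelled out:

```python
# Refuse 1990s export-grade and other weak cipher suites explicitly. FREAK
# and Logjam both worked by steering a handshake down to these deliberately
# weakened suites, so the fix on older stacks was to make them unnegotiable.
import socket
import ssl

ctx = ssl.create_default_context()
ctx.set_ciphers("HIGH:!EXPORT:!LOW:!aNULL:!eNULL")  # no export or low-grade suites

with socket.create_connection(("example.com", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        # Report what was actually negotiated: (cipher name, protocol, bits).
        print(tls.cipher())
```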

The attacks that prey on export-grade cryptography point to the lasting impact of the same government attitude that now motivates the push for backdoors. A third attack, unrelated to export crypto, offers an example of a devastating flaw that backdoors would make immeasurably worse.

The attack, called Heartbleed, targets the massively popular OpenSSL encryption library (the technology behind secure HTTPS connections on the Web). Its discovery in April 2014 practically set the Internet on fire. Green saw Heartbleed as a warning against backdoors, because they would only make it easier for hackers using a similar exploit to reach more victims.
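
The flaw itself was mundane. OpenSSL’s heartbeat feature echoes back a payload whose length the sender declares, and the vulnerable code trusted that declared length instead of checking it against the bytes actually received, letting an attacker read up to 64 kilobytes of adjacent server memory per request. The toy model below is Python rather than OpenSSL’s C, and the “secrets” are invented, but it captures the shape of both the bug and the one-line fix:

```python
# A deliberately simplified model of Heartbleed: the victim echoes back as
# many bytes as the sender *claims* to have sent, reading past the real
# payload into adjacent memory. The secret data is invented for illustration.

SECRETS = b"user=alice;session_key=0xDEADBEEF;"   # adjacent heap contents

def heartbeat(payload: bytes, claimed_len: int, patched: bool) -> bytes:
    memory = bytes(payload) + SECRETS   # the payload sits next to secret data
    if patched and claimed_len > len(payload):
        return b""                      # the fix: discard malformed heartbeats
    return memory[:claimed_len]         # the bug: trust the declared length

print(heartbeat(b"hello", 5, patched=False))    # b'hello' (a normal echo)
print(heartbeat(b"hello", 40, patched=False))   # echo plus the adjacent secrets
print(heartbeat(b"hello", 40, patched=True))    # b'' (request rejected)
```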


“It’s not going to be a Heartbleed that affects a few people,” Green said of an attack using backdoors. “It’s going to be a Heartbleed that affects everybody.”

To Green, the human fallibility exposed by Heartbleed, FREAK, and Logjam was the biggest reason to avoid backdoors. “We know we get things wrong,” he said. “When we get things wrong in the future with backdoors, it’s going to be everybody’s problem, everybody in America. And that’s a really scary thing.”

“Someday maybe we’ll be good enough to not have problems in our crypto,” he said, “but we’re not that good now.”

The FBI says that it wants backdoors because it needs universal access to facilitate targeted enforcement. But Cohn pointed out that this philosophy does not apply in any other realm of law enforcement. “We don’t make people live in houses with their doors unlocked just in case they happen to be a thief and we need to investigate them,” she said.


Cohn called Comey’s “going dark” speech “really frustrating.”

“It was our first fight,” she said, referring to the EFF’s work on encryption in the 1990s. “We thought we won it.”

Despite losing the export-crypto fight, the government pressed ahead with other methods of weakening encryption. In September 2013, the New York Times revealed the existence of an NSA program called “Bullrun” based on documents provided by Snowden. As part of the program, the NSA “began collaborating with technology companies in the United States and abroad to build entry points into their products.”

“Having lost a public battle in the 1990s to insert its own ‘backdoor’ in all encryption,” the Times reported, the NSA “set out to accomplish the same goal by stealth.”


“The Snowden revelations show you very clearly how much the U.S. government cares about encryption or its lack thereof,” said Tien. “So much of what we have seen [involves] exploits and vulnerabilities designed to get around crypto.”


As their agencies work behind the scenes to break companies’ codes and twist their arms, Comey and Rogers have also tried one more approach: publicly shaming the companies by arguing that their resistance to backdoors amounts to shielding criminals.

“He’s not threatening to regulate them,” Cohn said, referring to Comey. “He’s trying to bully them into dumbing down the cryptography they make available to us.”


You can’t legislate the weather

Security researchers don’t often wade into highly charged political fights. They work with code, and code is apolitical. But the security researchers who have followed the encryption debate say they’re tired of watching the FBI either misunderstand or dismiss the technical reality surrounding backdoors.

“A lot of the arguments you hear from the FBI and the NSA come from people most separated from technical reality,” Hall said.

One major frustration is that the government hasn’t laid out the specifics of its desired intercept solution.


Comey and Deputy Attorney General Sally Yates continued to duck questions about specifics at a July 8 Senate Judiciary Committee hearing, one of two Senate encryption hearings at which Comey testified that day.

“We’re not ruling out a legislative solution,” said Yates, the second-ranking official at the Justice Department, “but we think that the more productive way to approach this … is to work with the industry” on per-company solutions.

“The government’s position was that bad guys could use crypto and so therefore they needed to regulate it, and our position was good guys need crypto in order to protect us from bad guys.”

Later in the hearing, Comey suggested “some sort of regime where it’s easier to compel people to unlock their devices,” though he acknowledged the inherent Fifth Amendment concerns of self-incrimination.


Perhaps the Snowden leaks and general public anxiety about surveillance have made the government cautious. Perhaps it wants to cut deals with specific segments of the tech industry and it worries that going public with its ideas will make private negotiations harder. Schoen said that he had heard of “informal contacts” between the government and tech companies, with officials “contacting businesses and saying, ‘We find such and such product objectionable’ or ‘We want to make such and such a request about a product.’”

But there is another possibility. Specific proposals expose the government to specific criticism, and when that criticism comes from respected researchers, it can kill a proposal in its infancy. For such a small device, the Clipper chip certainly casts a long shadow.

“If you say, ‘Well, I think that people’s crypto keys should be copied in this way and transformed in this way, and then stored by this entity,’” Schoen said, “then it’s very easy for security experts to say, ‘That’s a terrible risk for reasons X, Y, and Z, because attackers can attack it in this way.’”

Instead of offering specifics, FBI officials say they want more dialogue with Congress, the security community, and the public. But to many researchers, that looks like little more than a stalling tactic.


“We’ve been talking about this for 30 years,” said Hall, “and we have to see a serious fucking technical proposal from them about how they would actually do this.”

Rep. Hurd thought that the FBI’s problem was that it was moving too fast.

“I don’t think they know what the standard is,” he said. “If leading cryptographers don’t have an idea on how to make something work, then I don’t think that the FBI has a solution, a crystallized solution, on what they’re seeking. They have probably come out a little bit too premature with their conversations.”

“I’ve asked a lot of the questions of what, specifically, are they asking [for],” Hurd said, “and I haven’t gotten a very clear answer.”


But a lack of specifics is only part of what bothers these researchers. They are also annoyed at the FBI for apparently forgetting the Crypto Wars and their lessons altogether.

In a blog post on the EFF’s website titled “Eight Epic Failures of Regulating Cryptography,” Cohn compared two quotes from senior FBI officials delivered 15 years apart.

On Sept. 27, 2010, FBI General Counsel Valerie Caproni said of tech companies, “They can promise strong encryption. They just need to figure out how they can provide us plain text.”

On May 11, 1995, FBI Director Louis Freeh said, “[W]e’re in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want to make sure we have a trap door and key under some judge’s authority where we can get there if somebody is planning a crime.”


“Apparently in 1995 it was called a trapdoor,” Cohn said. “We call it a backdoor. Now they call it a frontdoor. It really doesn’t matter. The argument doesn’t work anymore. It never worked.”

A report from New America’s Open Technology Institute released in mid-June underscores the privacy community’s frustrations at seeing this debate resurface. The report, “Doomed to Repeat History? Lessons From the Crypto Wars of the 1990s,” touches on CALEA, the Clipper chip, export-crypto rules, and numerous other flashpoints in a debate that many participants assumed was settled.

“By the end of the 90s, after nearly a decade of debate, there was a broad bipartisan consensus that policies intended to weaken or restrict access to strong encryption were bad for privacy, bad for security, bad for business, and a bad strategy for combatting crime,” Kevin Bankston, OTI’s director, said in a statement. “Encryption backdoors are just bad policy, period, and that’s as true now as it was twenty years ago—even more so, when we need strong encryption to protect us from a growing range of cyberthreats.”

To Cohn, the quotes in her blog post underscore the extent to which the FBI is living in the past. “In some ways,” she said, “it’s the same arguments as if the world was still the same as in the 1990s.”


But the encryption debate is starkly different from what it was in the 1990s. These days, Cohn said, “we live in a world that’s dripping with encryption.” Everyone understands, on at least a basic level, that security matters. Cohn said that this awareness “makes it easier for us to explain why the government’s position is so wrong.”

Yet paradoxically, Tien said, the government is even more insistent about backdoors now than it was during the first Crypto Wars. “They seem to view it as an entitlement now,” he said. “They have become, in some ways, more hardened, more extreme, more out of touch with the reality of the everyday world.”

To researchers who have spent their careers studying code, the FBI’s belief that it can shut down the development of strong cryptography is ludicrous. Code, after all, is just math.

“This all requires an idea that people just won’t innovate in areas where the government doesn’t like them to,” said Cohn. “And that’s really never been the case.”


Hall said, “You’re basically trying to prevent people from doing certain kinds of math.”

Philip Zimmermann, the creator of the widely used PGP encryption software, neatly summed up the problem with government encryption limits when he told a 1996 Senate subcommittee hearing, “Trying to stop this is like trying to legislate the tides and the weather.”

The mathematical nature of encryption is both a reassurance that it cannot be banned and a reminder that it cannot be massaged to fit an agenda, even an agenda ostensibly meant to save lives. Software engineers simply cannot build a backdoor that does what the FBI wants without introducing serious security vulnerabilities. It is no more possible for the FBI to design a secure backdoor than it is for the National Weather Service to stop a hurricane in its tracks.

“No amount of presto change-o is going to change the math,” Cohn said. “Some people say time is on their side; I think that math is on our side when it comes to crypto.”


Civil-liberties groups and security experts do recognize the government’s need to collect information in order to stop criminals and terrorists. But they expressed skepticism that the FBI’s access to that information was as limited as the bureau claimed.

“The reality is they have access to more information today than they ever have,” Guliani said.

While the bureau might be facing difficulties with specific communications streams, Schoen said, “the overall trend” was a positive one for law enforcement. Facebook and Google are, after all, multibillion-dollar businesses built on the data we provide. But it was easy for Schoen to see how limited instances of “going dark” could spark agency-wide panic. “They feel like their information flows and their knowledge about people and their ability to monitor people should only grow and never contract or diminish in any respect,” he said.

There is some evidence that the “going dark” phenomenon has been overhyped. According to the 2013 Wiretap Report, state law-enforcement officials encountered encryption in just 41 wiretaps that year, and they were unable to break the encryption in only nine of those cases. In the 2014 report, state law-enforcement officials couldn’t break the encryption in just two of the 22 wiretaps in which they encountered it.

Apple CEO Tim Cook

Jason Reed

Private-sector pressure

In the late 1990s and early 2000s, a powerful group of stakeholders largely remained silent as the government implemented and then reinterpreted CALEA. Now, with consumers learning more about encryption and foreign companies posing more serious competition, the tech giants of Silicon Valley are done staying on the sidelines.

Apple, Facebook, Google, Microsoft, Twitter, Yahoo, and other leading tech companies have repeatedly released statements supporting strong encryption standards and dispatched top executives to security conferences to rail against backdoors. The companies partnered up to send that May 19 letter to President Obama, which Comey called “depressing” in his Georgetown speech. In early June, the Information Technology Industry Council and the Software & Information Industry Association, two major trade groups, sent another letter to Obama asking him “not to pursue any policy or proposal that would require or encourage companies to weaken these technologies, including the weakening of encryption or creating encryption ‘work-arounds’ [sic].”


“We think the government was wrong then, and they’re wrong now.”

In addition to principled concerns about privacy and security, the tech companies are also worried about the economic effects of being forced to build backdoors. By one industry estimate, American businesses stand to lose more than $35 billion through 2016 as a result of the Snowden leaks, largely because foreign customers are wary of buying NSA-accessible systems. While Americans are better protected from the NSA than foreigners are, they, too, are worried about their privacy and security. The backdoor debate is just the latest iteration of Silicon Valley’s ongoing frustration with government surveillance requirements.

It’s no surprise, then, that the companies have lobbied fiercely against a backdoor mandate. Perhaps the most memorable example is Apple CEO Tim Cook’s speech at the White House cybersecurity summit in February. Cook was one of the only major tech CEOs to attend the summit; the heads of Facebook, Google, and Yahoo skipped the event and sent deputies instead, in what was viewed as a sign of displeasure at the White House for entertaining notions like backdoors. When Cook took the stage at Stanford University, where the event took place, he left no doubt that he felt the same way.

“People have trusted us with their most personal and private information, and we must give them the best technology we can to secure it,” Cook said. “Sacrificing our right to privacy can have dire consequences.”


“If you put a key under the mat for the cops, a burglar can find it, too.” 
   —Tim Cook

Cook went even further in a speech at the EPIC Champions of Freedom Awards Dinner in early June. “Some in Washington are hoping to undermine the ability of ordinary citizens to encrypt their data,” he said in a prerecorded video. “We think this is incredibly dangerous.”

“If you put a key under the mat for the cops, a burglar can find it, too,” Cook said. “Criminals are using every technology tool at their disposal to hack into people’s accounts. If they know there’s a key hidden somewhere, they won’t stop until they find it.”

The Daily Dot asked Apple, Google, and Microsoft for interviews with company executives about encryption and backdoors. Representatives for all three companies declined to make anyone available for interviews, but they pointed to various comments that reflected their corporate stances on the issue.


A Google spokeswoman directed the Daily Dot to comments made by Richard Salgado, Google’s director of law enforcement and information security, and David Lieber, the company’s senior privacy policy counsel, in a Reddit AMA.

“The security encryption provides is fundamental to our services,” Salgado wrote in response to a question about current authority to mandate backdoors, “and the government could not force us to change that by weakening or introducing a vulnerability into our encryption.”

In another comment, Salgado was even more blunt: “We don’t build backdoors into our services. We do not have a ‘surveillance portal.’”

A Microsoft representative pointed the Daily Dot to two resources: a World Economic Forum Q&A with Brad Smith, general counsel and executive vice president of legal and corporate affairs at Microsoft, and a blog post by his deputy, John Frank.


“The path to hell starts at the backdoor,” Smith said at the World Economic Forum. “You should not ask for backdoors. That compromises protection for everyone against everything.”

In his blog post about transparency, Frank wrote that backdoors and other anti-encryption solutions “should concern all of us.”

It is not as if these companies weren’t concerned in the 1990s. Rather, the early CALEA debate simply exempted them, allowing them to stay on the sidelines and avoid angering the U.S. government.

“The Microsofts of the world were not going to be worried about it” in the 1990s, Tien said. “Now they have to be.”


Two things have changed since the first Crypto Wars: The government is now targeting Internet services instead of phone companies, and the Snowden leaks have convinced more Americans to care about personal security and encryption.

“When you were talking about the first CALEA, it was a handful of telephone companies,” said Guliani. “Now, when you’re talking about potentially applying a front door or backdoor … onto a lot of different Internet-based companies, the potential to kill innovation and … stifle smaller companies is substantial in this circumstance.”

Just as important as the threat to innovation is the threat to user trust. Snowden’s documents, which exposed the breadth and depth of U.S. surveillance, were a wake-up call for Americans who hadn’t been paying much attention to their privacy. Suddenly, companies faced enormous pressure to resist things like backdoors—and to prove that they weren’t secretly holding doors open for the government while protesting in public.


“They don’t want people to be afraid of their smartphones,” Rep. Chaffetz said of the tech companies. “They want people to have the confidence that what they type and what they see on their own phone is their own business. And I think that’s a reasonable request.”

The Snowden disclosures, said Cohn, “really put a lot of support behind the idea that the government is going to take every advantage that it can to try to get access to our information, and so we need to take real steps to protect ourselves.”

Guliani noted that people are more concerned—and tech companies are under more pressure—in part because of how inescapable online banking and other sensitive transactions have become.


“[The] security of Internet transactions, mobile-phone data, all of that is much more important [now] given how critical that information is and how much more of that information is available,” she said.

Naturally, U.S. companies want to keep their American customers happy, but the backlash to backdoors is driven just as much, if not more, by concerns about international sales.

“Even if Americans were comfortable with the FBI having backdoors—and I don’t think they are—you can be damn sure that the rest of the world is even less happy about the idea that there’s some sort of open season on foreign communications inside of U.S. communications networks,” Tien said.

“The tech industry,” said Laperruque, “has had a lot of problems abroad because of the NSA revelations of the past couple years.”


Tien said that he and others at the EFF were “glad that the companies are as active as they have been lately.” Whether it was mostly due to principles or pragmatism, the fact that they were speaking up was what mattered most.

“In a modern era of highly intermediated communications,” Tien said, “we actually need for the intermediaries, the communications companies, to stand up for their users, stand up for Internet rights.”

The murky way forward

No one knows where the encryption debate is headed. One question raised by Bernstein, the case that struck down the government’s export-crypto rules, is whether Congress can mandate backdoors at all. Cohn, who helped win that case, said that it depended on how lawmakers crafted the bill.


Congress probably can’t require tech companies to get their encryption approved by the government—a system in which the government could simply keep disapproving encryption without backdoors until companies relented and added them. That system would constitute a prior restraint on speech, as in Bernstein and a case in another judicial circuit, Junger v. Daley, in which an appeals court held that “computer source code is an expressive means for the exchange of information and ideas about computer programming” and thus “protected by the First Amendment.”

“We don’t build backdoors into our services. We do not have a ‘surveillance portal.’”

Congress could avoid the prior-restraint problem by outright banning uncompromised cryptography (there would be no pre-approval of code), but then it would face an even tougher challenge. Bernstein and later software-code cases have created a legal regime in which code is considered speech. When the government tries to regulate speech because of its content, it must satisfy the most demanding judicial test: it must show that the regulation is narrowly tailored to serve a compelling government interest. This standard, called strict scrutiny, is the main legal bulwark against wholesale censorship of things like inflammatory political speech.

“I think a broad ‘must have backdoor’ mandate would be massively overbroad and not meet the stringent First Amendment standards that would apply,” Cohn told the Daily Dot via email.


Still, there might be other ways to mandate backdoors in practice without writing a law that did so explicitly, though security experts weren’t sure what such a mandate would look like. They noted that the FBI has been careful to avoid suggesting it wants such a mandate; instead, it has suggested that it hopes the tech industry will come around of its own volition. Given the rhetoric from companies like Apple—and the peer pressure that the loudest voices implicitly exert on the quieter ones—voluntary industry cooperation seems unlikely.

It is also unclear whether there is an appetite in Congress for taking any action on this issue. The offices of Senate Majority Leader Mitch McConnell (R-Ky.) and Minority Leader Harry Reid (D-Nev.) did not respond to requests for comment about the tech companies’ May 19 anti-backdoors letter.

“There’s no official proposal or request or anything in front of Congress,” Rep. Hurd said. “I think anybody who’s even entertaining this idea recognizes that this is a non-starter.”

If a backdoor mandate somehow became law and survived legal challenges, the world of encryption would bifurcate. Tech companies wouldn’t be allowed to build strong, backdoor-free encryption into their products, but researchers could still develop and publish robust cryptographic standards on their own.


“It’s going to be fine for the tinkerers,” Hall said. “The trick is [that] you’re going to dramatically reduce the ability for folks to use it in commerce.”

That’s a major problem, Hall said, because cryptography works best when it is widely disseminated, evaluated, improved, and implemented. “Crypto is one of those things where, when you do it once, and you do it right, you want to make sure that you use that one thing,” he said, “because you want to have the scrutiny of the masses in order to make sure that you understand the weaknesses in a given cryptosystem.” If tech companies couldn’t use strong encryption, they wouldn’t investigate it—and potentially discover flaws in it, thus aiding the researchers—in the first place.

“It would be greatly disheartening to live in a country that weakened encryption and weakened software security being widely deployed,” said Heninger. She pointed to the abundance of data breaches occurring in a world where strong encryption is legal and said that things would only get worse if strong encryption were banned. “We would almost certainly see more possible breaches and more disasters in the long run,” she said, “due to various weakened security measures in deployed cryptosystems.”

This is a world that researchers dread. Hall said he understood the concerns that fueled an interest in backdoors. He recognized that his recommendations would mean “sacrific[ing] being able to get access to some information in very acute, very emotional, very visceral kinds of cases—child porn, child abduction, terrorism, these kind of things.” But, he said, “to undermine the cybersecurity and the trustworthiness of everything else to do that is just absurd.”


Still, Hall looked around the world—at the Charlie Hebdo attacks and the murder of a British soldier that have prompted anti-encryption laws in France and the U.K., respectively—and acknowledged that there would be increasing pressure to regulate the cryptography shrouding similar attackers’ communications from law enforcement.

As CALEA-era arguments rear their heads again—the same words coming out of new mouths—Cohn sounded like a veteran military commander reluctantly gearing up once more.

“We think the government was wrong then, and they’re wrong now,” she said. “But we may have to spend a lot of energy to fight a war that we already won.”

Update 2:41pm CT, July 10: Added 2014 Wiretap Report details.


Correction: Microsoft’s deputy general counsel is John Frank.

Illustration by Jason Reed

 