Encryption is finally mainstream.
Government officials and technologists have been debating since the early 1990s whether to limit the strength of encryption to help the law-enforcement and intelligence communities monitor suspects’ communications. But until early 2016, this was a mostly esoteric fight, relegated to academic conferences, security agencies’ C-suites, and the back rooms of Capitol Hill.
Everything changed in mid-February, when President Barack Obama’s Justice Department, investigating the terrorists who carried out the San Bernardino, California, shooting, asked a federal judge to force Apple to help the Federal Bureau of Investigation unlock one attacker’s iPhone.
What followed was an unexpectedly rancorous and unprecedentedly public fight over how far the government should go to pierce and degrade commercial security technology in its quest to protect Americans from terrorism.
But the San Bernardino iPhone dispute was only the most visible sign that a decades-old encryption war was not yet over. The second phase of this war had actually been raging for years. The rise of digitally savvy lone-wolf terrorists, combined with Silicon Valley’s increasing adoption of unbreakable encryption in the aftermath of the 2013 Edward Snowden intelligence leaks, had merely heightened existing and unavoidable tensions between the technology industry and government investigators.
Security agencies accustomed to being able to open any safe balked at the notion—advanced by American tech giants like Apple and their allies in the cryptography and privacy communities—that encryption was sacrosanct. City, state, and federal law-enforcement officials began pushing Congress to require that tech companies only use encryption they could break, in case investigators needed to serve them with warrants for user data.
The computer-science professors, cryptographers, and digital-rights advocates who had beaten back similar demands in the 1990s experienced deja vu. That series of fights, which they called the “Crypto Wars,” had only been the first round of a prolonged conflict. Now it was time to settle in for Crypto Wars, Round 2.
But where and when did this new phase begin? Pinpointing that exact moment may be impossible, but the timeline below provides a relatively comprehensive overview of the different fights that constitute the new Crypto War, from the early months after the Sept. 11, 2001, terrorist attacks to the present day.
Jan. 9, 2003: The Justice Department drafts the Domestic Security Enhancement Act of 2003, aka “Patriot Act II”
As Congress debated the USA Patriot Act after the Sept. 11, 2001, terrorist attacks, supporters of strong encryption worried that lawmakers would use the climate of fear and anxiety to slip in a provision targeting encryption. That didn’t happen; the Patriot Act didn’t address the subject at all. But nearly two years after it took effect, President George W. Bush’s Justice Department prepared a successor bill, the Domestic Security Enhancement Act of 2003, that did address encryption—in a major way.
A Jan. 9, 2003, draft of the bill, first obtained by the Center for Public Integrity, included a provision adding a minimum of five years to the sentence of any convicted felon who used encryption to conceal “incriminating communication or information” related to their crime.
The bill, which critics dubbed “Patriot Act II,” never became law, but it was the government’s first foray into encryption regulation in the modern era—and like the eventual Apple–FBI fight over the San Bernardino iPhone, it took place in a climate of intense terrorism anxiety.
August 2007: Security researchers expose backdoor in NSA-backed cryptographic standard
In March 2007, the National Institute of Standards and Technology published a document describing four methods of generating pseudorandom numbers. Random-number generation underpins modern cryptography: encryption scrambles people’s data using keys built from numbers that must be unpredictable, because anyone who can predict those numbers can reconstruct the keys and unscramble the data.
Five months later, cryptographers Dan Shumow and Niels Ferguson delivered a talk at a security conference outlining weaknesses in one of NIST’s four methods, known as Dual Elliptic Curve Deterministic Random Bit Generator (Dual_EC_DRBG). They were the first people to widely publicize the idea that Dual_EC might contain a backdoor. (Two members of a standards-writing body had previously written a little-known patent detailing the possibility of such a vulnerability.)
Reviewing their presentation, the eminent security researcher Bruce Schneier wrote that Dual_EC “contains a weakness that can only be described as a backdoor.”
Essentially, Dual_EC was designed so that its supposedly random output could be predicted by anyone who held a particular secret value. Whoever could do that could reconstruct the keys generated with Dual_EC and unscramble any data they protected.
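To see why a predictable generator is so damaging, consider a toy sketch. It uses Python’s ordinary, non-cryptographic random module and an invented “secret state,” not Dual_EC itself; the point is only that anyone who can reproduce the number stream can reproduce the keystream and read the supposedly protected data.

```python
# Toy illustration only: Python's Mersenne Twister stands in for a sabotaged
# generator, and SECRET_STATE stands in for knowledge of the backdoor.
import random

SECRET_STATE = 1337  # hypothetical: whatever lets an insider predict the output

def keystream(state: int, length: int) -> bytes:
    rng = random.Random(state)
    return bytes(rng.randrange(256) for _ in range(length))

plaintext = b"attack at dawn"
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream(SECRET_STATE, len(plaintext))))

# Anyone who can predict the generator regenerates the same keystream...
recovered = bytes(c ^ k for c, k in zip(ciphertext, keystream(SECRET_STATE, len(ciphertext))))
assert recovered == plaintext  # ...and reads the "encrypted" message
```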
In September 2013, the New York Times reported that the NSA had “aggressively pushed” NIST and the International Organization for Standardization to adopt Dual_EC as a cryptographic standard. “Eventually,” read a document provided to the Times by former intelligence contractor Edward Snowden, the “NSA became the sole editor” of the standard.
In December 2013, it emerged that the NSA had paid leading security firm RSA $10 million to add Dual_EC to one of its products. At the time, RSA’s decision had encouraged many others in the industry to adopt the code. RSA’s chief technologist later told Reuters that the company “could have been more skeptical of [the] NSA’s intention.”
In April 2014, NIST formally withdrew its recommendation of Dual_EC. NIST’s unwitting assistance in weakening global security seriously damaged its reputation as a reliable government partner for the security industry.
January 2008: FBI begins briefing lawmakers about the encryption threat
The first instance of the FBI using its now-famous name for encryption shrouding criminals’ communications—“going dark”—appears to have been in early 2008. In January, then-FBI Director Robert Mueller testified before both houses of Congress and included a “going dark” page in his briefing book, according to the Electronic Frontier Foundation, which obtained the documents through the Freedom of Information Act.
Senior FBI officials continued to meet with key lawmakers and committee staff to discuss “going dark.” Kerry Haynes, the bureau’s executive assistant director for science and technology, told members of Congress that “the ability of the FBI to collect intelligence and conduct investigations through the use of technology is shrinking ever[y] day.”
Included in the EFF’s FOIA documents was a page from the FBI’s internal wiki describing problems with lawful-intercept capabilities, which read:
In the face of more diverse and complex communications services and technologies, including the rapid growth in diverse protocols, proprietary compression techniques, encryption, and other technological factors, law enforcement is now faced with several especially daunting lawful interception challenges.
Feb. 17, 2011: The FBI’s top lawyer brings “going dark” to the public’s attention
Valerie Caproni, the FBI’s general counsel, provided the first lengthy explanation of the “going dark” problem at a House Judiciary Committee subcommittee hearing in February 2011.
At the time, Caproni downplayed the perceived threat of encryption itself. “Addressing the Going Dark problem does not require fundamental changes in encryption technology,” she told lawmakers. “We understand that there are situations in which encryption will require law enforcement to develop individualized solutions.”
That attitude would not last.
May 4, 2012: FBI pushes legislation to ensure a wiretap-friendly Web
While Caproni publicly promised that the FBI wasn’t seeking “fundamental changes in encryption technology,” her office prepared draft legislation that would force email providers, VoIP service operators, IM clients, and social networks to modify their services to ensure that they were wiretap-friendly. The bill would modify the Communications Assistance for Law Enforcement Act (CALEA), a 1994 U.S. law that requires phone and Internet providers to design their equipment to allow for law enforcement wiretaps.
The FBI had been working on the draft legislation for years, but by the middle of 2012, it was ready for primetime. Yet while the Justice Department signed off on the bill, the Obama White House declined to push it on the Hill. In the end, the legislation was never introduced.
June 6, 2013: The second Snowden document exposes a massive Internet surveillance program
For years, the NSA and the FBI directly accessed U.S. technology companies’ servers to scoop up their users’ data without a warrant. By relying on the program, codenamed PRISM, the agencies could circumvent warrant requirements and avoid alerting the companies altogether.
Silicon Valley exploded with outrage in response to the PRISM revelations, which came from former NSA contractor Edward Snowden’s trove of documents.
A few months later, more Snowden files revealed that the NSA had penetrated the links between data centers owned by Google and Yahoo around the world.
The PRISM and data-center leaks accelerated a trend toward strong encryption that would eventually create new problems for law-enforcement agents. In March 2014, partly in response to the leaks, Google announced that it was encrypting all Gmail data that flowed between its data centers. Yahoo followed suit a few weeks later.
Sept. 5, 2013: Snowden documents expose NSA campaign to break encryption
The same Snowden-sourced New York Times story that confirmed the NSA’s involvement in Dual_EC also exposed a vast anti-encryption operation called Bullrun.
“Having lost a public battle in the 1990s to insert its own ‘back door’ in all encryption,” the Times reported, “[the NSA] set out to accomplish the same goal by stealth.”
The Bullrun program involved “custom-built, superfast computers to break codes” and partnerships with U.S. and foreign technology firms “to build entry points into their products.”
“In some cases, companies say they were coerced by the government into handing over their master encryption keys or building in a back door,” the Times revealed. “And the agency used its influence as the world’s most experienced code maker to covertly introduce weaknesses into the encryption standards followed by hardware and software developers around the world.”
The Times story set off a firestorm. NIST declared that it “would not deliberately weaken a cryptographic standard” and explained that it was legally required to consult with the NSA on its cryptographic work. Security researcher and Johns Hopkins professor Matthew Green told the Times that “a number of people at NIST feel betrayed by their colleagues at the NSA.”
September 2014: Apple and Google add full-disk encryption to their mobile operating systems
Apple fired a shot across the bow of police everywhere when it announced on Sept. 17, 2014, that iOS 8, the just-unveiled update to its mobile operating system, included encryption designed to prevent the company from decrypting its users’ data.
iOS’s new full-disk encryption shut Apple out by tying the keys that protect data on iOS devices to the user’s passcode. Because only the user knew the passcode, no one else, Apple included, could derive the keys needed to decrypt the device.
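The underlying idea, roughly, is to derive the encryption key from the passcode itself. Apple’s real design also entangles the passcode with a device-unique hardware key, but a minimal sketch of just the passcode-derivation step, with illustrative parameters, looks something like this:

```python
# Minimal sketch, not Apple's implementation: derive a data-protection key from
# a passcode so that anyone without the passcode (the vendor included) cannot
# reconstruct the key. The iteration count and salt size are illustrative.
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 200_000) -> bytes:
    # A deliberately slow key-derivation function makes every guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)            # stored on the device, not secret
key = derive_key("4921", salt)   # only someone who knows the passcode gets this
```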
Google’s Android operating system already offered full-disk encryption as an option, but the company announced two days after Apple unveiled iOS 8 that it would soon become a default feature. It later backed away from this stance—unlike Apple, Google does not manufacture Android phones, and the companies that do worried about performance issues associated with full-disk encryption—but it returned to it in late 2015.
Jan. 12, 2015: U.K. prime minister proposes banning end-to-end encrypted apps
While promoting his government’s planned surveillance bill, U.K. Prime Minister David Cameron argued that messaging services without a decryption capability should be banned.
“In our country, do we want to allow a means of communication between people which … we cannot read?” Cameron asked.
If Cameron’s proposal became law, Facebook’s WhatsApp, Apple’s iMessage, and Telegram, among other popular services, would have to close. These apps use end-to-end encryption that even their creators cannot pierce.
Five days before Cameron’s speech, two terrorists killed 12 people and injured 11 others in an attack on the French satirical newspaper Charlie Hebdo. It was the first of several terrorist attacks over the next few years that officials would seize upon to push for expanded government surveillance powers.
Oct. 10, 2015: Obama administration says it won’t seek encryption backdoor legislation
Despite lobbying by officials like FBI Director James Comey, Obama’s White House decided against asking Congress for legislation mandating access to encrypted data.
The White House’s Office of Science and Technology Policy and the FBI had butted heads internally over the decision, the New York Times noted. The NSA and other intelligence agencies with powerful secret tools, meanwhile, “were less vocal.”
A few weeks earlier, the Washington Post had obtained a National Security Council memo outlining possible ways to guarantee access to encrypted data, including a split-key approach where the government held part of the key; a mandate for companies to add backdoors to their code; and a requirement that they build a special hardware port only accessible with a law-enforcement tool.
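The split-key option is essentially key escrow by secret sharing: no single party holds the whole decryption key. A toy sketch of that general idea, not the memo’s actual design and setting aside the operational risks critics raised, might look like this:

```python
# Toy XOR secret sharing: the key is split into shares so that only all the
# shareholders together can rebuild it. Illustrative only.
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, parties: int = 2) -> list:
    # All but one share are random; the last share makes the XOR equal the key.
    shares = [os.urandom(len(key)) for _ in range(parties - 1)]
    shares.append(reduce(xor_bytes, shares, key))
    return shares

def recombine(shares: list) -> bytes:
    return reduce(xor_bytes, shares)

key = os.urandom(32)
shares = split_key(key)          # e.g., one share for the company, one for the government
assert recombine(shares) == key  # neither share alone reveals anything about the key
```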
Oct. 20, 2015: Apple says it shouldn’t be forced to unlock New York drug suspect’s iPhone
In a little-noticed filing in a Manhattan district court that presaged the eruption of a global news event, Apple strongly objected to the government’s request for an order requiring the company to help police unlock a criminal suspect’s iPhone.
Citing the inapplicability of a 1789 law called the All Writs Act and the precedent that complying with the order would set for other types of demands, Apple indicated that it would fight the order if the magistrate judge issued it.
Oct. 27, 2015: White House petition for Obama to reject backdoors passes 100,000-signature mark
As officials like Comey continued to beat the drum for encryption backdoors, a petition organized by leading civil-liberties groups indicated that the public might oppose a backdoor push.
The petition on the White House’s website, which urged Obama to publicly oppose mandated access to encrypted data, crossed the 100,000-signature threshold on October 27, earning the right to receive a formal response from the administration.
In its initial response more than a month later, the White House dodged the substantive issue and merely promised to meet with the activists who launched the campaign.
Attendees at that meeting a few days later told reporters that the White House was poised to say more in the coming weeks. A senior administration official promised “a more fulsome response soon,” but as of this writing, that has not arrived.
Nov. 4, 2015: British government introduces Investigatory Powers Bill
While Americans debated encryption policy with their government, the United Kingdom’s ruling Conservative Party pursued a controversial expansion of counterterrorism powers that left the door open for British backdoor mandates.
The Investigatory Powers Bill immediately earned scorn from Apple and other tech companies for its vague language concerning encryption. Several parliamentary committees also recommended that the Home Office clarify what it expected from tech firms.
In a statement to the Daily Dot on March 31, a Home Office spokesperson rejected the notion that the bill envisioned backdoors, writing, “It maintains the existing obligation for telecommunications companies to assist in the execution of warrants which can themselves only be issued where necessary and proportionate.”
Of course, in order to comply with warrants targeting encrypted data, companies would need to design their encryption to facilitate that compliance—and that would amount to a backdoor in otherwise unbreakable encryption.
Nov. 13, 2015: Terrorists kill 130 people in Paris
Coordinated suicide bombings and shootings at a stadium, a restaurant, and a theater in Paris and a nearby suburb sent France into a national state of emergency, with the government issuing a curfew for the first time since World War II. As the Islamic State claimed responsibility for the deadliest attack in the European Union since the 2004 Madrid train bombings, Western nations braced for a new wave of ISIS-inspired extremism at home.
Almost immediately, American officials began proposing new surveillance powers, including encryption backdoors, to help catch extremists in the United States before they struck. Former intelligence-agency heads and the leader of the House Homeland Security Committee argued that unbreakable encryption increased the likelihood of a successful attack.
In the months that followed, encryption panic sometimes obscured the facts. Many media outlets ran with a story about an Islamic State encrypted messaging app called Alrawi that fed the narrative advanced by Comey and other opponents of unbreakable encryption. But the Daily Dot reported that the app was fake; the Android application file that purported to send and receive encrypted messages did no such thing.
Nov. 17, 2015: Congress starts looking at encryption
The first promise by a committee chairman to examine encryption came four days after the Paris attacks.
Both Richard Burr, the North Carolina Republican who chaired the Senate Intelligence Committee, and John McCain, the Arizona Republican who headed the Senate Armed Services Committee, promised to study the issue, with McCain calling the ability of companies to offer unbreakable encryption “unacceptable.”
Six days later, the heads of the House Homeland Security and Intelligence Committees also promised to examine whether unbreakable encryption constituted a national security threat. Both congressmen—Mike McCaul (R-Texas) and Devin Nunes (R-Calif.), respectively—sounded confident that this was the case.
Dec. 2, 2015: The San Bernardino shooting
The world was still reeling from the deadliest terrorist attack on an E.U. country in a decade when two ISIS-inspired jihadists shot and killed 14 people at a San Bernardino, California, health center in the deadliest terrorist attack on U.S. soil since Sept. 11, 2001.
Seven days later, at a Senate Intelligence Committee hearing with the heads of the major intelligence agencies, Comey revealed that the FBI had been unable to access an iPhone used by one of the now-deceased shooters, Syed Rizwan Farook.
Dec. 9, 2015: Senate Intelligence Committee leaders announce plans for backdoor bill
A few hours before Comey first publicly mentioned Farook’s now-infamous iPhone, Sen. Dianne Feinstein (D-Calif.), the top Democrat on the Senate Intelligence Committee, announced that she and Burr, the committee’s chairman, were working on a bill to guarantee investigators’ ability to read encrypted data.
“Today’s messaging systems are often designed so that companies’ own developers cannot gain access to encrypted content—and, alarmingly, not even when compelled by a court order,” Burr wrote in a Dec. 23 Wall Street Journal op-ed, which he concluded by saying, “It’s time to update the law.”
Dec. 17, 2015: Firewall maker reveals likely backdoor in its code
Almost two years after NIST told companies to stop using the Dual_EC random-number generator in their encryption, Juniper Networks announced the discovery of “unauthorized code” in several versions of its ScreenOS software, which ran on its NetScreen firewalls.
A few days later, security researchers revealed that the code was based on Dual_EC, the NIST pseudorandom-number generator that the NSA had secretly sabotaged with a backdoor.
In 2013, Juniper had defended its decision to continue using Dual_EC—despite the revelation of its weakness—by saying that it had paired Dual_EC in its firewall software with a much stronger number generator. But the researchers discovered that Juniper had actually added Dual_EC to its code more than a year after its flaws were publicly revealed.
Juniper removed Dual_EC and upgraded ScreenOS’s security in January, but the fallout from the incident provided a case study in the dangers of encryption vulnerabilities. The House Oversight Committee began investigating whether any federal agencies had used Juniper firewalls, raising the possibility that, with its encryption backdoor, the NSA had exposed the federal government to hackers.
Dec. 23, 2015: China says it modeled its new encryption policy on U.S. law
China’s recently passed counterterrorism law, which included vague language that could morph into a backdoor mandate, earned it significant criticism, but Beijing stressed that it was only following the lead of the United States.
Foreign Ministry spokesman Hong Lei told Reuters that the government had modeled its law in part on CALEA, the U.S. law that mandates wiretap access on Internet- and phone-provider networks.
“While formulating this law,” Hong said, “we referred to the laws of other countries, including the United States.”
The Chinese government’s move to further consolidate its control over tech companies in this way reflected the global nature of encryption and technology policymaking—and the possibility that future U.S. government actions to undermine encryption would spread overseas, finding especially receptive audiences in repressive regimes.
Dec. 27, 2015: Lawmakers unveil bipartisan proposal for encryption commission
Sens. Burr and Feinstein weren’t the only U.S. lawmakers working on legislation to address the challenges posed by encryption. As the new year approached, Sen. Mark Warner (D-Va.) and Rep. Michael McCaul (R-Texas) penned a Washington Post op-ed announcing their intent to convene a commission to study digital-security issues like encryption.
The authorizing legislation, unveiled on Feb. 29, proposed a 16-member commission that included one Republican and one Democratic appointee from each of eight fields.
As Warner and McCaul began announcing co-sponsors for their effort, civil libertarians decried the commission as a sham unlikely to produce a technologically serious or politically risky solution like banning backdoors.
Jan. 4, 2016: The Netherlands becomes the first country to formally reject backdoors
“[T]he government believes that it is currently not desirable to take legal measures against the development, availability and use of encryption within the Netherlands.”
With these words, the administration of Dutch Prime Minister Mark Rutte became the first national government to oppose encryption backdoors.
Jan. 12, 2016: French lawmakers consider backdoor proposal
The French legislature briefly flirted with adding a backdoor mandate to the government’s sweeping counterterrorism powers. Conservative deputy Nathalie Kosciusko-Morizet and 17 supporters added an amendment to France’s wide-ranging “Digital Republic” bill that would require tech companies to design their systems so that they could turn over encrypted data in readable form.
This effort fizzled two days later, when the French secretary of state described backdoors as a “vulnerability by design,” “inappropriate,” and “not the right solution.”
Jan. 21, 2016: U.S. state lawmakers propose bans on unbreakable encryption
California and New York became the first two states to consider backdoor mandates when state lawmakers there introduced bills banning unbreakable encryption. The New York proposal cited terrorism as the threat necessitating the legislation, while the California bill mentioned human trafficking.
Jan. 28, 2016: Senior Republican senator suggests that backdoors are the wrong approach
Senate Homeland Security Committee Chairman Ron Johnson (R-Wis.) broke with party orthodoxy when he criticized backdoors using the same arguments as civil libertarians and cryptographers.
Johnson pointed out that a U.S. law would push criminals onto foreign-made platforms, and he suggested that, after terrorist attacks, Americans should be wary of legislative “rush[es] to judgment.” Even more strikingly, he suggested that support for backdoors came from “just not understanding the complexity” of the issue.
Feb. 1, 2016: Study questions extent of “going dark”
Computer-security experts at Harvard University’s Berkman Center for Internet and Society produced a study that undercut the FBI’s “going dark” argument by explaining why fears of an encryption revolution were overblown. Among their findings: Unbreakable encryption is not likely to become the norm; metadata is not encrypted; and the growing Internet of Things is producing new pools of unencrypted behavioral and biological data all the time.
Feb. 1, 2016: Massachusetts judge secretly orders Apple to help cops access iPhone
In an opinion that remained sealed until April 8, Magistrate Judge Marianne Bowler ordered Apple to help Massachusetts police access data on an alleged gang member’s iPhone by providing “reasonable technical assistance.” The order did not require Apple to unlock the device or decrypt any encrypted data.
Most of the details in the case remained unknown as of this writing because many court documents were still under seal.
Feb. 10, 2016: House lawmaker introduces bill to prevent state encryption bans
Responding to efforts in California and New York, Rep. Ted Lieu (D-Calif.) introduced a bill to ban states from regulating the kind of encryption that companies could use. No state, the bill said, could require or even ask a company to “have the ability to decrypt or otherwise render intelligible information that is encrypted or otherwise rendered unintelligible using its product or service.”
Feb. 11, 2016: Study casts doubt on ability to ban unbreakable encryption
Opponents of encryption backdoors frequently assert that a U.S. law—or any national or regional law, for that matter—would only push criminals and terrorists to products and services created elsewhere.
Security expert Bruce Schneier and two other researchers lent scholarly credibility to this argument with a paper entitled “A Worldwide Survey of Encryption Products,” which identified 546 encryption products produced outside the United States.
The American government, the study warned, cannot control the global encryption environment on its own.
Feb. 16, 2016: The San Bernardino iPhone court battle erupts
In the early evening hours of an otherwise quiet Tuesday in February, a magistrate judge in Riverside, California, ordered Apple to write a custom version of its iOS mobile operating system that would disable several security features on an iPhone 5c running iOS 9.
This was the hardware and software configuration of San Bernardino shooter Syed Rizwan Farook’s phone, the one that Comey’s agents couldn’t access.
The FBI wanted to flood the phone with passcode guesses until it hit the right one and the device unlocked. But to do this, it needed Apple to override, among other things, the auto-erase function that wiped an iPhone after 10 incorrect guesses.
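The arithmetic behind that plan is simple: a four-digit passcode allows only 10,000 possibilities, which software can exhaust almost instantly, while a 10-guess erase limit makes the same search hopeless. A hypothetical sketch:

```python
# Rough illustration, not the FBI's tool: unlimited guessing cracks a four-digit
# passcode trivially; a 10-attempt auto-erase limit defeats the same search.
def brute_force(check_passcode, max_attempts=None):
    for attempt, guess in enumerate((f"{n:04d}" for n in range(10_000)), start=1):
        if max_attempts is not None and attempt > max_attempts:
            return None  # the device would have wiped itself by now
        if check_passcode(guess):
            return guess
    return None

secret = "7392"  # hypothetical passcode
check = lambda guess: guess == secret

print(brute_force(check))                   # "7392" after a few thousand tries
print(brute_force(check, max_attempts=10))  # None: the auto-erase limit wins
```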
Magistrate Judge Sheri Pym based her order on the All Writs Act, the law that Apple had already said was inapplicable in the New York drug case. Unlike her colleague in Manhattan, however, Pym quickly accepted the Justice Department’s argument in this highly charged terrorism investigation and granted its request for an All Writs Act order.
Apple CEO Tim Cook promised in a letter to customers that his company would fight the order, and Apple’s lawyers soon filed a scathing brief urging Pym to vacate her order. But even before Apple had submitted that brief, the Justice Department—in a rare preemptive rebuttal based on Cook’s letter—filed a motion to preserve the order that assailed Apple’s motives and reasoning. The battle was joined.
Polls taken in the first few weeks of the court fight offered mixed results, with two surveys backing Apple and one backing the FBI. In one of the surveys that found stronger support for Apple, 55 percent of respondents said they believed that the government was seeking a legal precedent to “spy on iPhone users.”
It was true that, across the country, Apple was fighting at least nine court battles over access to locked iPhones. Comey initially denied seeking a precedent with the San Bernardino order, but he eventually admitted that the order could “potentially” set a precedent and said, “I happen to think it’s a good thing.”
Feb. 29, 2016: New York judge denies government’s request for iPhone unlocking order
As the San Bernardino fight entered its second week, the magistrate judge in the New York drug case, James Orenstein, denied the Justice Department’s motion for an All Writs Act order.
“Congress has considered legislation that would achieve the same result [as the requested order] but has not adopted it,” Orenstein wrote in explaining his reasoning. This was a reference to two things: the fact that CALEA, the 1994 wiretapping law, exempted companies like Apple from having to provide the kind of assistance that the government demanded; and the fact that lawmakers had recently debated, but declined to proceed further on, backdoor mandates.
The government appealed Orenstein’s ruling to a different judge on his district court, and as of this writing, the appeal remained active.
March 1, 2016: Apple appeals San Bernardino court order
Apple formally appealed Judge Pym’s ruling and asked another judge on her district court to review it. The Justice Department soon responded with another brief urging the court not to do so. Then, the week before both parties were set to testify in Pym’s courtroom, Apple reiterated its position in a final brief that took the government to task for what it called “an exercise in wishful thinking, not statutory interpretation.”
Apple deployed several arguments to support its rejection of the order: That CALEA’s exemptions for companies like Apple indicated that Congress didn’t want to force it to design wiretap-friendly encryption; that writing the requested software would constitute a significant technical and logistical burden; that forcing engineers to undermine their own code violated the First Amendment by compelling adverse speech; that laws requiring companies to help execute warrants don’t cover writing new code; and that the test established by a landmark All Writs Act case did not properly cover Apple in this instance.
The company also warned the court that, if it were forced to write this code, a precedent would be set that could allow a judge in another case to compel the creation of more insidious code, such as surveillance malware that remotely activated a Mac’s webcam.
March 1, 2016: San Bernardino fight takes center stage in Congress
Lawmakers began weighing in on the Apple–FBI fight as soon as it erupted, but it wasn’t until a March 1 House Judiciary Committee hearing that Congress brought in Comey and Apple’s top lawyer for marathon testimony on the issue.
Lawmakers alternately grilled Comey on the troubling consequences of the San Bernardino order and praised him for dealing with a recalcitrant, holier-than-thou tech firm, depending on their ideological leanings and technical expertise.
Meanwhile, two of Congress’s most zealous privacy advocates, Sen. Ron Wyden (D-Ore.) and Rep. Zoe Lofgren (D-Calif.), promised to use the legal showdown as an opportunity to teach their fellow lawmakers about encryption.
March 3, 2016: Silicon Valley backs Apple in San Bernardino court fight—but other tech players stay silent
Apple assembled a diverse coalition of supporters in its fight against the San Bernardino court order, including social-media companies like Facebook and Twitter; groups representing Silicon Valley giants like Google and Microsoft; and the usual array of civil-liberties organizations. But Apple’s hardware competitors in the smartphone and computer spaces remained silent. The Daily Dot contacted all of the major players in these industries, and none of them offered a comment on the issue.
March 3, 2016: Law-enforcement groups back government in San Bernardino court fight
Apple’s critics didn’t sit idly by while tech companies rushed to embrace it. As the deadline for amicus briefs loomed, major national law-enforcement groups like the Federal Law Enforcement Officers Association and the National Sheriffs’ Association filed briefs backing the Justice Department.
In New York City, the Manhattan district attorney and the NYPD deputy commissioner assailed Apple for withholding its help. D.A. Cyrus Vance had frequently criticized the company for not helping New York police unlock phones in his cases, saying he was holding onto more than a hundred such phones that could not be accessed.
Relatives of some of the San Bernardino victims—but not all of them—also sided with the government.
March 13, 2016: The San Bernardino iPhone fight makes Last Week Tonight
These days, the test of an issue’s mainstream potency is its ability to fill the main segment on John Oliver’s late-night HBO show Last Week Tonight.
With tensions high and misinformation rampant, the former Daily Show host nodded at the raging debate by giving encryption his signature comedic treatment, topping it off with a fake Apple ad showing engineers desperately trying to head off attackers who keep puncturing their software’s defenses.
March 21, 2016: The Justice Department says that a third party has stepped forward with a possible way to unlock the San Bernardino iPhone
The day before the government and Apple were set to face off in a California courtroom over the San Bernardino shooter’s iPhone, Obama administration lawyers told the judge that a third party had presented them with an alternate unlocking method, possibly rendering their request moot and necessitating a delay in the hearing.
While FBI agents tested the method, security experts speculated on what it might involve. One suggestion: NAND mirroring, in which the device’s flash memory is copied onto an external source so that it can be reloaded every time 10 incorrect passcode guesses trigger a device erasure.
March 28, 2016: The Justice Department accesses the San Bernardino iPhone and drops its demand for the court order
The most famous judicial battle over encryption in U.S. history ended before it even reached a courtroom.
A week after it revealed the existence of the third-party technique, and 41 days after it secured the initial court order, the Justice Department told the court that it had accessed the San Bernardino shooter’s iPhone and asked the judge to vacate her order.
March 28, 2016: The FBI gets its first post-San Bernardino request from local police for help unlocking an iPhone
On the same day that the Justice Department abandoned its court fight with Apple, Arkansas police asked the FBI for help accessing an iPhone and an iPad used by two teenagers arrested for a double murder.
While Comey later revealed that the third-party technique was limited in scope to the iPhone 5c, the reminder that local police often asked their federal colleagues for help unlocking devices transformed the San Bernardino story into a new kind of security conversation.
March 31, 2016: The debate turns to vulnerability disclosure
As the dust from the San Bernardino fight settled, security engineers—especially those at One Infinite Loop—began processing an unsettling fact: Someone had given the U.S. government a previously unknown iPhone exploit. Now the question was, would the government tell Apple about the problem so that the company could fix it?
Several factors weighed against this outcome. For one thing, according to Reuters, the government didn’t own the technique—it had only bought the ability to use it—so it couldn’t submit the exploit to the White House-coordinated disclosure review process. Another problem was that the FBI had classified the tool that it designed with the exploit and used on the phone, limiting the government’s ability to speak publicly about it.
Security experts and technologists wasted no time in calling for disclosure of the vulnerability, arguing that Apple’s interest in locking down a flaw that could endanger vast numbers of its customers outweighed the government’s interest in retaining the ability to exploit the unpatched hole for occasional investigative purposes.
April 5, 2016: WhatsApp, with 1 billion users, deploys end-to-end encryption
Facebook’s massively popular messaging service WhatsApp locked down all of its users’ communications by implementing end-to-end encryption in early April. The feature was previously only available for communications between Android devices.
Moxie Marlinspike, who helped create the Edward Snowden–approved messaging app Signal, assisted WhatsApp in the process.
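“End-to-end” means the keys exist only on the users’ devices; the provider’s servers relay ciphertext they cannot read. The Signal protocol layers much more on top (prekeys, ratcheting, authentication), but a minimal sketch of the core key-agreement idea, here using an X25519 exchange from the third-party cryptography package rather than anything WhatsApp actually ships, looks roughly like this:

```python
# Minimal sketch of the idea behind end-to-end encryption, not the Signal
# protocol itself: each endpoint generates its own key pair, and the shared
# message key is computed on the devices, never on the server.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

alice_private = X25519PrivateKey.generate()  # never leaves Alice's phone
bob_private = X25519PrivateKey.generate()    # never leaves Bob's phone

# Only the public halves travel through the provider's servers.
shared_a = alice_private.exchange(bob_private.public_key())
shared_b = bob_private.exchange(alice_private.public_key())
assert shared_a == shared_b

# Both ends derive the same message key; the provider never sees it.
message_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"e2e-demo").derive(shared_a)
```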
Congressional pushback was swift, with Sen. Tom Cotton (R-Ark.), a national-security hawk, calling the move “an open invitation to terrorists, drug dealers, and sexual predators to use WhatsApp’s services to endanger the American people.”
April 13, 2016: Senators formally unveil bill mandating weakened encryption
After months of speculation and several days of analysis of a leaked draft, Sens. Richard Burr (R-N.C.) and Dianne Feinstein (D-Calif.) officially introduced their bill requiring companies to provide authorities with encrypted user data in an “intelligible” format or render any assistance necessary to make it readable.
Security experts immediately assailed the bill, dubbed the Compliance with Court Orders Act of 2016. University of Pennsylvania professor Matt Blaze said it was worse than the Clipper chip, the NSA backdoor device whose major flaw he famously exposed in the 1990s, leading the Clinton administration to shutter its plans to mandate the chip’s use in consumer technology.
Tech companies remained individually mum on the bill, but trade groups representing them did not hold back.
The Internet Association said it would “harm national security and put Americans at risk.” Reform Government Surveillance—whose members include Apple, Facebook, Google, Microsoft, and Twitter—signed a letter to Burr and Feinstein warning of its “unintended, negative consequences.”
Major law-enforcement groups, meanwhile, swiftly endorsed the bill, praising the senators’ effort to preserve authorities’ access to evidence.
April 19, 2016: San Bernardino iPhone search turns up nothing
FBI agents scouring Syed Rizwan Farook’s infamous iPhone didn’t find any evidence that he had communicated with other terrorists during the 18 minutes that authorities previously couldn’t account for, officials told CNN.
Officials described the anticlimactic results as a vindication of their desire to access the phone, saying that they could now discount scenarios involving unknown third parties that they hadn’t been able to rule out before.
The phone did contain other previously unavailable data, which is “still being analyzed,” CNN reported.
April 19, 2016: Law-enforcement officials struggle to grasp encryption’s technical reality
The House Energy and Commerce Committee’s encryption hearing was a study in contrasts. While technologists described the fundamental mathematical realities of encryption and the futility and inadvisability of attempts to weaken it, witnesses representing police agencies fumbled with strange and alarming technological analogies to make their case.
One witness shrugged off the security implications of weakening encryption by saying that tech companies could still protect their users’ data with “firewalls.” He later said that companies could strip the encryption from a single device in a safe way because they were doing it inside their own firewall, which he compared to unlocking a safe deposit box after locking the bank doors.
These and other arguments baffled the cryptographers who were watching—and attending—the hearing. Matt Blaze, the University of Pennsylvania professor who helped defeat the Clinton administration’s Clipper chip, said the firewalls comments had “baffled” him.
April 20, 2016: U.K. law-enforcement official confirms that new spy bill would let cops force companies to decrypt data
Chris Farrimond, the director of the National Crime Agency, confirmed the worst fears of technologists when he told members of Parliament that a section of the Investigatory Powers Bill, which the legislature was still debating, could be used to force the decryption of user data.
Under sections 217 and 218 of the bill, companies that receive “technical capability notices”—which can include directives to remove “electronic protection applied … to … communications or data”—must comply with those notices.
The British government previously told the Daily Dot that “the bill does not create backdoors” but rather “maintains the existing obligation for telecommunications companies to assist in the execution of warrants.”
But given that a requirement to comply with a demand to decrypt data equates to a requirement not to use unbreakable encryption, the bill would effectively outlaw unbreakable encryption, producing the same effect as a more deliberately worded backdoor mandate.
April 22, 2016: A third party presents the Justice Department with the passcode to the New York iPhone, ending another legal fight
The U.S. government dropped its request for an All Writs Act order in the New York drug suspect’s case after someone gave authorities the passcode to his iPhone. The Justice Department would not identify the individual who provided the code.
At the time of the government’s surprise Friday-night filing, a U.S. district court judge was reviewing its appeal of a magistrate judge’s ruling denying its request for the order.
April 25, 2016: Director of National Intelligence says Snowden leaks accelerated adoption of strong encryption
Director of National Intelligence James Clapper, who oversees the 17-member U.S. intelligence community, linked Edward Snowden’s disclosures about mass surveillance, hacking, and tapping into corporate data centers to the rise of unbreakable encryption, which he said was “not a good thing” for spies tasked with intercepting foreign adversaries’ communications.
“As a result of the Snowden revelations,” Clapper said at a Christian Science Monitor breakfast, “the onset of commercial encryption has accelerated by seven years.”
Clapper attributed the finding to an NSA analysis.
Snowden, a staunch advocate of strong encryption, responded by tweeting, “Of all the things I’ve been accused of, this is the one of which I am most proud.”
April 27, 2016: FBI says it won’t submit San Bernardino iPhone exploit to government review board
The FBI announced that it would not submit the tool that allowed it to access Syed Rizwan Farook’s iPhone to a White House-managed review process because it did not know enough about how the tool worked.
In a statement, Amy Hess, the FBI’s executive assistant director for science and technology, said that the FBI had purchased the ability to use the tool but not “the rights to technical details about how the method functions, or the nature and extent of any vulnerability upon which the method may rely in order to operate.”
Privacy activists and technologists had been pressing the FBI to submit the tool to the government’s Vulnerability Equities Process, which weighs whether to tell tech companies about flaws that police and spies use to hack into their products.
VEP reviews are seen by many observers as flawed because they give overwhelming weight to national-security professionals’ arguments against disclosure. But those observers had still hoped that the process might result in the government informing Apple of the flaws underpinning the San Bernardino technique.
The FBI’s conclusion that it lacked the information necessary for a thorough VEP review means that Apple may never learn what software or hardware flaw enabled the agency to penetrate the security on Farook’s device.
This timeline will be updated as the new Crypto Wars progress.
Correction: Niels Ferguson and Dan Shumow were the first to widely publicize the idea that Dual_EC might contain a backdoor. And the number of people killed in the attack in San Bernardino was 14.