“Make no mistake, criminals are listening to this testimony,” said Capt. Charles Cohen.
Cohen is in charge of the Indiana State Police Office of Intelligence and Investigative Technologies. He testified on Tuesday before the House Committee on Energy and Commerce that “criminals are using this as an education to make themselves more effective at the criminal tradecraft.”
Cohen’s particular fear is that formally and publicly explaining police objections to encryption technology is dangerous. In Cohen’s mind, there are criminals who will watch C-SPAN to learn about WhatsApp but somehow can’t Google “better encryption techniques” at home. The hearings opened with sweeping, scary pronouncements of risk, with only mild acknowledgment that encryption might protect non-criminals, too.
“I think we can stipulate that encryption is really great for people like me, who have bank accounts, who don’t want them to be hacked, but it’s just really a horrible challenge for all of us as a society, not just law enforcement, who have a child sex predator who’s trying to encrypt, or a terrorist,” stated Rep. Diana DeGette, D-Colo.
Child sex predators and terrorists came up a lot in the hearings, as the people our lawmakers least want to extend the shield of encryption to. Yet as DeGette noted even when raising these fears, encryption is a huge part of most people’s day-to-day life. Online banking is impossible without encryption to make it secure. That’s a technology that protects everybody from criminals.
The hearing ran for two and a half hours, filled with strained metaphors: encryption was called “a safe that police can’t open,” and even likened to an incomplete map, like the one left behind by Luke Skywalker in Star Wars: The Force Awakens.
There was a strange tension in the law enforcement portion of the testimony: witnesses wanted American companies like Google and Apple to stop selling foreign-made encrypted communication apps, while also wanting to make sure those companies don’t develop strong encryption themselves.
It’s also unclear how a hearing on encryption fell under the mandate of the Committee on Energy and Commerce, after last month’s hearings about Apple’s encryption before the House Judiciary Committee. Insofar as encryption is a problem for government at all, it’s a problem as an obstacle for law enforcement. Encryption isn’t a problem the way that, say, counterfeit identification is. Instead, it’s a tool people use to keep their private information secure. That definition of “people” includes everyone, so encryption will inevitably protect some criminal activity. But it also includes anyone working in energy or commerce who wants to communicate securely with coworkers, or who wants their machines to send the signals they’re supposed to send, without interference.
There’s a general sense of agreement between police and Congress that the laws currently governing encryption are outdated, and new legislation is needed. “We lack the technical ability at this point to properly execute the laws that Congress has passed,” Cohen noted, citing the rapid pace of technological change.
Privacy advocates, though, are deeply skeptical of the bills currently proposed to address the challenges of new technology. The last hour of the hearing featured panelists from industry and academia, who framed strong encryption not as an obstacle to police but as a protection against crime.
“Creating a ‘back door’ into encryption means creating opportunity for more people with nefarious intentions to harm us,” testified Amit Yoran, president of RSA, a computer and network security company.
“Back doors into encryption will not address advanced threat actors who pose a material threat to our security,” Yoran said.
Daniel Weitzner, director of MIT’s Internet Policy Research Initiative, testified that the problem with analogizing encryption to a police key for a bank’s safe deposit box is that “we’re all using the same safe, so if someone gets into the safe, everyone is at risk.”
Congress is eager to legislate on encryption, and if it needs justification, police will provide a grim catalog of terrorists and child molesters using encryption to warrant a great weakening of cryptographic protections. Yet the very fact that this hearing was held before the Committee on Energy and Commerce, and not the Judiciary Committee, shows that getting encryption wrong jeopardizes the whole enterprise of electronic communication.
There’s a lot of pressure to get it right. What’s not said nearly enough is the great danger that comes with getting it wrong.
Kelsey D. Atherton is a Washington, D.C.-based technology journalist. His work appears regularly in Popular Science, and has appeared in Popular Mechanics and War Is Boring. Follow him on Twitter @AthertonKD.