As politicians and counter-terrorism officials search for lessons from the recent attacks in Paris and San Bernardino, California, senior officials have called for limits on technology that sends encrypted messages.
It’s a debate that has recurred for more than a decade. In the 1990s, the Clinton administration proposed requiring technology companies to store copies of their encryption keys with the government. That would have given the government a “backdoor” allowing law enforcement and intelligence agencies easy access to encrypted communications. The idea was dropped after sharp criticism from technologists and civil liberties advocates.
More recently, intelligence officials in Europe and the United States have asserted that encryption hampers their ability to detect plots and trace perpetrators. But many have questioned whether it would be practical or wise to allow governments widespread power to read encrypted messages.
To help readers appreciate the arguments on both sides, we’ve pulled together some FAQs on a subject that is sure to be hotly debated in the years to come.
Are terrorists really using encrypted messages to plot attacks?
There’s mounting evidence that terrorist groups are using encryption, but so is nearly everyone living in modern society. Encryption protects your bank information, keeps your password from being stolen when you log into a website, and allows e-commerce transactions to take place securely.
In addition, apps that send encrypted text messages over Wi-Fi, such as WhatsApp, Signal, and Telegram, have become increasingly commonplace in places where text messaging is expensive.
One piece of evidence that terror networks are using encrypted messages surfaced in a recent issue of ISIS’s Dabiq magazine, where the group listed a contact number on Telegram. Soon after, Telegram shut down many ISIS-connected groups using its service. And earlier this year, a West Point researcher found copies of an encryption manual designed for journalists and activists on an Internet forum linked to ISIS.
Intelligence officials have said that the planner of the Paris terrorist attacks used encryption technology, but police also found that one of the Paris terrorists was using an unencrypted cell phone.
Are Google, Apple, Facebook, and Twitter thwarting law enforcement through their use of encryption?
In the past few years, Silicon Valley tech companies have added layers of encryption to their cell phones and websites in an effort to assure users that their data is safe from both hackers and spies. That encryption has also made it harder for law enforcement officials to read what is transmitted by those devices.
Last year, Apple made encryption the default setting for iPhones, meaning that all data stored on the device was scrambled. In an open letter announcing the change, Apple CEO Tim Cook wrote, “At Apple, we believe a great customer experience shouldn’t come at the expense of your privacy.”
In congressional testimony this month, Federal Bureau of Investigation Director James Comey said that encryption is now part of “terrorist tradecraft.” He cited an instance in Garland, Texas, in which two terror suspects were arrested before they could execute an attack. “That morning, before one of those terrorists left to try and commit mass murder, he exchanged 109 messages with an overseas terrorist. We have no idea what he said because those messages were encrypted,” Comey said.
But can’t the National Security Agency just crack any code it wants?
It’s not clear how much encryption the NSA can break. In 2013, ProPublica and the New York Times reported on a top secret NSA program called Bullrun that was described in internal documents as being able to decrypt “vast amounts of encrypted Internet data.” The program started in 2011 and was the result of “an aggressive, multipronged effort to break widely used Internet encryption technologies.”
Details of the project are not known. But the documents showed that, in 2013, the agency planned to spend $250 million to, in part, “insert vulnerabilities into commercial encryption systems.”
I heard that there is a “golden key” that unlocks all encryption. Is there such a thing?
Not yet, and it’s not clear one will ever exist. The U.S. government has been trying to figure out how to access encrypted data for decades, but wiretapping a phone call is far easier than building a backdoor into encryption technology.
Last year, the Washington Post editorial board called for Apple and Google “with all their wizardry,” to “invent a kind of secure golden key” that would allow law enforcement officials to read any encrypted message sent by a suspect.
It would be a tremendous challenge to convince the world’s encryption makers, many of whom live outside the U.S., to give American authorities access to such a tool. And it would be an even bigger challenge to keep the master key secret—given that it would immediately become the No. 1 target of every hacker and nation in the world.
To address that issue, a White House working group proposed a split key—where one half of the master key would be kept by the government and the other would be held by the encryption company. But the report noted that this approach would be “complex to implement and maintain.”
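The split-key proposal echoes a classic technique called secret splitting: XOR the master key with a random value, and hand each party one piece, so that neither piece alone reveals anything about the key. The sketch below is a minimal illustration of that general idea, not the working group’s actual design; the function and variable names are assumptions for the example.

```python
import secrets

def split_key(master_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; both are needed to reconstruct it."""
    government_share = secrets.token_bytes(len(master_key))  # uniformly random share
    company_share = bytes(a ^ b for a, b in zip(master_key, government_share))
    return government_share, company_share

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares together to recover the master key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

master = secrets.token_bytes(32)
gov, company = split_key(master)
assert recombine(gov, company) == master    # both halves together recover the key
assert gov != master and company != master  # neither half alone is the key
```

Because each share is indistinguishable from random noise on its own, compromising one custodian would not expose the key, which is part of the appeal of the scheme, and part of why it is operationally complex: every decryption requires both parties to cooperate.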
Are there less complicated ways to give law enforcement and intelligence officials the access they say they need?
The White House working group offered three additional ideas for “backdoors” into encryption. All required manufacturing or software changes by U.S. providers and all involved significant political or technical problems.
One idea raised by the panel called for manufacturers to create a special port on all devices that could only be accessed by law enforcement. Requiring a port would represent a “significant cost to U.S. providers,” but could be avoided by installing software that creates “a secondary layer of encryption,” the panel said.
Another option would be for telecom providers to slip software that defeats encryption into routine upgrades sent to customers. Such an approach would “call into question the trustworthiness” of American companies’ software updates, and could be easily repelled by technically adept users.
Finally, the working group suggested that telecom providers might be ordered to hack into their customers’ devices so that their back-up routines would send unencrypted copies of all data to the government.
Will any of these backdoor schemes work?
They all have flaws. A big one: Users could easily bypass all of the backdoor options by creating their own layers of encryption.
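“Adding your own layer of encryption” means scrambling a message yourself before any app ever sees it, so that even if the app’s encryption were stripped by a backdoor, only your ciphertext would be exposed. A toy sketch of the idea, using a hash-based stream cipher purely for illustration (this construction is not a secure design, and the names here are assumptions for the example):

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a keystream from a key (toy construction, not for real use)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(key: bytes, data: bytes) -> bytes:
    """Apply (or reverse) the user's own encryption layer."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

user_key = secrets.token_bytes(32)   # shared only between the two users
message = b"meet at noon"

inner = xor_layer(user_key, message)  # applied before the messaging app sees it
assert inner != message                        # a backdoored app exposes only this
assert xor_layer(user_key, inner) == message   # the recipient reverses the layer
```

The point is architectural: a mandated backdoor in the transport or the app does nothing against a layer applied by the users themselves, which is why the working group flagged this as a weakness of every option it considered.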
It’s not clear that compelling American companies to allow backdoors would accomplish much. A significant amount of the encryption software used around the world comes from widely available “open source” products. “There may be no central authority” for the government to negotiate with, the White House said in its report.
And even when there is a company to negotiate with, the government has not had luck getting access to encryption keys. Two years ago, for example, the FBI tried and failed to get access to encryption keys from Edward Snowden’s email provider, Lavabit.
Ladar Levison, Lavabit’s owner, “provided the FBI with an 11-page printout containing largely illegible characters in 4-point type” of the keys and then shut down the entire email service.
Most importantly, the U.S. isn’t the only country in the world with legal power over technology companies. For example, many cell phones used in the U.S. are manufactured in China, which could also demand backdoor access for its intelligence and law enforcement authorities. The White House report warns that “any U.S. proposed solution will be adopted by other countries.”
So what is the government proposing?
The short answer is that the government has quietly dropped its requests for a backdoor.
Last year, in a speech at the Brookings Institution, FBI Director Comey called for a “regulatory or legislative fix” to the problem of law enforcement access to encrypted communications, which was widely interpreted as calling for legislation to require encryption backdoors.
But after his proposal prompted a backlash from technologists, Comey has softened his tone. In July, he told a Senate panel that “there has not yet been a decision whether to seek legislation” about requiring companies to provide access to encrypted data.
And in testimony earlier this month, he told a Senate panel that “the administration has decided not to seek a legislative remedy at this time.” But California Senator Dianne Feinstein suggested that she would seek legislation. “If there is conspiracy going on over the Internet, that encryption ought to be able to be pierced,” she said at the hearing.
The next day, privacy advocates visited the White House to discuss a petition they submitted in support of strong encryption. Kevin Bankston, director of the Open Technology Institute, who attended the meeting, said that administration officials said they “would like to move beyond this debate” and start discussing “how to adapt to strong encryption rather than fighting it.”