Who’s Listening In?
In early 2016, Apple faced a dilemma: give the FBI a way to break into a terrorist’s iPhone—or protect the security and privacy of every other iPhone by refusing the request. The couple who killed 14 people in San Bernardino in December 2015 had destroyed their personal smartphones, but one work phone was later found, and the FBI wanted to get inside it. The trouble was, if investigators failed to guess its four-digit passcode within ten attempts, the phone was programmed to erase its data.
The FBI wanted Apple to write code to create a security update that would undo the phone’s security protections, but the company refused, saying that would weaken security for all users. As the case wended its way through the courts, the U.S. House of Representatives Judiciary Committee held hearings on the issue. One of the key experts testifying was Susan Landau, now Bridge Professor in Cybersecurity at Tufts, where she splits her time between the Fletcher School and the School of Engineering. Her take on the issue was unequivocal: weakening the security on one iPhone weakens it on all iPhones, and that’s bad for U.S. security, she said.
Now Landau has written a new book, Listening In: Cybersecurity in an Insecure Age (Yale), elaborating on her concerns. In it, she details how we got from the beginnings of the Internet to today’s far-reaching Internet of things. Underlying it all is her argument “that securing our data is essential for securing us,” she said.
At the same time, improving security means that the data of bad actors—from would-be terrorists to garden-variety criminals—is also secure, and increasingly hidden from law enforcement. That’s what the Apple case was really about: does secure data for all mean that law enforcement and national security take a back seat to privacy?
The answer, Landau said, is no. “In the book, I talk about how the NSA says they’re doing better than ever. They know that most things they want to investigate will be dark—not accessible,” she said. “That’s just the way things are going, regardless of how the U.S. tries to constrain manufacturers.”
There is no way a manufacturer can create a backdoor for the NSA or law enforcement officials without creating a serious risk that it will be found and exploited by hackers, Landau said; that would simply give attackers—especially the intelligence agencies of other nations—carte blanche to access everyone’s computer systems, and in the long run do serious damage to national security.
“We keep hearing that the FBI is going dark, and that technology companies need to fix the problem,” she said. “But what the FBI should be doing is working to increase their investigative capabilities.” After all, communications technologies are widely developed outside the U.S., and the government has no control over those developments. Landau argued that the FBI should take a page from the NSA, which faced similar problems in the late 1990s but developed techniques that nonetheless enabled it to provide intelligence.
The Trouble with Insecure Data
Landau started her career as a theoretical computer scientist, and some of her work had implications for cryptography, the practice of keeping communications secret. In the early 1990s, she was first author on an Association for Computing Machinery report on cryptography policy, and later co-authored the book Privacy on the Line: The Politics of Wiretapping and Encryption with Whitfield Diffie, co-inventor of public-key encryption, which most current online transaction systems use to encrypt information such as credit card numbers.
Landau began her academic career as a computer science faculty member at Wesleyan University, and then became a research faculty member at the University of Massachusetts Amherst. She then joined Sun Microsystems, working on a variety of technology and policy projects, including export controls on cryptography and digital rights management. After Sun was acquired by Oracle, Landau spent two years at Harvard, the first as a Radcliffe Scholar; she also held a Guggenheim Fellowship, and later spent short stints at Google, as a senior staff privacy analyst, and at Worcester Polytechnic Institute, as a professor of cybersecurity policy.
Her specialty is surveillance. In her new book, she writes more broadly about the trouble with insecure data. All too often, she says, individuals and organizations don’t think about securing their data. “I think everybody has a sense that it won’t happen to me,” Landau said. “Nobody thinks about what their particular threats are, nobody thinks about what it is that they need to protect.”
She cited the breach at Sony Pictures in 2014, when hackers broke into the company’s computer systems and stole huge amounts of information, including embarrassing corporate memos and copies of unreleased movies. “Sony was making films, and thought of itself as a film company,” Landau said. “The mental model of the executives was probably a canister of film, not bits—computer data. Sony’s intellectual property was bits, not films.”
Banks, on the other hand, much more quickly realized that money “is all about the bits,” she said. “When you do a money transfer, nobody carries bills from bank A to bank B; they just do a bit transaction. They have known for a long time they need to protect the bits.”
Obviously not everyone is as careful as financial institutions in guarding data. For example, the Democratic National Committee emails that ended up on WikiLeaks in 2016, apparently courtesy of Russian hackers, were stolen through a simple phishing scheme. In such ploys, a malicious link is sent in a phony email, and when the unwary—and unwise—click on it, hackers gain a way into their computer systems.
Phishing schemes are one way to take advantage of human weaknesses to penetrate computer systems. Sometimes attackers are simply collecting information—that’s a cyberexploit—and sometimes they’re mounting a cyberattack. The goal in that case is not so much to gather data such as trade secrets, but to gain control of physical systems—as in the Russian effort to bring down the Ukrainian power grid in late 2015, the opening salvo in a campaign to undermine that country’s government.
And what can we do personally to keep our data secure? Landau gives a few basic rules. “The most important thing is to do automatic updates for everything you use,” she said. “Don’t say I’m too busy now—just do it.” And, she added, think about what is important for you to secure—work information and personal information. Use two-factor authentication when it’s available, and, of course, think before you click on a link or open an attachment—that’s the way most cyberexploits work. When she travels, Landau said, she is “very careful about where I browse on public Wi-Fi.”
The same is true for governments and businesses—only more so. Security measures should be built into every system, Landau says, and access to critical pieces should be limited to a very small number of trusted—and careful—people.
In the end, Landau circles back to the big picture. With ever more information and systems online—and thus accessible to outsiders intent on harming people, organizations, and entire countries—“cybersecurity protections, including encrypted communications and secured communications, are far more critical now than they were even five years ago,” she said.
Taylor McNeil can be reached at firstname.lastname@example.org.