Data and Goliath


by Bruce Schneier


  But more steps are needed to put the NSA under credible tactical oversight. Its internal procedures are better suited to detecting activities such as inadvertent and incorrect surveillance targeting than they are to detecting people who deliberately circumvent surveillance controls, whether individually or on behalf of the organization as a whole. To rectify this, an external auditor is essential. Making government officials personally responsible for overreaching and illegal behavior is also important. Not a single one of those NSA LOVEINT snoops was fired, let alone prosecuted. And Snowden was rebuffed repeatedly when he tried to raise his concerns internally about the extent of the NSA’s surveillance of Americans.

  Other law enforcement agencies, like the FBI, have their own internal oversight mechanisms. Here, too, the more transparency, the better. We have always given the police extraordinary powers to investigate crime. We do this knowingly, and we are safer as a society because of it, because we regulate these actions and have some recourse to ensure that the police aren’t abusing them. We can argue about how well these are working in the US and other countries, but the general idea is a sound one.

  PROTECT WHISTLEBLOWERS

  Columbia law professor David Pozen contends that democracies need to be leaky—leaks and whistleblowing are themselves security mechanisms against an overreaching government. In his view, leaks serve as a counterpoint to the trend of overclassification and, ultimately, as a way for governments to win back the trust lost through excessive secrecy.

  Ethnographer danah boyd has called whistleblowing the civil disobedience of the information age; it enables individuals to fight back against abuse by the powerful. The NGO Human Rights Watch wrote that “those who disclose official wrongdoing . . . perform an important service in a democratic society. . . .”

  In this way of thinking, whistleblowers provide another oversight mechanism. You can think of them as a random surprise inspection. Just as we have laws to protect corporate whistleblowers, we need laws to protect government whistleblowers. Once those protections are in place, we can create a framework and rules for legal whistleblowing.

  This would not mean that anyone is free to leak government secrets by claiming that he’s a whistleblower. It just means that conscience-driven disclosure of official wrongdoing would be a valid defense that a leaker could use in court—juries would have to decide whether it was justified—and that reporters would legally be able to keep their sources secret. The clever thing about this is that it sidesteps the difficult problem of defining “whistleblower,” and allows the courts to decide on a case-by-case basis whether someone’s actions qualify as such or not. Someone like Snowden would be allowed to return to the US and make his case in court, which—as I explained in Chapter 7—currently he cannot.

  Additionally, we need laws that protect journalists who gain access to classified information. Public disclosure in itself is not espionage, and treating journalism as a crime is extraordinarily harmful to democracy.

  In Chapter 7, I mentioned the Obama administration’s overzealous prosecution of whistleblowers. That policy is both hypocritical and dangerous. We encourage individuals to blow the whistle on violations of law by private industry; we need to protect whistleblowing in government as well.

  TARGET MORE NARROWLY, AND ONLY WITH JUDICIAL APPROVAL

  Electronic surveillance is a valuable tool for both law enforcement and intelligence gathering, and one we should continue to use. The problem is electronic surveillance on the entire population, especially mass surveillance conducted outside of a narrow court order. As we saw in Chapter 11, it doesn’t make us any safer. In fact, it makes us less safe by diverting resources and attention from things that actually do make us safer. The solution is to limit data collection and return to targeted—and only targeted—surveillance.

  Cybersecurity and information law researcher Axel Arnbak said about government surveillance, “The front door governed by law; the backdoor governed by game theory.” What he meant is that the targeted surveillance process is subject to probable cause, warrants, limits in scope, and other laws designed to protect our security and privacy. Mass surveillance is subject to an organization’s cold analyses of what it can collect and how likely it is to get away with it. When we give the NSA the ability to conduct mass surveillance by evading the warrant process, we allow NSA staff to think more in terms of what’s possible than in terms of what’s legal. We let them grow arrogant and greedy, and we pay the price.

  Bulk surveillance turns the traditional investigative process on its head. Under a normal sequence of operations, law enforcement has reason to suspect an individual, and applies for a warrant to surveil that person. Bulk surveillance allows law enforcement to surveil everyone—to develop grounds for suspicion. This is expressly prohibited by the US Constitution, and with good reason. That’s why a 2014 UN report concluded that mass surveillance may violate international law.

  We need legislation that compels intelligence agencies and law enforcement to target their surveillance: both new legislation and new enforcement of existing legislation. That combination will give law enforcement only the information it needs, and prevent abuse.

  The US Supreme Court took a baby step in this direction in 2012, when it required police officers to obtain a warrant before attaching a GPS tracking device to a suspect’s car, and another in 2014, when it required police officers to obtain a warrant before searching the cell phones of people they stopped or arrested.

  In the US, we need to overturn the antiquated third-party doctrine and recognize that information can still be private even if we have entrusted it to an online service provider. The police should need a warrant to access my mail, whether it is on paper in my home, on a computer at my work, or on Google’s servers somewhere in the world.

  Many of these issues are international. Making this work is going to mean recognizing that governments are obligated to protect the rights and freedoms not just of their own citizens but of every citizen in the world. This is new; US legal protections against surveillance don’t apply to non-US citizens outside the US. International agreements would need to recognize that a country’s duties don’t entirely stop at its borders. There are fundamental moral arguments for why we should do this, but there are also pragmatic ones. Protecting foreigners’ privacy rights helps protect our own, and the economic harms discussed in Chapter 9 stem from trampling on them.

  FIX ALMOST ALL VULNERABILITIES

  As I discussed in Chapter 11, a debate is going on about whether the US government—specifically, the NSA and US Cyber Command—should stockpile Internet vulnerabilities or disclose and fix them. It’s a complicated problem, and one that starkly illustrates the difficulty of separating attack and defense in cyberspace.

  An arms race is raging in cyberspace right now. The Chinese, the Russians, and many other countries are also hoarding vulnerabilities. If we leave a vulnerability unpatched, we run the risk that another country will independently discover it and use it in a cyberweapon against us and our allies. But if we patch all the vulnerabilities we find, there goes our armory.

  Some people believe the NSA should disclose and fix every bug it finds. Others claim that this would amount to unilateral disarmament. President Obama’s NSA review group recommended something in the middle: that vulnerabilities should only be hoarded in rare instances and for short periods of time. I have made this point myself. This is what the NSA, and by extension US Cyber Command, claims it is doing: balancing several factors, such as whether anyone else is likely to discover the vulnerability—remember NOBUS from Chapter 11—and how strategic it is for the US. The evidence, though, indicates that it hoards far more than it discloses.

  This is backwards. We have to err on the side of disclosure. It will especially benefit countries that depend heavily on the Internet’s infrastructure, like the US. It will restore trust by demonstrating that we’re putting security ahead of surveillance. While stockpiled vulnerabilities need to be kept secret, the more we can open the process of deciding what kind of vulnerabilities to stockpile, the better. To do this properly, we require an independent government organization with appropriate technical expertise making the decisions.

  In today’s cyberwar arms race, the world’s militaries are investing more money in finding and purchasing vulnerabilities than the commercial world is investing in fixing them. Their stockpiles affect the security of us all. No matter what cybercriminals do, no matter what other countries do, we in the US need to err on the side of security by fixing almost all the vulnerabilities we find and making the process for disclosure more public. This will keep us safer, while engendering trust both in US policy and in the technical underpinnings of the Internet.

  DON’T SUBVERT PRODUCTS OR STANDARDS

  Trust is vitally important to society. It is personal, relative, situational, and fluid. It underpins everything we have accomplished as a species. We have to be able to trust one another, our institutions—both government and corporate—and the technological systems that make society function. As we build systems, we need to ensure they are trustworthy as well as effective.

  The exact nature of Internet trust is interesting. Those of us who are techies never had any illusions that the Internet was secure, or that governments, criminals, hackers, and others couldn’t break into systems and networks if they were sufficiently skilled and motivated. We never trusted that the programmers were perfect, that the code was bug-free, or even that our crypto math was unbreakable. We knew that Internet security was an arms race and that the attackers had most of the advantages.

  What we did trust was that the technologies would stand or fall on their own merits. Thanks to Snowden’s revelations, we now know that trust was misplaced. This is why the NSA’s and GCHQ’s surveillance programs have generated so much outcry around the world, and why the technical community is particularly outraged about the NSA’s subversion of Internet products, protocols, and standards. Those agencies’ actions have weakened the world’s trust in the technology behind the Internet.

  I discussed in Chapter 6 that the FBI is continually trying to get laws passed to mandate backdoors into security. I discussed in Chapter 11 how the NSA is surreptitiously inserting backdoors into Internet products and protocols so it can spy. We also have to assume that other countries have been doing the same thing to their own products (and to each other’s). A number of observers have concluded that companies with Israeli development teams, such as Verint, Narus, and Amdocs, are in bed with the Israeli government, and that Huawei equipment is backdoored by the Chinese government. Do we trust US products made in China? Do we trust Israeli nationals working at Microsoft? Or hardware and software from Russia? France? Germany? Anything from anywhere?

  This mistrust is poison. Security has to come first, eavesdropping second. Law enforcement can obtain a warrant and attempt to eavesdrop, but should not be able to force communications companies to guarantee that eavesdropping attempts will be successful. What we actually need to do is repeal CALEA and get back into the business of securing telephone networks and the Internet.

  The response to this from law enforcement people is to try to frighten us with visions of kidnappers, pedophiles, drug dealers, and terrorists going scot-free because investigators can’t decrypt their computers and communications. We saw this in late 2014 when Apple finally encrypted iPhone data; one after the other, law enforcement officials raised the specter of kidnappers and child predators. This is a common fearmongering assertion, but no one has pointed to any actual cases where this was an issue. Of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping—and the victim wasn’t a child. More importantly, there’s no evidence that encryption hampers criminal investigations in any serious way. In 2013, encryption foiled the police nine times, up from four in 2012—and the investigations proceeded in some other way.

  Law enforcement organizations have a huge array of investigative tools at their disposal. They can obtain warrants for data stored in the cloud and for an enormous array of metadata. They have the right and ability to infiltrate targeted suspects’ computers, which can give them the data they need without weakening security for everyone. Good security does not put us at risk.

  Our intelligence agencies should not deliberately insert vulnerabilities into anything other than specialized foreign military and government systems, either openly or surreptitiously; and they should work with the academic and business communities to ensure that any vulnerabilities inserted by more hostile parties are discovered, revealed, and disabled.

  We’ll never get every power in the world to agree not to subvert the parts of the Internet it controls, but we can stop subverting the parts we control. Most of the high-tech companies that make the Internet work are US-based, so our influence is disproportionate. And once we stop playing the subversion game, we can credibly devote our resources to detecting and preventing subversion by others—thereby increasing trust worldwide.

  SEPARATE ESPIONAGE FROM SURVEILLANCE

  In 2013, we learned that the NSA eavesdropped on German chancellor Angela Merkel’s cell phone. We learned that the NSA spied on embassies and missions all over the world: Brazil, Bulgaria, Colombia, the European Union, France, Georgia, Greece, India, Italy, Japan, Mexico, Slovakia, South Africa, South Korea, Taiwan, Venezuela, and Vietnam. We learned that the NSA spied on the UN. Naturally, these revelations strained international relations, but was anyone really surprised? Spying on foreign governments is what the NSA is supposed to do.

  Government-on-government espionage is as old as government itself. It’s an important military mission, in both peacetime and wartime, and it’s not going to go away. It’s targeted. It’s focused. It’s actually stabilizing, reducing the uncertainty countries have about each other’s intentions.

  Espionage is fundamentally different from the NSA’s mass-surveillance programs, both in the US and internationally. In Chapter 5, I noted that the NSA’s shift toward mass surveillance was a result of a shift in its mission to terrorism prevention. After 9/11, the primary mission of counterterrorist surveillance was given to the NSA because it had existing capabilities that could easily be repurposed, though the mission could have gone to the FBI instead.

  Because the NSA was given the counterterrorist surveillance mission, both the military norms and the legal framework from its espionage activities carried over. Our surveillance efforts against entire populations (including our own) were kept as secret as our espionage efforts against governments.

  We need to separate these two missions. Government espionage should stay within the purview of the State Department and the military. The president, as commander in chief, should determine which embassies and whose cell phones to eavesdrop on, and the NSA should carry out those orders. Mass surveillance, whether domestic or foreign, is almost never justified. Government surveillance of private citizens sometimes is, but only as a criminal investigative activity. These surveillance activities should move outside the NSA and the military. They should instead come under the auspices of the FBI and Justice Department, which will apply the police rules of probable cause, due process, and oversight to surveillance activities—in regular open courtrooms.

  This isn’t to say that we don’t need major police reform in the US. We’ve already discussed police secrecy in Chapter 7. The increasing militarization of the police and many departments’ tendency toward racially discriminatory practices are also serious problems. But that’s a topic for another book. Counterterrorism was, and still is, primarily the mission of the FBI.

  In January 2014, President Obama gave a speech about the NSA in which he made two very important points. He promised that the NSA would no longer spy on Angela Merkel’s cell phone. And while he didn’t extend that courtesy to the other 82 million citizens of Germany, he did say that he would extend some of the US’s constitutional protections against warrantless surveillance to the rest of the world. Putting government-on-population surveillance under a civilian head and police rules would go a long way towards achieving that ideal.

  LIMIT THE MILITARY’S ROLE IN CYBERSPACE

  One of the great political achievements of the late nineteenth century was the separation of civilian government from the military. Both history and current politics have demonstrated the enormous social damage that occurs when generals are in charge of a country. Separating the two, as many free and democratic countries worldwide do, has provided space for liberty and democracy to flourish.

  The problem is that cyberspace doesn’t easily lend itself to the traditional separation into civilian and military domains. When you’re being physically attacked, you can call on a variety of organizations to defend you: the police, the military, whoever does antiterrorism security in your country, your lawyers. The legal regime justifying that defense depends on two things: who’s attacking you, and why. Unfortunately, when you’re being attacked in cyberspace, the two things you don’t know are who’s attacking you, and why.

  Additionally, the Internet doesn’t have borders comparable to real-world ones—you can argue that it has no borders at all—so the distinction between foreign and domestic is much harder to apply. Attackers can range from bored teenagers to professional criminals to nation-states, perhaps using the same tactics and weaponry, so the distinction between types of attackers is hard to determine. Attacks occur in milliseconds and can have wide-ranging effects.

  The easy reaction is to lump all of these unknown attacks under the rubric of “cyberwar,” and it’s the hot new thing in military planning. I’ve already mentioned that about 30 countries have cyberwarfare divisions in their militaries. A “cybersiege” mentality is becoming the new norm.

 
