


  CONCLUSION

  Shooting at Ahmadinejad

  In late September 2006, President George W. Bush attended the 61st United Nations General Assembly in New York. Each morning, the president is given a highly classified newspaper of sorts that summarizes the latest intelligence and events from around the world. The document is called the President’s Daily Brief, and the most chilling item that morning was saved for last.

  The item was three sentences long and marked Top Secret, and it scared the hell out of the dozen or so White House officials cleared to read it. According to one official, it began, “A U.S. Secret Service agent, in an apparent accident, discharged his shotgun as President Ahmadinejad was loading his motorcade at the Intercontinental Hotel yesterday.”

  At the time, the Bush administration was still weighing options for how best to deal with the Iranian nuclear weapons program. And here, a U.S. Secret Service agent had just given the president of Iran a massive and potentially devastating public relations coup. Mahmoud Ahmadinejad was certain to reveal the accident in some grand form on the world stage—before the whole of the United Nations. He might allege that the United States had tried to assassinate him, or scare him, or somehow send a grave message to the Middle East, and thus upend the entire conference.

  “When I read that, I remember closing my eyes and saying, ‘Three, two, one . . .’” recalls the official. But a quick scan of the morning newspapers revealed no word of the incident. The Secret Service, embarrassed and chastened, informed the White House that the agent had been pulled off the detail and that a full, secret investigation was under way.

  It remains unclear to everyone why the incident never leaked. The agent had hopped into the armored follow-up Suburban and was adjusting the side-mounted shotgun when it discharged. The armor was strong enough to stop the slug, but every agent on the detail—and certainly the half dozen or so Iranian security agents escorting Ahmadinejad to his car—knew what that sound was.

  “Everyone just stopped. The Iranians looked at us and we looked at the Iranians. The agent began to apologize. Ahmadinejad just turned his head and got into his car.” And that was it.

  The Iranians told no one. Not that day. Not to this day. And their silence—their helping the Bush administration to keep an embarrassing secret—led several White House aides, previously inclined to view Iran’s leadership as being driven by the emotions of the moment, to begin to see Ahmadinejad and his circle of advisers in a new light. Here was evidence that maybe Iran was acting strategically, and therefore cautiously.

  One of the more nuanced arguments against excessive secrecy (but not against secrecy itself) comes from Jennifer Sims, the director of intelligence studies at Georgetown University and a member of the Public Interest Declassification Board. She believes that the system protects irrelevant information almost by design and thus creates tensions that inevitably lead to leaks and conflict. Properly developed, secrets are valuable to policymakers choosing from among many difficult options. Those secrets make it easier to govern. But frivolous secrets generated by overclassification create conditions for massive counterintelligence problems. The more leaks there are, the less liaison cooperation the United States will get, and the more likely the enemy will perceive the U.S. national security system as vulnerable. Her solution is to radically reduce the number of things that are kept secret and to radically increase the protection accorded to those secrets. Here, Sims is describing one of the mechanisms that we’ve discussed throughout the book: that the bigger the system gets, the more difficult it becomes to manage, and the harder it is to properly assess and analyze secret information. Sims’s board oversees the government’s declassification efforts. She acknowledges that the rules now in place are at once necessary and impossibly burdensome.

  “It was Pat Moynihan’s fundamental insight that secrecy is a regulatory system just like any other,” says John Podesta. “What really struck him were the cases where the more secrecy there was, the more likely something was to be unsuccessful.”

  Things marked Secret draw attention to themselves, and their importance is automatically elevated. The CIA was obsessed with the Soviet Union’s nuclear arsenal and generated an enormous number of secrets about it. Meanwhile, it missed open-source information about the USSR’s demography that told a more reliable story about the challenges facing the country. The United States ignored suggestions in the Indian press that a nuclear weapon was about to be tested in 1998, because its secret reconnaissance indicated that none was in the offing.

  And sometimes the government uses secrecy to avoid doing its job. Podesta recalls a debate in the Clinton administration over chemical plants in the United States. A lot of plants were underregulated, but instead of rewriting rules to ensure that, say, a vat of chlorine wasn’t left outside overnight, some members of the administration wanted to classify the locations of these plants and keep details about them secret.

  Overclassification is the detritus of a self-perpetuating secrecy apparatus, the result of rapidly advancing technology, and the natural evolution of an entrenched national security state. But to focus on overclassification as the root of the problem is myopic. Absent an official and sustained push for reform from the top, overclassification will remain a problem in perpetuity.

  And the state is showing its wear. General Bryan Douglas Brown, former commander of the U.S. Joint Special Operations Command and the U.S. Special Operations Command, says that secrecy “is just very expensive,” which he means in terms of dollars spent maintaining the apparatus and opportunities lost by diverting intellectual resources from other, more important areas. Bulk declassification of very old historical records is pretty easy. But reviewing every document that’s been classified in the past twenty-five years and has been marked with a “Do Not Declassify” caveat is impractical.

  Selective declassification, on the other hand, is a workable start. During the Clinton administration, Al Gore was particularly keen to declassify such scientific data as telemetry from the nation’s undersea surveillance system, which could help monitor climate change. The National Reconnaissance Office, as part of its fiftieth anniversary, gave historians a bonanza of data about some of its earlier reconnaissance systems. By comparison, the Obama administration has been cagey about major declassification efforts directed at documents from the 1980s and 1990s, as many of them might relate to counterterrorist activities still under way.

  The intelligence community produces an enormous amount of collateral intelligence, because technology allows it to do so. The State Department, for example, will never stop communicating via Secret cables. There are simply too many incentives to classify something at a higher level than necessary, and no incentive at all to underclassify.

  The process to protest a classification decision within the government is rarely used. If minor, inconsequential information is being classified, the public isn’t necessarily being deprived of critical information. And the enormous expense associated with any serious go at declassification is a deterrent, whatever the long-term financial gain.

  Formal self-correcting mechanisms such as the Freedom of Information Act, the Public Interest Declassification Board, and the Interagency Security Classification Appeals Panel, combined with the informal mechanisms—dogged researchers like Thomas S. Blanton, Jeffrey Richelson, and Steven Aftergood, and curious historians—are probably sufficient to ensure that historically relevant classified material is released, perhaps not in as timely a manner as it could be, but eventually. The result is an enlightening portrait of the deep state for all the public to see.

  Ironically, another informal hedge against overclassification is the growing number of people with access to classified information. The more people who have access to a secret, the greater the chances that it will leak.1 This especially applies to immoral and illegal activities of the government. Whistleblowers will provide sunlight.

  One reason for the “stamp and leak” culture is the institutional failure of the intelligence community to find an effective way of allowing people uncomfortable with certain secrets to protest them without leaking to the public. Channels that allow for proper and credible adjudication are essential. David Grannis, the staff director of the Senate Select Committee on Intelligence, says he is not aware of a single instance where a whistleblower from within the community successfully navigated the complex rules set up by agencies to handle complaints. And simply put, the people who work with secrets have little faith in the inspectors general, no matter how independent they are, and have every reason to believe, because they can read newspapers, that their whistleblowing will end their careers if done internally.

  One reason the government has tended so poorly to the culture of secrecy is that the executive branch refuses to concede that any other branch of government (and certainly not the press) has the right or the duty to question classification decisions, to help determine what qualifies as national security information and how that information should be protected. Sometimes Congress can press the issue. The Senate, for example, forced the executive branch’s hand in declassifying the existence of the National Reconnaissance Office in 1992 by making clear that it was otherwise going to include line items for the agency in its unclassified authorization bill. (At any rate, the press had long since revealed it.)2 But more often than not, the legislative branch abdicates its responsibility.

  Congressional oversight of national security would be more effective if the same legal opinions that underlie executive decisions were given to the congressional committees, but the executive branch, citing its constitutional prerogative, will never consider that. The executive branch is self-defeating in another way: the public now knows more than ever about how the government works. As a result, it grows skeptical when told that it can’t access information, especially as society has begun reorganizing itself around openness and access. This is especially so when the government goes to excessive lengths to protect information that has a direct bearing on the national security debate. Very few people inside government consciously use secrecy as a means to sow fear and anxiety among Americans, and yet it fosters that anxiety and serves to recursively justify a permanent state of war and a massive military budget. (Government contractors are direct beneficiaries, as the independent journalist Tim Shorrock has documented.)3

  Barton Gellman conceives of the secrecy debate as a struggle between the government and the press over information, and over the way that information ought to be presented and interpreted. It is a competition that is structured by, and limited by, a mutual understanding of each side’s respective role in protecting American interests. The notion that journalists even have a role to play in the broader protection of American interests strikes some journalists as folly, because it implies jingoism at best and capture at worst. But Gellman, whose reputation for breaking stories that hold the government and powerful interests to account disproves any such allegation against him, never argues for surrender. He wants the executive branch (even informally if it must) to recognize that journalists have a significant degree of control over secrecy. He believes that trust between antagonists can be built while appropriate oppositional roles are maintained: “Hard questions about government secrecy involve a clash of core values. Call them self-preservation and self-government. Any answer that fails to take both of those values seriously, and address them both explicitly, has not even engaged the central problem.”4

  Suppose, he says, that we know that the “president lied about Iraq’s nonconventional weapons and thereby took the nation to war in Iraq by a kind of fraud.” This is the kind of thing that the public should know before they vote. Further suppose that the information proving this was released. “Opening files would resolve the mystery but undoubtedly carry high costs. It might put the safety of human sources at risk, reveal enough about intelligence methods to enable their defeat, compromise ongoing operations, or warn enemies of operations to come. Withholding the evidence, on the other hand, renders citizens unable to judge what may be the most consequential act of this presidency.”

  Who gets to make this judgment? What’s the right decision? The press doesn’t know enough to do so. The judicial branch will defer to the executive branch, in whose interest it is not to disclose the information. Congress probably won’t have the information to begin with.

  The answer, as Gellman sees it, is the status quo: “In practice, the flow of information is regulated by a process of struggle as the government tries to keep its secrets and people like me try to find them out. Intermediaries, with a variety of motives, perform the arbitrage. No one effectively exerts coercive authority at the boundary. And that’s a good thing.”5

  The formal checks, such as oversight or an inspector general’s process for whistleblowers, are insufficient. The informal checks, like the power of the press and the ubiquity of access to information, are potent and necessary, as are the informal negotiations that occur between the government and the press. Malfeasance, wrongdoing, cover-ups—there is simply no normative principle the government can use to defend secrecy in these cases. Someone must call the system to account.

  To Gellman’s prescription we would add a few of our own.

  The government uses a vocabulary that Americans do not understand. For example, what does the U.S. government mean when it tells Americans that it has received a “specific,” “credible but unconfirmed,” or “uncorroborated” terrorist threat, as it did on September 10, 2011? Candidate Barack Obama and President-Elect Barack Obama decided as a matter of policy that the worst way to respond to a distinctive threat was to treat the country to a command performance by scary adults in suits, grimly conveying vague but ominous information to citizens. But that’s precisely what President Obama has continued. So what happened? Even if Obama came to appreciate that the old style didn’t actually incite panic, it did lead to an inevitable question for which there is no answer: what do we do with this “information” you have just given us?

  When the intelligence community thinks something is “specific,” what does that mean? At what point does John Brennan, counterterrorism adviser to the White House, consider a threat sufficiently “credible” that he checks his insurance policy? It’s astonishing that in the millions of person-hours devoted to pondering strategic communication, no one has thought to tell Americans what the government means when it uses specific phrases. Doing so would not help terrorists. Rather, it would remove some of the Orwellian stigma associated with vague government warnings. It would foster a common sensibility about terrorism, and a more realistic view of what the law enforcement and intelligence communities can and cannot do to defuse potential threats.

  Here is what the words actually mean.

  A specific threat is one that includes details that are distinctive enough to allow the government to narrow the target set (what’s supposed to blow up) and/or the identities of the terrorists (not just “two men,” but “two guys who trained at terrorist camp X and who might have entered the United States on or around this specific date”). For the most part, timing doesn’t factor into these considerations, because anniversaries of some particular event come up almost every day.

 
