Dark Mirror


by Barton Gellman


  Some people will see a kind of sense in that. A secret has been spilled and damage potentially done. From the NSA’s point of view, a loss is a loss. It may not matter whether a foreign adversary learns the secret from a spy or a published news report. The cryptologic insecurity is the same. Before the disclosure, the NSA had a valuable source or method. Afterward, it does not.

  In other ways, espionage is a terrible analogy for a news media leak. Spying and talking to a journalist are not the same behavior at all. Spies, as we understand the term in everyday life, steal American secrets on behalf of some other country. They hope our government never learns of the breach. They have a single, stealthy customer. They intend, as the Espionage Act defines the crime, “that the information is to be used to the injury of the United States or to the advantage of [a] foreign nation.” News sources, on the other hand, give information to reporters for the purpose of exposure to the public at large. They want everyone to know. They may have self-interested motives, but they commonly believe, whether rightly or wrongly, that their fellow citizens will benefit from the leak.

  I am not making a legal argument. News sources have been tried and convicted of espionage. The charge is nonetheless a fiction enacted as law. The underlying conduct, which may be whistleblowing of the purest kind, is disfigured by forcing the whistleblower into the mold of a spy. If news is conceived as espionage, then it is logical for George Ellard to call me an agent of the adversary and James Clapper to call me an accomplice. It is no stretch at all, from that point, to deployment of the government’s most intrusive counterintelligence powers against a journalist.

  I had all this on my mind when I filed Freedom of Information Act and Privacy Act requests for copies of my records on file with the Department of Homeland Security, Justice Department, FBI, NSA, CIA, and other government agencies. Although I joked with Snowden about it, I made the requests in earnest. After more than two years of foot-dragging by most of the agencies, I filed a lawsuit to enforce the requests in 2016.

  For all the limits of FOIA, which has seven kinds of exemptions and infinite opportunities for government delay, I have learned some interesting things from Gellman v. DHS et al. The CIA, for example, offered what is known as a Glomar response when I asked for my files. “The CIA can neither confirm nor deny the existence or nonexistence of records responsive to your request,” the agency told me. “The fact of the existence or nonexistence of requested records is currently and properly classified.” The DNI’s office said it had withheld 435 documents about me in full, and it set out its reasoning in a classified, ex parte explanation that my lawyers at the Reporters Committee for Freedom of the Press were not allowed to read. Homeland Security personnel, I learned from another document, had produced a seventy-six-page report of every international flight I took since 1983. Customs inspectors had secretly searched my checked baggage when I returned from more than one overseas reporting trip. The reasons and results of those searches were redacted because disclosure, Customs and Border Protection asserted, “would disclose techniques and/or procedures for law enforcement investigations or prosecutions.” When the Post’s Snowden coverage won honors, intelligence officials groused in an interagency email chain. “Can you believe the Pulitzer Prize announcement?” one official wrote. “So annoying.” Hundreds of emails recorded behind-the-scenes reactions and internal debates about how to respond to my questions or stories. The government asked the court to withhold all of those on grounds of deliberative privilege.

  I learned something else by way of FOIA. My practice, when seeking comment for a story, had been to frame my questions precisely and send them to spokesmen by email. I aimed to avoid misunderstanding and make the questions harder to dodge. It turned out, according to internal government correspondence I received in the course of my FOIA lawsuit, that the spokesmen were forwarding my emails to the FBI. The public affairs shop subordinated its work entirely to law enforcement. The spokesmen did not even have to be asked. They volunteered. “Below please find correspondence between reporter Bart Gellman and NSA & ODNI public affairs,” a senior intelligence official, name redacted in the FOIA release, wrote on December 21, 2013, to a manager in the Office of the National Counterintelligence Executive, or NCIX. “In the email, Gellman references conversations he has with Edward Snowden. . . . Are these emails useful for NCIX?”

  The manager replied, “Yes, these types of correspondence are useful. We will ensure they get to the FBI investigations team.” The deputy assistant director for analysis and collection management passed my emails to the group chief for analysis and production, who wrote in turn to someone at FBI headquarters. “I have been asked to share the below with the appropriate FBI POCs and so am forwarding so you can pass on as appropriate,” the group chief wrote. From that point forward, my emails were redirected routinely to the FBI.

  None of the FOIA disclosures thus far have acknowledged the existence of FIRSTFRUITS. Two hints, nonetheless, have emerged from descriptions of documents that the government is asking the court to keep out of my hands. I knew, as I mentioned earlier, that the Foreign Denial and Deception Committee hired a contractor to manage the database that tracks news stories, reporters, and referrals for criminal investigation. In the FOIA case the government cited a contractor’s “trade secrets” to justify withholding what it blandly described as “copyrighted bulletins summarizing intelligence news reports which are prepared, pursuant to contract, by a non-governmental outside vendor.”

  The most unsettling revelations in the case came from filings that hinted at the nature of the records that the FBI wants to withhold. According to an affidavit from David M. Hardy, a section chief in the FBI’s Information Management Division, my name appears in files relating to “investigations of alleged federal criminal violations and counterterrorism, counterintelligence investigations of third party subjects.” Not only the Snowden case, that is. Investigations and third parties, plural. Some of those files, Hardy said, may appear in an ELSUR database, short for electronic surveillance, that includes “all persons whose voices have been monitored.”

  Even the names of the FBI files, Hardy told the court, would give too much away. The file names specify “non-public investigative techniques” and “non-public details about techniques and procedures that are otherwise known to the public.” The FBI is especially concerned about protecting one unspecified intelligence-gathering method. “Its use in the specific context of this investigative case is not a publicly known fact,” Hardy wrote. The bureau wants to protect “the nature of the information gleaned by its use.”

  Those are not comforting words.

  EIGHT

  EXPLOITATION

  It was warm enough, barely, for an outdoor table the day I met the Google security engineer. He was not supposed to talk to a journalist, certainly not about company business, but he agreed to meet on a Sunday away from his office. I had promised to tell him something consequential to his work. Understanding it, I said, was equally consequential to mine. That sufficed as a lure. He knew what kind of work I was doing.

  This begins to sound like a quid pro quo, but that was not my proposal. No trade was in the offing. On rare occasions I chose to show a document to one of my sources, not as barter but as an aid to reporting. Sometimes I could not interpret the evidence without an expert guide. Something was happening inside Google. Ashkan Soltani and I had worked out enough on our own to know it was going to be news. Now I needed insiders—from Google or the NSA, or both. They had to see what we saw in order to help us.

  The engineer and I sat on a wide esplanade that edged the Hudson River on a late autumn afternoon, the sun strong and slanting low in the sky. Skateboarders honed their aerobatic tricks, grinding across the top of a low concrete wall. My source drained a beer and made small talk as I booted a laptop. We agreed on some ground rules. I turned to page fourteen of a twenty-three-page NSA file and pivoted the display in his direction. The engineer shaded his eyes.

  I knew the exact moment he spotted the smiley face near the bottom of the classified diagram. His eyebrows climbed. He leaned closer to the screen. He started to say something, took a breath, and reread the page. The title was “Current Efforts—Google.” Current efforts, that is, by the NSA. Against Google.

  “Mo-ther-fuckers,” he said at last, pronouncing the word with conviction. He reached for his beer, found it empty, and waved for a waiter. “I hope you publish this. I’ve spent years securing this network. Fuck those guys.”

  We talked for two more hours and agreed to meet again.

  * * *


  The hand-drawn cartoon I showed the engineer lent a whimsical quality to a classified presentation that was otherwise notably dense. Two fluffy clouds floated side by side against a lemon sky. The one on the left bore the label “Public Internet.” The one on the right said “Google Cloud.” An arrow pointed to the spot where the two clouds met, the digital borderland between Google’s inside networks and the outside world. The caption said “SSL Added and removed here!” That was where the artist placed the smile emoji.

  Added and removed? It hardly seemed possible. SSL, which stands for Secure Sockets Layer, is the core technology of encryption on the internet. It is the padlock in your browser’s address bar, the armor that protects information as it flows across the web. To remove SSL would be to pick the lock, pierce the armor, peel encryption away. Could the NSA somehow do that?

  The drawing transfixed me. Ashkan and I spent most of a month of research and reporting to decode it. One thing was clear right away. The emoji conveyed an unmistakable boast, a keyboard warrior’s dance of victory. The NSA had triumphed over Google in some secret fashion. It had found a path, this being the point of signals intelligence, into something valuable that Google thought to be hidden from prying eyes. LOL Google, the diagram said, if you think your cloud is secure. The message resembled an old hacker meme, mocking a vanquished opponent: All your base are belong to us. Game over. Thank you for playing. Small wonder that my source had erupted in anger.

  So simple a diagram and yet so opaque. Nothing in the adjacent pages explained how the NSA pulled off its heist. Technical details did not always matter for journalistic purposes, but they damn well did here. Every expectation of privacy on the internet, every secure transaction, depended on SSL. If the encryption was broken in some fundamental way, we were living in a different world than we had been led to believe. Maybe the diagram meant something else, but that question was front and center.

  “Google definitely has a dog in this fight,” the engineer told me. Not only did the company rely on SSL, but its employees were also core contributors to the shared software code that brought encryption to websites around the world.

  By some means or another, the NSA was piercing the boundary between the public internet and Google’s private infrastructure. On the diagram, that boundary was virtual, a digital abstraction represented by a hand-drawn box that nestled between two clouds. In the tangible world, the box had to be something concrete—physical hardware that could be seen and touched and pinpointed on a map. But where? Information crossed the globe by land, sea, and air, and Google hosted operations on four continents.

  Unless the NSA had gone entirely rogue, which I did not believe, the operation had to be taking place overseas. It would be lawless, full stop, for the agency to engage in clandestine collection against a U.S. company on American soil. Even with a federal warrant, electronic surveillance inside the United States was generally the province of the FBI. Outside the United States, the NSA’s operations were considerably less constrained. The Foreign Intelligence Surveillance Act did not apply to collection abroad unless it deliberately targeted an American using equipment based inside the United States. There were other rules and regulations based on Executive Order 12333, a directive signed by President Ronald Reagan. Insiders pronounced it Twelve Triple Three. The standards set in that executive order were more permissive, implemented in classified regulations and seldom subject to oversight outside the executive branch.

  “Look, NSA has platoons of lawyers, and their entire job is figuring out how to stay within the law and maximize collection by exploiting every loophole,” John Schindler, a former NSA analyst who taught at the Naval War College, told me. “It’s fair to say the rules are less restrictive under Executive Order 12333 than they are under FISA.”

  Congress and the courts stayed largely out of the picture. The FISA Court had no jurisdiction, and the intelligence committees, a top Senate committee staff member told me, “are far less aware of operations conducted under Twelve Triple Three.” The staff member added, “I believe the NSA would answer questions if we asked them, and if we knew to ask them, but it would not routinely report these things, and, in general, they would not fall within the focus of the committee.”

  The NSA did not have to invade Google headquarters in Mountain View, California, to tap the internet giant’s data stream. Google had built enormous facilities all over the world, large enough to require their own electrical power substations and industrial cooling plants. Thousands of miles of privately owned or exclusively leased fiber optic cable linked these fortresslike data centers, sixteen of them in all. “Cloud” was an insubstantial metaphor to describe this globe-spanning machinery. There must be many points of entry, potentially, for an enterprising signals intelligence agency.

  The engineer and I considered several hypotheses at length. Understanding how the NSA broke in required me to ask delicate questions. How exactly did Google protect its cloud from an intrusion like this?

  “I’m not going to tell you,” he said.

  “Not well enough, it looks like,” I said.

  “I’ll tell you one thing—that’s going to change,” he said.

  One mystery here was the NSA’s motive to break into Google at all, no matter where it happened. There was no obvious need for stealth, or none obvious to me right away. Under PRISM, the NSA already had compulsory access to any information that Google possessed about a foreign intelligence target. This access was hardly a secret from Google, though the public knew nothing about PRISM before the Snowden disclosures. The NSA sent classified orders to Google by the tens of thousands, specifying accounts it wanted to tap. As long as the agency asserted a lawful foreign intelligence purpose and sent orders in valid form, Google had to comply. Under separate authority from the FISA Court, the NSA could also gather communications in transit through U.S. cables from one foreign country to another. All that happened under streamlined legal procedures inside the United States. Why would the NSA resort to black bag operations against Google assets somewhere else?

  When I worked through related documents with Ashkan, we found nothing that spelled out the background or mechanics of the operation. More than a dozen presentations in the Snowden archive had references or brief asides that seemed relevant. We assembled one cryptic technical clue at a time: “international/fiber,” “private search index,” “Internal server-to-server authentication,” “GCHQ access environment,” “gaia protocol.” A few cover names repeated themselves: WINDSTOP, MUSCULAR, GHOSTMACHINE. Over and over we found signs of an operation that took place on a grand scale: “bulk access,” “full take,” “high volume.” The collection systems had to cope with “truly heinous” quantities of information, one briefing said. No wonder about that. Google accounted for the lion’s share of the whole world’s traffic in web search, email, photographs, chat, online documents, and videos.

  One week into the sleuthing, I described the Google cloud diagram to Snowden in a live chat over one of our secure links. There was more to the drawing than I have laid out here so far. I was looking at this:

  The box where encryption was “[a]dded and removed” was labeled “GFE.” That stood for Google Front End. The term referred to the computer servers—there were many—that connected Google’s internal systems to the browsers used by members of the public. When you check your email, your computer or smartphone is talking to a front end server. Google’s private infrastructure, I explained to Snowden, was shown on the right side of the diagram. It included boxes labeled “DC,” or data center. “Traffic in clear text here,” a caption said. No encryption, in other words. Only outside its digital property line, on the public internet, did Google armor its data with encryption. The NSA, somehow, was inside Google’s house.

  Snowden and I spoke in technical shorthand, but in essence I asked him, how did the NSA break in?

  “That’s a complicated topic and I can’t answer everything,” Snowden wrote.

  Then how about a more basic question: why would the NSA do this?

  Because it could, Snowden replied. The agency saw great heaps of information flowing unencrypted through conduits it could reach. “I’m speculating, but NSA doesn’t ignore low-hanging fruit,” Snowden wrote.

  One of the NSA documents used the same phrase. High-volume collection overseas could be “optimized,” it said, by focusing first on well-known “protocols and applications—‘low hanging fruit.’”

  Understanding dawned on me slowly in the coming days, no more than surmise at first. The NSA must be getting something abroad that it could not get at home. Location made all the difference as a legal matter. Domestic collection under FISA was inherently a discriminative act. PRISM offered rich access to Google accounts, but the NSA had to specify individual foreign targets. NSA analysts sent “selectors” to Google, most commonly in the form of email addresses, and targeting had to follow court-approved guidelines. Another way to say this is that the NSA had to know in advance whom it wanted to spy on. It could not use mass surveillance tools to discover unknown targets—not if the tools were employed on U.S. territory, that is. If the NSA reached into the Google cloud from overseas, on the other hand, there was a lot more room to maneuver. Analysts could gather a whole data stream and sift it with selection criteria that FISA did not permit. They could decide to collect, for example, any communication that used this version of that software or mentioned these key words on those network domains. They were bound to collect much more than they wanted that way, but they could subject the take to further filtering.

 
