Binney had quit because spying on Americans—without a warrant—was a red line for him, whether or not ostensibly lawful under presidential orders. As technical director of the NSA’s World Geopolitical and Military Analysis Reporting Group, he had helped design analytic tools to automate the production of a gargantuan social graph from foreign metadata. One day in October 2001, Binney said, he heard that a colleague, Ben Gunn, was overseeing the installation of new equipment behind a red-sealed door on the third floor of OPS 2B. Gunn was a hard-charging former employee of GCHQ, the United Kingdom’s counterpart agency, who had since become a U.S. citizen and joined the NSA. Subordinates told Binney in alarm, he said, that Gunn was adapting Binney’s software, called ThinThread, to analyze domestic calls. When I showed Binney the network diagram of American call data records being funneled to MAINWAY, he did a double take.
“That’s a name I’ve never spoken of,” he said. “That’s the program they used for STELLARWIND to reconstruct social networks. They take a data set of trillions of calls and collapse it down to a network diagram of who communicates with whom.” By “collapse,” a data science term that is also called “reduce,” Binney meant an analytic technique that strips out the granular detail—the dots and lines on the map—in order to summarize the hidden links of individuals and their overlapping social groups. The result was akin to a constellation pulled out of a skyscape of numberless stars.
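To make “collapse” concrete, here is a rough sketch of the operation in a few lines of Python. It is my own illustration, not anything drawn from NSA code; the record layout and the sample numbers are invented. It reduces a pile of call detail records to a simple tally of who communicated with whom, discarding the dates, times, and durations along the way.

```python
from collections import Counter

# Each call detail record: (caller, callee, timestamp, duration in seconds).
# The field layout and sample values are invented for illustration.
cdrs = [
    ("202-555-0101", "301-555-0123", "2011-03-01T09:14", 180),
    ("202-555-0101", "301-555-0123", "2011-03-02T17:40", 95),
    ("301-555-0123", "410-555-0199", "2011-03-02T18:02", 310),
]

def collapse(records):
    """'Collapse' (or 'reduce') raw calls into a weighted contact graph:
    the granular detail of each call is discarded, and all that remains
    is who communicated with whom, and how often."""
    edges = Counter()
    for caller, callee, _ts, _dur in records:
        edges[frozenset((caller, callee))] += 1
    return edges

for pair, count in collapse(cdrs).items():
    a, b = sorted(pair)
    print(f"{a} <-> {b}: {count} calls")
```

Run across trillions of records rather than three, the same reduction yields the constellation Binney described: clusters of heavily weighted links standing out from the noise.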
Crucially, Binney confirmed to me that the techniques he devised did not confine themselves to individual targets. They computed social graphs for every caller in the gargantuan data set.
“The software does build a profile on everybody in the database, whether or not the analysts look at it,” he said. If they did choose to look, he said, they could track individuals in fine detail by extracting a timeline from the index of individual calls. That information, in turn, was enriched by metadata and content drawn from other NSA repositories—for example, from PINWALE, which is a database of the content of intercepted emails and other digital text.
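The timeline Binney mentions falls out of the same index. Continuing the toy records from the sketch above (again my own illustration, with invented names), an analyst’s query amounts to filtering the raw calls for a single selector and sorting them by time; enrichment from repositories such as PINWALE would layer content on top of this bare skeleton.

```python
def timeline(records, selector):
    """Pull every call that touches one selector and put it in time order,
    turning the flat call index into a day-by-day account of one person."""
    events = [
        (ts, caller, callee, dur)
        for caller, callee, ts, dur in records
        if selector in (caller, callee)
    ]
    return sorted(events)

# Reuses the toy cdrs list defined in the earlier sketch.
for ts, caller, callee, dur in timeline(cdrs, "301-555-0123"):
    print(f"{ts}  {caller} -> {callee}  ({dur}s)")
```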
The U.S. call records were supposed to be segregated from other data sets in MAINWAY, with special permission required for access. That is what the network diagram on this page conveys with its depiction of FISA-authorized “partitions.” By regulation and practice that restriction all but disappeared in November 2010, when the Signals Intelligence Directorate began operating under new and more permissive rules that Attorney General Michael Mukasey had approved.
An NSA summary circulated to analysts celebrated their new freedom to conduct “better, faster analysis” without painstaking efforts to protect the privacy of U.S. citizens and residents. Until then, foreign intelligence analysts were required to ensure that a telephone number (or any other “selector,” as the NSA calls search terms) did not belong to an American before using it in contact chaining analysis. The new procedures approved by Mukasey allowed the NSA staff to calculate social graphs “from and through any selector, irrespective of nationality or location.” That is, a U.S. telephone number could be used at the beginning, middle, or end of a contact chain, under no more restriction than a foreign intelligence target. The same change applied to British, Australian, and other allied Five Eyes nationals who were normally off-limits. Analysts needed no special permission from superiors to incorporate Americans into their work. The value of this change, the summary memo exulted, was that “it enables large-scale graph analysis on very large sets of communications metadata without having to check foreignness of every node or address in the graph.”
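Put in graph terms, the change is easy to state. Contact chaining is a breadth-first walk outward from a seed selector, and the old procedure required a foreignness determination before the walk could begin at, or pass through, any given number. The sketch below is my own, with an invented is_us_person check standing in for the agency’s actual validation step; it is meant only to show where the old restriction sat in the algorithm and what removing it means.

```python
from collections import deque

def is_us_person(selector):
    # Stand-in for the agency's real foreignness determination; this
    # toy version just treats a few area codes as American.
    return selector.startswith(("202", "301", "410"))

def contact_chain(graph, seed, hops, allow_us_persons):
    """Breadth-first contact chaining out to `hops` degrees from a seed.
    `graph` maps each selector to the set of selectors it talked to.
    Under the old procedure the walk stopped at any selector believed to
    belong to an American; under the newer rules, nationality no longer
    prunes the chain."""
    if not allow_us_persons and is_us_person(seed):
        return set()  # old rule: a chain could not even begin from an American selector
    seen, frontier = {seed}, deque([(seed, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue
        for neighbor in graph.get(node, ()):
            if neighbor in seen:
                continue
            if not allow_us_persons and is_us_person(neighbor):
                continue  # old rule: the chain may not pass through this node
            seen.add(neighbor)
            frontier.append((neighbor, depth + 1))
    return seen
```

With allow_us_persons set to True, a U.S. number can sit at the beginning, middle, or end of a chain, which is precisely the latitude the summary memo celebrated.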
In 2012, a year before Snowden’s disclosures, NSA director Keith Alexander had ventured to Def Con, the annual Las Vegas hackers’ conference. It was a bold excursion into unfriendly terrain, intended to change minds and find recruits. NSA managers saw Def Con as a potential talent pool, albeit a hostile one. The conference had a notorious “Spot the Fed” contest, offering prizes for spotting government agents in the crowd. Alexander gamely took the stage.
“Does the NSA really keep a file on everyone?” Def Con’s founder, Jeff Moss, asked him bluntly.
“No, we don’t. Absolutely not,” Alexander replied. “And anybody who would tell you that we’re keeping files or dossiers on the American people knows that’s not true.”
Nobody had an inkling then of the Graph-in-Memory. No one knew the NSA and FBI were ingesting our call records. No concrete evidence was available to prove Alexander wrong. Did he take advantage of our collective ignorance to lie?
The tech-speak of electronic surveillance—“metadata,” “social graph”—makes it hard to see how systematic the public gaslighting had become. It is difficult, as I tried to illustrate with the tongue-in-cheek story of a trillion-teen house party, to translate the jargon into everyday terms that make sense. Sometimes, even so, it is illuminating to try. So let’s suppose you find a G-man standing by your phone, fountain pen and leather notebook in hand. He writes down the numbers you dial and the incoming caller IDs. He times each call on a pocket watch stored in his vest. At the end of the day he sends his notebook to FBI headquarters. Fleets of trucks dispatch all those notebooks, more than ten thousand tons a day, up the Baltimore–Washington Parkway to Fort Meade. Armies of clerks copy every page of every notebook onto a parchment roll that stretches coast to coast. Millions more labor in shifts, around the clock, to cross-reference each line, prepare an index, and draw a gargantuan map.
One of the dots on that map is the phone in your pocket or purse. Another is the phone on your desk or your wall, if you bother to keep a landline anymore. The NSA map traces paths to your spouse, your boss, your shrink, your landlord, your bookie, and perhaps to someone you would rather not mention at all. The good news is that the government is not in the business of random snooping and will probably never look at your small corner of that great big map. But it could. It assumed that power when it decided, in secret, to collect and analyze your calls. What’s drawn on the map stays on the map—until the day that someone decides otherwise.
The fanciful apparatus I just described could never be built in the analog world, of course. The personnel required would dwarf the population at large. Pencil and parchment consumption would deforest a continent. In all of human history, until right now, the tools of our species could not have accomplished surveillance on this scale. Digital technology made it possible. The government made it real, and never asked the public for consent.
Our intuitions do not work very well on complex, abstract questions. That is where our thought experiment helps. The imaginary G-men and notebooks and clerks do only some of the things that MAINWAY did with our telephone records, but at least we can picture their work. Big Data techniques may leave us unmoved, but most of us know what we would think if the fellow with the fountain pen turned up at our door. And the parchment model does not begin to capture the most important danger posed by the telephone records.
The real MAINWAY is, in essence, a surveillance time machine. It can reach back into the past and place eyes on events in a span of hours or days or weeks that had held no interest for the NSA at the moment they happened. That is because MAINWAY and the Graph-in-Memory keep copies of every map they draw. Remember: the FISA Court allowed the NSA to hold on to telephone logs for five years. Every call record included a date, time, and duration. The social graph—the relationship map—could change dramatically from day to day. Telephone accounts were opened and closed. Contacts spoke frequently for a week, then fell silent for a year. MAINWAY could rewind, review, and advance the map like a planetarium display. With a little imagination you can probably spot the risk in your own profession or personal life. Here is mine. I make the rounds of confidential journalistic sources, believing I am flying under the radar because my story is yet to come. There is no reason for the government to monitor the people I talk to, no reason to take the considerable trouble of obtaining a warrant on me. But that offers no protection now, because the government can look back as soon as it judges my work to pose a risk to protected national security information. It is easier for investigators to spy on my sources than on me. There is a lower legal bar for surveillance of someone who signs the security contract. Investigators make a list of who knew the things I wrote about. Which of them showed contact with me in the month before my story? Which of them received calls from a burner phone bought for cash? Whose IP or MAC addresses, the telltales of a device on the web, logged on to newly created accounts? Whose movements overlapped with mine? If my source and I do not take special care, the answers are obvious. The government can watch us in retrospect as easily as if it had tracked us in real time. That is something entirely new, impossible until it had access to metadata in sufficient volume.
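The time machine is not only a figure of speech; with dated records it is a filter. One last sketch, continuing the toy records above and entirely my own construction, rebuilds the contact graph as it stood in any past interval, which is all an investigator needs in order to ask who was in touch with a reporter in the month before a story ran.

```python
def graph_as_of(records, start, end):
    """Rebuild the contact graph as it stood during one slice of the past:
    keep only the calls whose timestamps fall inside [start, end), then
    collapse them. Because the records are retained for years, the same
    question can be asked about any window, long after the calls happened."""
    window = [r for r in records if start <= r[2] < end]
    return collapse(window)

# Who was in contact with a given number during March 2011?
target = "202-555-0101"
march = graph_as_of(cdrs, "2011-03-01", "2011-04-01")
contacts = {other for pair in march if target in pair for other in pair if other != target}
print(contacts)
```

If my source and I showed up together in that window, the map already says so; no one needed to be watching at the time.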
Ordinarily we think of surveillance in terms of its targets. The NSA is watching you, or me, or her, or them. Targeters have to know where to point their antenna. With bulk collection and a mass social graph, all that changes. There was no reason to track my contacts before the leak, but MAINWAY and associated tools could do it just as well in retrospect.
Which brings us back to Keith Alexander at Def Con. Credit where due: he did not lie about the dossiers. Not if we take the word to mean manila folders in a file cabinet somewhere. The NSA maintained no individual files on us, not even in digital form. The technology does not work that way. Something even more revealing, though, resided in MAINWAY. Our dossiers floated formlessly in a classified cloud, precomputed and untouched until someone asked for them. They were ghosts in the Graph-in-Memory, summoned on demand.
I am well aware that a person could take this line of thinking too far. Maybe I have. The United States is not East Germany. As I pieced this picture together, I had no reason to believe the NSA made corrupt use of its real-time map of American life. The rules imposed some restrictions on use of U.S. telephone records, even after Mukasey blew a hole in them. Only twenty-two top officials, according to the Privacy and Civil Liberties Oversight Board, had authority to order a contact chain to be built from data in MAINWAY’s FISA partitions. But history has not been kind to the belief that government conduct always follows rules or that the rules will never change in dangerous ways. Rules can be bypassed or rewritten—with or without notice, with or without malignant intent, by a few degrees at a time or more than a few. Government might decide one day to look in MAINWAY or a comparable system for evidence of violent crime, or any crime, or any suspicion. Governments have slid down that slope before. Within living memory, Richard Nixon had ordered wiretaps of his political enemies. The FBI, judging Martin Luther King Jr. a “dangerous and effective Negro,” had used secret surveillance to record his sexual liaisons. A top lieutenant of J. Edgar Hoover invited King to kill himself or face exposure.
Meaningful abuse of surveillance had come much more recently. The FBI illegally planted hundreds of GPS tracking devices without warrants. New York police spied systematically on mosques. Governments at all levels used the power of the state most heavy-handedly, sometimes illegally, to monitor communities disadvantaged by poverty, race, religion, ethnicity, and immigration status. As a presidential candidate, Donald Trump threatened explicitly to put his opponent in jail. Once in office he asserted the absolute right to control any government agency. He placed intense pressure on the Justice Department, publicly and privately, to launch criminal investigations of his critics.
The Graph-in-Memory knew nothing of such things. It had no awareness of law or norms or the nature of abuse. It computed the chains and made diagrams of our hidden relationships on a vast, ever-updating map. It obeyed its instructions, embedded in code, whatever those instructions said or ever might say.
* * *
In the national security establishment’s conversation with itself, Snowden’s name came up often and unfondly. There were exceptions, not many. Some Aspen participants saw the virtue of a more transparent debate about surveillance. Others thought the NSA had gone too far. Negroponte leaned in to tell me quietly, an hour after our panel, that the telephone records collection produced nothing important.
Visiting foreign officials, another minority, took tart exception to some of what they had learned. I found Gilles de Kerchove, the European Union’s counterterrorism coordinator, alone in an upstairs lounge, looking disgruntled. For a moment it looked as though he would wave me off. Then he nodded, as if to himself, and recounted the morning, three weeks before, when he learned from Der Spiegel that his fax machine had been compromised by an NSA surveillance device. At fifty-six, after years of interaction with American counterparts, de Kerchove did not regard himself as naïve. But a recent remark from the NSA director, in private, had left him sputtering. “‘Everybody knows. Everybody does [it]’—Keith Alexander said that,” de Kerchove told me. “I don’t like the idea that the NSA will put bugs in my office. No. I don’t like it. No. Between allies? No. I’m surprised that people find that noble.”
Most of the Americans here could speak of nothing but Snowden’s betrayal. Many accused him, in nearly identical words, of breaking a sacred oath when he handed NSA documents to me. That assertion, made in all apparent sincerity, was a fascinating artifact of secrecy culture. Clearly Snowden had breached a strongly felt norm, in addition to any offenses under the law, but the oath that people kept talking about does not exist. They were conflating two separate things in their minds, incorporating one into the other. Snowden, like anyone else with a clearance, had signed Standard Form 312, the Classified Information Nondisclosure Agreement. That is a government contract with criminal penalties for noncompliance. The oath Snowden swore, the same one recited by most of his accusers, obliged him to “support and defend the Constitution of the United States against all enemies, foreign and domestic.” He and his accusers interpreted that differently.
Some insiders asked incredulously why the government even allowed me to keep writing stories about the classified archive. “As a test of your concern for the security of this country, I suggest you turn over to the FBI everything you received from Snowden,” George Cotter later wrote to me, his retirement as NSA chief scientist affording him freedom to be quoted by name. Others, speaking off the record, said in earnest that the FBI should not wait for me to volunteer.
Undercurrents of grievance eddied through the bars and outdoor spaces. Many of those present were baffled by the distrust of the body politic. Shortly before the forum began, a national poll found that more people regarded Snowden as a whistleblower than as a traitor. Registered voters displayed “a massive shift in attitudes,” the survey found, with a majority agreeing for the first time in years that “the government’s anti-terrorism efforts go too far.” Members of the Aspen tribe believed in their mission. How could the public and press fail to understand that they were trying to protect us? Who could conceive such a thing as too much intelligence? The president, Congress, and the public at large forgave no failure by intelligence agencies to spot trouble before it arrived. In a corridor I overheard one man say, “Guy boards an airplane with a crotch full of TATP, too dumb to ignite his own underwear, and that’s supposed to be a systematic failure of the intelligence community. Now the same people tell us NSA should scale back.”
Most of the men and women here had spent whole careers inside the sealed habitat of “the high side,” the classified networks that wall off their working environment. Pretty much everyone inside those walls relied to some degree on signals intelligence. The reach of the apparatus came to seem routine. “I did not have a whole lot of time to step back and think about philosophical questions,” said a recent retiree from the uppermost ranks of U.S. intelligence. “None of us did. We were too busy putting out fires and managing a complex set of problems and answering to masters who wanted to know how the hell we had missed whatever we didn’t see coming.” Policymakers, spies, and analysts accepted the surveillance apparatus as they found it. They tried to use it wisely. They were altogether unprepared for the suspicion and anger aroused when ordinary people caught a glimpse of the machinery.
Raj De, the NSA’s soft-spoken general counsel, accepted that there could be legitimate disagreements about the boundaries. “I just hate the undertone of, ‘You guys are lying,’” he told me. “People can say, ‘We don’t agree with that.’ It’s just—I hate the tenor of the whole thing.”
Negroponte looked for a dispassionate explanation. “I think the ghost of J. Edgar Hoover and the ghost of Richard Nixon have long been exorcised, but they still cast this baleful influence on some of the things we’re doing,” he said.
Some of the sentiments I heard at Aspen reminded me of a presentation I had recently come across in the Snowden archive. It dated back to the fall of 2001. In seven pages, marked TOP SECRET//COMINT/NOFORN/X1, the NSA took inventory of the capabilities it must preserve at all cost in the coming war with al Qaeda. Along the way, the agency ranked its own place in the hierarchy of “critical national assets.” The “ability of the U.S. to conduct SIGINT Operations,” according to this analysis, was a Level 1 national resource. The heading on that column was “Survival,” reserved for assets “without which America would cease to exist as we know it.” The organizational health of the Signals Intelligence Directorate qualified as Level 2, or “Critical,” which meant “causally one step removed from survival.”
The author of these slides was a career civil servant, fairly senior but nowhere near the top. I wondered whether the views expressed were merely her own. The answer became clear when her name and this presentation turned up in a memorandum from Joseph J. Brand. At the time he was chief of staff for policy in the Signals Intelligence Directorate. Brand had commissioned the slide deck and marked it for distribution to the intelligence community at large. He cited it in support of a special achievement award for its author.