Dark Territory


by Fred Kaplan


  For his part, Mudge was always happy to give them advice and never charged a fee. He figured that, any day now, the feds could come knocking at the warehouse door—some of the L0pht gang’s projects were of dubious legality—and it would be useful to summon, as character witnesses, the directors of the nation’s intelligence and law enforcement agencies.

  For the next few hours on that winter night in Watertown, the L0pht gang held Clarke’s rapt attention, telling him all the things they could do, if they wanted. They could break the passwords stored on any operating system, not just Microsoft Windows. They could decrypt any satellite communications. They had devised software (not yet for sale or distribution) that could hack into someone’s computer and control it remotely, spying on the user’s every keystroke, changing his files, tossing him off the Internet or whisking him away to a site of their choosing. They had special machines that let them reverse-engineer any microchip by de-capping the chip and extracting the silicon die. In hushed tones, they told him about a recent discovery, involving the vulnerability of the Border Gateway Protocol, a sort of supra-router for all online traffic, which would let them—or some other skilled hackers—shut down the entire Internet in a half hour.
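
  To make the password-cracking claim concrete: the general idea behind offline cracking is to hash candidate passwords the same way the target system does and compare the results against a stolen hash. The Python sketch below assumes, purely for illustration, an unsalted SHA-1 hash and a tiny made-up wordlist; the Windows stores of that era actually used the LM and NTLM schemes, and L0phtCrack’s real techniques were considerably more sophisticated.

```python
# Minimal sketch of offline dictionary cracking against an unsalted hash.
# Illustrative only: the hash function, wordlist, and "stolen" hash are all
# made up for this example; this is not how L0phtCrack actually worked.
import hashlib


def hash_candidate(password: str) -> str:
    """Hash a candidate the same way the (hypothetical) target system does."""
    return hashlib.sha1(password.encode("utf-8")).hexdigest()


def crack(stolen_hash: str, wordlist: list[str]) -> str | None:
    """Return the candidate whose hash matches, or None if the wordlist misses."""
    for candidate in wordlist:
        if hash_candidate(candidate) == stolen_hash:
            return candidate
    return None


if __name__ == "__main__":
    wordlist = ["letmein", "password", "trustno1", "hunter2"]
    stolen = hash_candidate("trustno1")   # pretend this came out of a password file
    print(crack(stolen, wordlist))        # -> trustno1
```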

  Clarke didn’t know whether to believe everything they said, but he was awed and shaken. Everyone who’d briefed him, during his crash course on the workings and pitfalls of the Internet, had implied or stated outright that only nation-states possessed the resources to do even half the things that Mudge and his chums were saying—and, in some cases, demonstrating—that they could do from this hole in the wall with little money and, as far as he could tell, no outside support. In short, the official threat model seemed to have it all wrong.

  And Clarke, the president’s special adviser on counterterrorism, realized that this cyber thing was more than an engrossing diversion; it fit into his bailiwick precisely. If Mudge and his gang used their talents to disrupt American society and security, exploiting the critical vulnerabilities that the Marsh Report had outlined, they would be tagged as terrorists—cyber terrorists. Here, then, was another threat for Clarke to worry about—and to add to his thickening portfolio.

  It was two a.m., after a few more drinks, when they decided to call it a night. Clarke asked them if they’d like to come down to Washington for a private tour of the White House, and he offered to pay their way.

  Mudge and the others were startled. “Hackers”—which is what they were—was still a nasty term in most official corridors. It was one thing for some spook in a three-letter agency to invite them to brief a roomful of other spooks on a hush-hush basis—quite another to be invited to the White House by a special adviser to the president of the United States.

  A month later, they came down, not only to see the West Wing after hours but also to testify before Congress. The Senate Governmental Affairs Committee happened to be holding hearings on cyber security. Through his Hill contacts, Clarke got the L0pht members—all seven of them, together, using their pseudonyms—placed on the witness list.

  Clarke had a few more conversations with Mudge during this period. His real name, it turned out, was Peiter Zatko. He’d been a hacker since his early teen years. He hated the movie WarGames because it encouraged too many other people his age, but nowhere near his IQ, to join the field. He’d graduated not from someplace like MIT, as Clarke suspected, but from the Berklee College of Music, as a guitar major, at the top of his class. By day, Zatko was working as a computer security specialist at BBN, a Cambridge-based firm, though his growing public profile accelerated his plans to quit and turn the L0pht into a full-time commercial enterprise.

  He and the other L0pht denizens made their Capitol Hill debut on May 19, 1998. Only three senators attended the hearing—the chairman, Fred Thompson, along with John Glenn and Joe Lieberman—but they treated the bizarre witnesses with respect, hailing them as patriots, with Lieberman likening them to Paul Revere, alerting the citizenry to danger in the digital age.

  Three days after Mudge’s testimony, Clinton signed a Presidential Decision Directive, PDD-63, titled “Critical Infrastructure Protection,” reprising the Marsh Commission’s findings—the nation’s growing dependence on computer networks, the vulnerability of those networks to attack—and outlining ways to mitigate the problem.

  A special panel of the NSC, headed by Rand Beers, had cut-and-pasted early drafts of the directive. Then, at one of the meetings, Beers informed the group that he was moving to the State Department and that Dick Clarke—who, for the first time, was seated next to him—would take his place on the project.

  Several officials on the panel raised their eyebrows. Clarke was a brash, haughty figure, a spitball player of bureaucratic politics, admired by some as a can-do operator, despised by others as a power-grabbing manipulator. John Hamre, the deputy secretary of defense, particularly distrusted Clarke. Several times Hamre heard complaints from four-star generals, combatant commanders in the field, that Clarke had directly phoned them with orders, rather than going through the secretary of defense, as even the president was supposed to do. Once, Clarke told a general that the president wanted to move a company of soldiers to the Congo during a crisis; Hamre looked into it, and found out the president had asked for no such thing. (Clinton eventually did sign the order, but to Hamre and a number of generals, that didn’t excuse Clarke’s presumptuousness.)

  Hamre’s resentment had deeper roots. A few times, when he was the Pentagon’s comptroller, he found Clarke raiding the defense budget for “emergency actions,” purportedly on behalf of the president. Clarke invoked legal authority for this maneuver—an obscure clause that he’d discovered in the Foreign Assistance Act, Section 506, which allowed the president to take up to $200 million from a department’s coffers for urgent, unfunded requirements. Hamre had enough headaches, dealing with post-Cold War budget cuts and pressure from the chiefs, without Clarke swooping down and treating the Pentagon like his piggy bank.

  As a result, although they held similar views on several issues, not just cyber security, Hamre hid things from Clarke, sometimes briefing other department deputies in private, rather than in a memo or an NSC meeting, in order to keep Clarke out of the loop.

  Around the time of Solar Sunrise and Moonlight Maze, a special prosecutor happened to be investigating charges that President Clinton and the first lady had, years earlier, illegally profited from a land deal in Arkansas. Orders went out from the White House counsel, barring all contact between the White House and the Justice Department, unless it went through him. Clarke ignored the order (he once told an NSA lawyer, “Bureaucrats and lawyers just get in the way”) and kept calling the FBI task force for information on its investigation of the hackings. Louis Freeh, the bureau’s director, who didn’t like Clarke either, told his underlings to ignore the calls.

  But Clarke had protectors who valued his advice and gumption. When one agency head urged Sandy Berger, the national security adviser, to fire Clarke, Berger replied, “He’s an asshole, but he’s my asshole.” The president liked that Clarke was watching out for him, too.

  Midlevel staffers were simply amazed by the network that Clarke had woven throughout the bureaucracy and by his assertiveness in running it. Once, shortly after coming over from the NSA to be Vice President Gore’s intelligence adviser, Rich Wilhelm sat in on a meeting of the NSC counterterrorism subgroup, which Clarke chaired. High-ranking officers and officials, from all the relevant agencies and departments, were at the table, and there was Clarke, this unelected, unconfirmed civilian, barking out orders to an Air Force general to obtain an unmarked airplane and telling the CIA how many agents should board it, all with unquestioned authority.

  An aide to Clarke named John McCarthy, a Coast Guard commander with a background in emergency management, attended a Saturday budget meeting, early on in his tenure, where Clarke, upon hearing that an important program fell $3 million short of its needs, told McCarthy to get the money from a certain person at the General Services Administration, adding, “Do it on Monday because I need it on Tuesday.” The GSA official told McCarthy he’d give him $800,000, at which point the bargaining commenced. Clarke wound up getting nearly the full sum.

  When Clarke replaced Rand Beers, the NSC deputies had been drafting the presidential directive on the protection of critical infrastructure, going back and forth on disagreements and compromise language. Clarke took their work, went back to his office, and wrote the draft himself. It was a detailed document, creating several forums for private-public cooperation on cyber security, most notably Information Sharing and Analysis Centers, in which the government would provide its expertise—including, in some cases, classified knowledge—to companies in the various sectors of critical infrastructure (banking, transportation, energy, and so forth), so they could fix their vulnerabilities.

  According to the directive, as Clarke wrote it, this entire effort would be chaired by a new, presidentially appointed official—the “National Coordinator for Security, Infrastructure Protection, and Counter-terrorism.” Clarke made sure, in advance, that he would be this national coordinator.

  His detractors, and some of his admirers, saw this as a blatant power grab: he already had the counterterrorism portfolio; now he’d be in charge of critical infrastructure, too. Some of his critics, especially in the FBI, saw it as a substantively bad idea, to boot: cyber threats came mainly from nation-states and criminals; tying the issue to counterterrorism would misconstrue the problem and distract attention from serious solutions. (The idea also threatened to sideline the FBI, which, in the Solar Sunrise and Moonlight Maze investigations, had taken a front-and-center role.)

  Clarke waved away the accusations. First, as was often the case, he regarded himself as the best person for the job: he knew more about the issues than anyone else in the White House; ever since the problem had arisen, he and Beers were the only ones to give it more than scant attention. Second, his meetings with Mudge convinced him—he hadn’t considered the notion before—that a certain kind of terrorist could pull off a devastating cyber attack; it made sense, he coolly explained to anyone who asked, to expand his portfolio in this direction.

  As usual, Clarke got his way.

  But his directive hit an obstacle with private industry. In Section 5 of PDD-63, laying down “guidelines,” Clarke wrote: “The Federal Government shall serve as a model to the private sector on how infrastructure assurance is best achieved and shall, to the extent feasible, distribute the results of its endeavors.”

  This is what the corporate executives most feared: that the government would be running the show; more to the point, that they would be saddled with the nastiest word in their dictionary—regulations. They’d sensed the same threat when they met with the Marsh Commission: here was an Air Force general—and, though retired, he referred to himself as General Marsh—laying down the rules on what they must do, as if they were enlisted men. And now here was Dick Clarke, writing under the president’s signature, trying to lay down the law.

  For several months now, these same companies had been working in concert with Washington, under its guidelines, to solve the Y2K crisis. This crisis—also known as the Millennium Bug—emerged when someone realized that some of the government’s most vital computer programs had encoded years (dates of birth, dates of retirement, payroll periods, and so forth) by their last two digits: 1995 as “95,” 1996 as “96,” and so forth. When the calendar flipped to 2000, the computers would read it as “00,” and the fear was that they’d interpret it as the year 1900, at which point, all of a sudden, such programs as Social Security and Medicare would screech to a halt: the people who’d been receiving checks would be deemed ineligible because, as far as the computers could tell, they hadn’t yet been born. Paychecks for government employees, including the armed forces, could stall; some critical infrastructure, with time-coded programs, might also break down.
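
  The arithmetic behind the fear is simple enough to show. Below is a minimal Python sketch, with a made-up “years of service” calculation standing in for the kind of date math those payroll and benefits programs did; the field names and behavior are illustrative assumptions, not taken from any real system.

```python
# Minimal sketch of the Y2K two-digit-year bug: the program stores only the
# last two digits of each year and implicitly assumes the century is 1900.
# The "years of service" field is a made-up stand-in for real payroll logic.

def years_of_service(hire_yy: int, current_yy: int) -> int:
    """Naive two-digit arithmetic, as many older programs did it."""
    return current_yy - hire_yy


# In 1999 (stored as 99), an employee hired in 1970 (stored as 70) looks fine:
print(years_of_service(70, 99))   # 29

# After the rollover, 2000 is stored as 00, and the same record goes haywire:
print(years_of_service(70, 0))    # -70: the program now thinks the hire date lies in the future
```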

  To deal with the problem, the White House set up a national information coordination center to develop new guidelines for software and to make sure everyone was on the same page. The major companies, such as AT&T and Microsoft, were brought into the same room with the FBI, the Defense Department, the General Services Administration, the NSA—all the relevant agencies. But the corporate executives made clear that this was a one-time deal; once the Y2K problem was solved, the center would be dismantled.

  Clarke wanted to make the arrangement permanent, to turn the Y2K center into the agency that handled cyber threats. Sometime earlier, he’d made no secret of his desire to impose mandatory requirements on cyber security for critical infrastructure, knowing that the private companies wouldn’t voluntarily spend the money to take the necessary actions. But Clinton’s economic advisers strenuously opposed the idea, arguing that regulations would distort the free market and impede innovation. Clinton agreed; Clarke backed down. Now he was carving a back door, seeking to establish government control through a revamped version of the Y2K center. That was his agenda in taking over the drafting of the presidential directive—and the companies weren’t buying it.

  Their resistance put Clarke in a bind. Short of imposing strict requirements, which the president had already struck down, he needed private industry onboard to make any cyber security policy work: the vast majority of government data, including a lot of classified data, flowed through privately controlled networks; and, as the Marsh Report had shown, the vulnerability of private entities—the critical infrastructures—had grave implications for national security.

  Clarke also knew that, even if the government did take control of Internet traffic, few agencies possessed the resources or the technical talent to do much with it—the exceptions being the Defense Department, which had the authority only to defend its own networks, and the NSA, which had twice been excluded from any role in monitoring civilian computers or telecommunications: first, back in 1984, in the aftermath of Ronald Reagan’s NSDD-145; and, again, early on in the Clinton presidency, during the Clipper Chip controversy.

  Clarke spent much of the next year and a half, in between various crises over terrorism, writing a 159-page document called the National Plan for Information Systems Protection: Defending America’s Cyberspace, which President Clinton signed on January 7, 2000.

  In an early draft, Clarke had proposed hooking up all civilian government agencies—and, perhaps, eventually critical infrastructure companies—to a Federal Intrusion Detection Network. FIDNET, as he called it, would be a parallel Internet, with sensors wired to some government agency’s monitor (which agency was left unclear). If the sensors detected an intrusion, the monitor would automatically be alerted. FIDNET would unavoidably have a few access points to the regular Internet, but sensors would sit atop those points and alert officials of intrusions there, as well. Clarke modeled the idea on the intrusion-detection systems installed in Defense Department computers in the wake of Solar Sunrise. But that was a case of the military monitoring itself. To have the government—and, given what agencies did this sort of thing, it would probably be the military—monitoring civilian officials, much less private industry, was widely seen, and loathed, as something different.
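
  As a rough illustration of the architecture the draft described, here is a minimal Python sketch of the sensor-to-central-monitor pattern. The detection rule, the event fields, and the agency name are all hypothetical, and, as noted above, the draft never specified which agency would actually run the monitor.

```python
# Minimal sketch of the FIDNET-style pattern: sensors on agency networks
# forward suspected intrusions to a single government-run monitor.
# The heuristic, event shape, and agency name are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Event:
    source_ip: str
    failed_logins: int


class CentralMonitor:
    """The single collection point the draft envisioned (agency unspecified)."""

    def alert(self, agency: str, event: Event) -> None:
        print(f"[ALERT] {agency}: {event.failed_logins} failed logins from {event.source_ip}")


class Sensor:
    """Sits on one agency's network and flags events that look like intrusions."""

    def __init__(self, agency: str, monitor: CentralMonitor, threshold: int = 5):
        self.agency = agency
        self.monitor = monitor
        self.threshold = threshold

    def observe(self, event: Event) -> None:
        if event.failed_logins >= self.threshold:   # crude stand-in for real detection logic
            self.monitor.alert(self.agency, event)


if __name__ == "__main__":
    monitor = CentralMonitor()
    sensor = Sensor("Department of Commerce", monitor)
    sensor.observe(Event("203.0.113.7", failed_logins=12))   # raises an alert
    sensor.observe(Event("198.51.100.4", failed_logins=1))   # below threshold, stays quiet
```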

  When someone leaked Clarke’s draft to The New York Times, in July 1999, howls of protest filled the air. Prominent members of Congress and civil-liberties groups denounced the plan as “Orwellian.” Clarke tried to calm these fears, telling reporters that FIDNET wouldn’t infringe on individual networks or privacy rights in the slightest. Fiercer objections still came from the executives and board members of the infrastructure companies, who lambasted the plan as the incarnation of their worst nightmares about government regulation.

  The idea was scuttled; the National Plan was rewritten.

  When the revision was finished and approved six months later, President Clinton scrawled his signature under a dramatic cover note, a standard practice for such documents. But, in a departure from the norm, Clarke—under his own name—penned a separate introduction, headlined, “Message from the National Coordinator.”
  In it, he tried to erase the image of his presumptuousness. “While the President and Congress can order Federal networks to be secured,” he wrote, “they cannot and should not dictate solutions for private sector systems,” nor will they “infringe on civil liberties, privacy rights, or proprietary information.” He added, just to make things clearer, that the government “will eschew regulation.”

  Finally, in a gesture so conciliatory that it startled friends and foes alike, Clarke wrote, “This is Version 1.0 of the Plan. We earnestly seek and solicit views about its improvement. As private sector entities make more decisions and plans to reduce their vulnerabilities and improve their protections, future versions of the Plan will reflect that progress.”

  Then, one month later, the country’s largest online companies—including eBay, Yahoo, and Amazon—were hit with a massive denial-of-service attack. Someone hacked into thousands of computers, few of which were protected in any way, and used them to flood the companies’ servers with endless requests for data, overloading them to the point where they shut down for several hours, in some cases days.

 
