Northrop got its first BACN contract in 2005, and it flew an experimental relay in February 2006. The black box was fitted onto a NASA-owned manned WB-57, the same plane used to map all of Afghanistan with hyperspectral precision. Flying over Southern California at 60,000 feet, BACN created a “forward-edge tactical server.” Marines on the ground tapped into real-time imagery and intercepts from collectors near and far, pulling down common situational awareness displays and current intelligence, and gaining access to network-management services, including the ability to send e-mails and make cellular calls over a military-only network completely divorced from the commercial Internet.4 In December 2008, a BACN-converted manned Bombardier business jet deployed to Afghanistan to serve as the quick-reaction capability to test the system operationally. Flying over a special operations or CIA mission otherwise taking place in a networked dead zone, the airplane could provide improvised connectivity for hours at a time. It could pull down what users at the edge needed directly from whatever was flying that possessed the right data. If a soldier queried the node and it did not have the requested data, its server would go out and poll other servers in the network to obtain it. BACN emerged not only because it served the soldiers at the edge and the new culture of constant contact but also because it was far cheaper and more flexible than leasing commercial communications. The Data Machine could now be extended anywhere, regardless of local capacity and without resort to commercial leases.
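The node behavior described here, answering from what it holds and otherwise polling the rest of the network, can be pictured with a minimal sketch. This is an illustration only, not BACN's actual software; the class and names are invented.

```python
# Minimal sketch (hypothetical, not BACN's actual software) of the relay behavior
# described above: answer a query from the local cache if possible, otherwise
# poll peer nodes in the network for the requested data.

class RelayNode:
    def __init__(self, name, peers=None):
        self.name = name
        self.cache = {}            # locally held data, keyed by request id
        self.peers = peers or []   # other airborne or ground nodes reachable over the link

    def store(self, key, data):
        self.cache[key] = data

    def query(self, key, visited=None):
        """Return data for `key`, polling peers on a local cache miss."""
        visited = visited or set()
        visited.add(self.name)
        if key in self.cache:
            return self.cache[key]
        for peer in self.peers:
            if peer.name in visited:
                continue                # avoid re-asking nodes already polled
            data = peer.query(key, visited)
            if data is not None:
                self.cache[key] = data  # keep a copy for the next user at the edge
                return data
        return None

# Usage: a ground user asks the overhead relay; the request is satisfied by
# whatever platform in the network happens to hold the imagery.
collector = RelayNode("sensor_platform")
collector.store("imagery:grid-41S", b"...")
relay = RelayNode("bacn_relay", peers=[collector])
print(relay.query("imagery:grid-41S") is not None)   # True
```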
In June 2009, Northrop Grumman received a quarter of a billion dollars for three additional BACN Bombardiers, in addition to two unmanned Global Hawks, newly outfitted with BACN capabilities.5 By November 2010, two Global Hawk BACNs, each with 300-mile-radius coverage when airborne, were providing about 50 percent of the requested 24/7 network support for the edge. The drones can stay up for days at a time. By the end of 2012, more than 3,000 missions had been flown.6 In late 2011, a year after the last of 300,000-plus American soldiers and contractors left Iraq, the Pentagon formally christened the now multibillion-dollar converted Global Hawk the EQ-4.7 No one noticed the party. In fact, in the eleven years of its existence from concept to deployment, from 2003 until this writing in late 2014, no mainstream media outlet has ever reported on the existence of this now-multibillion-dollar tool for waging war anywhere and anytime.
Perhaps one of the reasons for the media’s silence is that this system—described by its manufacturer as platform agnostic, radio agnostic, and untethered—is virtually impossible to describe. Global Express is the closest one comes to a military nickname that sticks, but in a stroke of geographic indifference to mountainous Afghanistan, the overall system has been tagged Desert Express. So it is not a weapon, not a sensor, and though Global Hawk is host, it is not really a drone in the way most people think about drones. As simply as can be defined, it is an alternate and exclusive military Internet in the sky, essential to shore up a weak spot in the Data Machine but really a secret agent of the vision of precision without location, loitering transformed into perpetual war-making.
From the first night of Afghanistan bombing in October 2001, when everyone boggled over the all-seeing eye for the first time, decision-makers at the CIA, the Pentagon, the White House, and command centers near and far were glued to their own DNN, the drone news network, everyone fully in thrall. Video was of course the simplest explanation, spawning epithets of Predator porn and “CAOC crack,”8 but what really appealed to a television-watching and image-obsessed generation was persistence. General Jumper called it the buzzword of the decade in 2003.9 Arguably the most important strategy document that the Pentagon prepares, the Quadrennial Defense Review, in 2006 argued that future capabilities needed to favor “systems that have far greater range and persistence; larger and more flexible payloads for surveillance or strike; and the ability to penetrate and sustain operations in denied areas.”10 BACN is the facilitator of anywhere and always. Now all that was needed was all the time.
The concept of persistence requires yet another family of black box sensors. Predator is up there like no other, but it provided far less than the persistence that was envisioned, at least beyond extremely narrow individual targeting that came from looking through a soda straw. It’s an “immediate-time kind of reporting,” one air force officer said, “of viewing exactly what’s going on with whatever your selected target is—whether that’s a house, a building, a vehicle moving down the road, whatever that is—you are able to then sit there and watch that. It’s very small. So I just see one vehicle or two or three vehicles at the most, but my field of view just isn’t that big on the ground.”11 Not only did the Predator camera show a limited perspective, but the raw imagery from the moving platform proved not so easy to interpret, the thirty- to forty-five-degree angle constantly changing as the drone moved. Scientists went to work on better processing, developing software and hardware that would provide georeferences (what we today call metadata) and even a converted top-down perspective that matched a scene-based correlation, virtually all of the advances being borrowed from graphics processors used in gaming applications. The other two avenues of attack were increasing the breadth of the perspective (wide area) and providing higher resolution, thus allowing greater exploitation of each imaged scene by the naked eye.
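The converted top-down perspective is essentially a plane-to-plane remapping: given a few georeferenced points, any pixel in the oblique frame can be projected onto ground coordinates. A minimal sketch of that idea follows, with invented numbers and no claim to match the actual processing chain.

```python
# A minimal sketch of the idea behind a "converted top-down perspective":
# fit a plane-to-plane homography from four known ground reference points
# (the georeferencing metadata), then map any oblique-image pixel to ground
# coordinates. The point values below are made up for illustration.
import numpy as np

def fit_homography(image_pts, ground_pts):
    """Solve for the 3x3 homography H mapping image (x, y) to ground (u, v)."""
    A, b = [], []
    for (x, y), (u, v) in zip(image_pts, ground_pts):
        A.append([x, y, 1, 0, 0, 0, -x * u, -y * u]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -x * v, -y * v]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def to_ground(H, x, y):
    """Project one oblique-image pixel to top-down ground coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Four image corners matched to surveyed ground positions (illustrative numbers).
H = fit_homography([(0, 0), (4000, 0), (4000, 3000), (0, 3000)],
                   [(0, 0), (900, 120), (870, 780), (-30, 700)])
print(to_ground(H, 2000, 1500))   # roughly the middle of the imaged patch
```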
Sonoma was the first experiment in widening the perspective, developed starting in 2003 by Lawrence Livermore National Laboratory in California. Using a novel mosaic-like sensor design that could view a wide area at high definition, the first prototype carried a 22-megapixel sensor (six times Predator’s resolution), the second a 66-megapixel sensor, and the third a 176-megapixel sensor, each capable of imaging a larger and larger area in a single frame. Where Predator’s normal sensor could image an area the size of a city block, Sonoma 2 could cover an area the size of downtown Washington, DC, and Sonoma 3 could see the entire metropolitan area. Such wide-area high-definition imaging exposes every corner. In one of the initial Sonoma experiments, an IED scenario was created—Red Team Intent—that assumed that any car that slowed down to five miles per hour for more than 100 feet was suspicious. Software was written that highlighted the paths of all vehicles matching this signature. Then, once the pattern was triggered, an analyst could rewind the video and discover where a suspicious vehicle came from. And Sonoma could track 8,000 simultaneous moving objects.
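The slow-roll signature lends itself to a few lines of logic: accumulate the distance a tracked vehicle covers while below the speed threshold and flag it once the run exceeds 100 feet. A minimal sketch of that assumed logic, not Sonoma's actual code:

```python
# A minimal sketch of the slow-roll signature described above (assumed logic):
# flag any track that stays under 5 mph while covering more than 100 feet of
# continuous path.
import math

def flag_slow_roll(track, speed_mph=5.0, distance_ft=100.0):
    """track: list of (t_seconds, x_feet, y_feet) points for one vehicle.
    Returns True if the vehicle crept along below the speed threshold
    for more than the distance threshold."""
    slow_run = 0.0   # feet covered in the current below-threshold stretch
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        step_ft = math.hypot(x1 - x0, y1 - y0)
        dt = t1 - t0
        speed = (step_ft / dt) * 3600 / 5280 if dt > 0 else 0.0  # ft/s -> mph
        if speed < speed_mph:
            slow_run += step_ft
            if slow_run > distance_ft:
                return True          # suspicious: rewind the archive from here
        else:
            slow_run = 0.0           # vehicle sped back up; reset the stretch
    return False

# Example: a vehicle creeping at about 3 mph (roughly 4.4 feet per second) for 30 seconds.
creeping = [(t, 4.4 * t, 0.0) for t in range(31)]
print(flag_slow_roll(creeping))   # True
```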
It was truly persistence, but in order for surveillance to be useful, an analyst must be able to see the data in real time. As the Livermore laboratory explained, “all data processing for one frame must be completed before the next frame is captured.” With data being collected at two frames per second, Sonoma’s data exceeded the bandwidth of available communications by a factor of 100 to 10,000. So scientists applied various techniques, including data compression, to show only movement (or anomalies), while the georeferenced static background was only episodically transmitted to match what the sensor was seeing.12
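The technique amounts to sending the static, georeferenced background only occasionally and otherwise transmitting just the pixels that changed. A minimal sketch of that idea, assumed rather than drawn from Livermore's implementation:

```python
# A minimal sketch of the bandwidth-saving idea described above (assumed
# implementation): send the full background frame only episodically, and
# otherwise send only the pixels that changed since the stored background.
import numpy as np

def encode_stream(frames, keyframe_every=100, threshold=10):
    """Yield ('full', frame) or ('delta', indices, values) messages."""
    background = None
    for i, frame in enumerate(frames):
        if background is None or i % keyframe_every == 0:
            background = frame.copy()
            yield ("full", frame)                      # episodic background update
        else:
            changed = np.abs(frame.astype(int) - background.astype(int)) > threshold
            idx = np.flatnonzero(changed)
            yield ("delta", idx, frame.ravel()[idx])   # movers and anomalies only
            background.ravel()[idx] = frame.ravel()[idx]

# Two synthetic 8-bit frames: a static scene, then one small "vehicle" appears.
f0 = np.zeros((100, 100), dtype=np.uint8)
f1 = f0.copy(); f1[50, 40:45] = 255
msgs = list(encode_stream([f0, f1]))
print(msgs[0][0], msgs[1][0], len(msgs[1][1]))   # full delta 5
```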
Sonoma turned into the Mohawk Stare experiment for the army and then into Constant Hawk, and in 2006, a prototype Constant Hawk wide-area persistent surveillance (WAPS) system was quietly deployed in Iraq, owned and operated by contractors.13 Constant Hawk could record and archive sensor data that allowed for playback of incidents, such as roadside IED bomb blasts, to be reviewed. Once an event occurred, the data was downloaded, and analysts attempted to backtrack from the incident, tracing bomb-makers and insurgents who might have deployed the IED and, if possible, following them backward even to their points of origin. They call this method of going backward to pick up clues forensic analysis. This was warfare completely turned on its head. Constant Hawk was an immediate hit. But the experimental black box was integrated on a manned airplane and not a drone, giving it limited time in the air. And it still produced enormous amounts of data, much more than could be moved very far, and in formats useful only for demonstration.14
Then, as these things go, the Los Alamos National Laboratory in New Mexico produced Angel Fire for the marines—smaller and more user friendly—and other wide-area and persistent programs came knocking.
More black boxes meant more data. And the introduction of wide-area surveillance, and particularly high definition,15 exponentially increased the amount of information available. Collection outpaced the ability to move the information, store it, or process it.16 As a result, the Pentagon admitted in 2009 that it was drowning in data. It was now looking at hundreds of terabytes of new data coming in every day. That’s over 800 laptops with the typical 128-gigabyte solid state drive, and more than the total of all the terabytes collected by the Library of Congress Web teams.17 “We’re going to find ourselves in the not too distant future swimming in sensors and drowning in data,” said Lieutenant General David Deptula, head of air force intelligence, in January 2010. And within a couple of years, Reapers would be carrying their own wide-area black boxes that would be able to track up to twelve different targets simultaneously, delivering 84 million pixels twice a second. “The iteration after that will jump to 30 and there are plans to eventually reach 65. That’s an increase from 39 possible video feeds [from Predators and Reapers] to more than 3,000 with a 50-CAP force,” Deptula said.18 Data pipes were filled and storage was approaching saturation levels.19
The next month, BAE Systems announced successful flight tests of its ARGUS-IS, a 1.8-billion-pixel camera with a resolution of six inches that can see a minimum of sixty-five “Predator-like” video windows across more than 100 square kilometers.20 And ARGUS would transmit at five times the frame rate of Constant Hawk, ten times a second.21 One minute of high-definition video of a city block already demanded one gigabyte; an 800-megapixel image of a small city—the resolution required to extract intelligence information at specific locations—demanded half of a terabyte per minute; ARGUS-IS, operating at 1,800 megapixels, could image a large city, demanding half a petabyte per minute of bandwidth if all of the data were transmitted.22
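A back-of-envelope calculation shows how pixel counts and frame rates become bandwidth. The sketch below assumes a single byte per pixel and no compression; the figures cited above also fold in bit depth, color, and the products derived from each frame, and are not directly comparable.

```python
# Rough back-of-envelope for raw sensor data rates, assuming one byte per pixel
# and no compression. Real figures depend on bit depth, color channels, and how
# much of each frame is actually transmitted.
def raw_rate_gb_per_min(megapixels, frames_per_sec, bytes_per_pixel=1):
    bytes_per_min = megapixels * 1e6 * bytes_per_pixel * frames_per_sec * 60
    return bytes_per_min / 1e9

for name, mp, fps in [("Reaper wide-area pod (84 MP, twice a second)", 84, 2),
                      ("ARGUS-IS (1,800 MP, ten times a second)", 1800, 10)]:
    print(f"{name}: ~{raw_rate_gb_per_min(mp, fps):,.0f} GB per minute, raw")
```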
BACN was pursued because everyone saw saturation coming and because the demand for bandwidth and data only kept growing.
Part of the problem is the haystack itself. When 9/11 came, there were about 450 million Internet users and close to one billion mobile connections in use around the globe, sending about 10 billion electronic messages daily, 10 percent of them text messages. By 2014, the planet was closing in on two billion Internet users and the number of mobile connections was estimated at 7.5 billion, with only about 5 percent of them in the United States. Internet use was no longer dominated by people sitting at computers; in most parts of the world, particularly in places like Afghanistan and Iraq, the vast majority of Internet access, including everything from communications to banking, was achieved using smartphones. By 2014, the number of electronic messages sent daily topped 500 billion. In the decade and a half after 9/11, the numbers multiplied many times over, with each development—digital DVDs replacing analog CDs, digital radio and television, high-definition, social media, and people living online—exerting greater and greater demands for bandwidth and presenting an infinite universe of data to be collected.
Everyone, including the custodians and residents of the Data Machine, is now drowning in information. The number of all kinds of manned and unmanned collection platforms tripled in the two years after 9/11 and continued to grow after the Iraq war started, increasing by over 200 percent from the end of the Bush administration until 2012.23 Just in terms of combat flight hours, drones increased from a total of around 22,000 in 2001 to over 550,000 in 2011.24 The demands for intelligence became so great, and the capacity to collect information proliferated so broadly, that by 2013, there were three times as many platforms in Afghanistan as there had been at the height of operations in Iraq, even though the fighting force on the ground was only one-fifth the size of the force that had deployed to Iraq.25
Those in the know describe just the amount of visual data collected every day as five seasons’ worth of every professional football game played—thousands upon thousands of hours. The data moves around the globe multiple times, first for “actionable” purposes, which means in support of an immediate high-value mission. The data then moves to be processed for second-phase and multi-INT exploitation. It then moves to contribute to geospatial products. It then moves to park itself somewhere on the network. And it then moves whenever someone pulses the system, secret Googles that go under names like Stone Ghost, Gemini, and Hercules. On a daily basis, the Data Machine produces hundreds of thousands of reports, many of which require no human intervention whatsoever.
All of this data is now constantly on and fully dynamic and moving from desktops to handheld ROVERs and ginormous video walls in fusion centers, occupying chat, e-mail, and Web services for processors and users all along the way. It is a wholesale change in culture that has quietly taken hold in the military and intelligence communities, one where information—data—came to dominate, where it was seen as key to soldier safety and discriminate warfare. Yet despite the coming end of the big wars in Iraq and Afghanistan, despite the directive to stop buying platforms, and despite the saturation that was affecting movement and storage, no one could seem to find a limit, a point when or where information ended. Years later, when Edward Snowden brought to light the NSA’s infinite collection of signals, the broader impact (and appetite) of the Data Machine was lost in discussions of the legality and privacy of eavesdropping and cyberdata interception. The way the Data Machine itself works also imposes enormous demands of its own, not just the post-9/11 cult of connect the dots and the kill chain perfected, but also the human factors—user friendliness and interactivity that make the machine workable for a generation of digital natives, seamless production values that now mask the drivel of most of the content.
CHAPTER EIGHTEEN
Command Post of the Future
For the king of Uruk-the-Town-Square,
the veil will be parted for the one who picks first;
for Gilgamesh, the king of Uruk-the-Town-Square,
the veil will be parted for the one who picks first.
TABLET II, EPIC OF GILGAMESH
Clustered along the Kabul-Gardez highway in the Char Asiab district and less than fifteen miles south of the city center of Kabul sits the hamlet of Khairabad, a dusty brown collection of wholly unexceptional houses and shops butting up against Afghan hills, the village itself at an altitude of 6,046 feet.
It was near Khairabad in 2005 that four Afghan policemen were killed just before nationwide elections were held. It wasn’t the Taliban or al Qaeda; it was Hizb-i-Islami, literally meaning “Islamic Party,” part political and part insurgent force, an organization that has opposed both the presence of American troops after 9/11 and the Karzai government. The organization is led by its founder, Gulbuddin Hekmatyar, a Sunni Pashtun and mujahideen whose résumé reads: Reagan-era freedom fighter, CIA proxy, prime minister, destroyer of Kabul, Taliban supporter, Taliban foe, al Qaeda affiliate, drug lord, officially designated terrorist, ally of everyone, ally of no one; the most-wanted everything, apparition who drifts across the border into Pakistan, even sometime resident of Iran. Since 9/11, Hekmatyar has added another honorific to his résumé—HVT, or high-value target—and he was the object of the third major CIA drone assassination attempt in May 2002.1 Hizb insurgents were implicated in an attempt on the Afghan president’s life in 2008. The group is also credited with one of the deadliest suicide bombings of 2013, one that killed fifteen, including two soldiers and four civilian contractors.2 Long after US troops are gone, Hekmatyar (or one of his successors) will still be there.
Befitting its perch along the main transportation route to the Khyber Pass, Khairabad’s status is also set in the giant brain of the Data Machine. It is enemy territory and an antigovernment stronghold, a place that the Afghan National Army might seek to pacify or just give up on, another pin on a global map. Home as it was to snipers and bomb-makers, for the United States, every transit through the village in the bad days represented the risk of an attack, if not by rifle or grenade launcher, then with a roadside IED. Hizb and other fighters would attack, blend into the homes and shops, alleyways and ditches, scurrying over rooftops and down tunnels, all with the exceptional guile of guerrillas defending their own turf.
To serve the Machine, Khairabad has also now been transformed into Everybad, a computer model created by a company that is itself virtual, its programmers working in a geographically distributed environment, with most civilian members of the team located in their home offices, modern-day Rosie the Riveters arming the men at the edge for duty in the war zone, here today and maybe gone from Afghanistan but somewhere else tomorrow.
With pinpoint accuracy, the United States has mapped every structure and compound in Khairabad, every mud hut, every wall, every path, and even every large rock. Like a set on a Hollywood studio lot (or yes, like a video game), Khairabad has been rendered, actual mountains and complex terrain of varying elevations mixed in with everything man-made. MetaVR, Inc., the company that built the Khairabad simulation for the US government, started with high-resolution commercial satellite imagery and extensive ground photography, then supplemented those with Internet research and actual intelligence from fights and incidents to precisely model 520 structures that match the footprint of the real village, its tree lines, its tunnel and cave entrances and networks, homes and courtyards, even furnished interiors. Even the crops in the fields are simulated, plants digitally sculpted so that their size and density change with the seasons. The effort is meticulous, and the user experience behind the Khairabad simulation is meant to increase the odds for the United States and the international side. But in the end, the geography is less important than the operating skills. Mastering them is thought applicable to any of thousands of Everybads in scores of places interchangeable within dozens of training scenarios.