by Joel Garreau
From this tumult, a fog of intelligence is emerging. This project is called The Shadow Bowl. This hive, hosted by San Diego State University, is nicknamed “The River.” Very fat pipes connect it to the Supercomputer Center at the University of California at San Diego. What this is about is wiring the Super Bowl for human cognition.
Some of the cables lead up to the roof. There stands another laser cannon. If you precisely aim one laser cannon at another, you can create a beam of conjoined light along which you can transmit anything you can imagine and some things you might not think exist. Lasers have very narrow beams, though. So aligning them is no small trick. It seems Shania Twain’s band—the one that will be playing at halftime and which is now rehearsing on the 50-yard line—is so amped that its bass notes vibrate the light ring. This throws the beams out of whack, breaking the connection. That is why the leader of this enterprise, Dave Warner, is up there dancing on top of that damn fool light ring, re-aiming that laser.
This gathering is actually a sophisticated collection of perhaps a hundred people with biological, chemical, radiological, temperature, weather, motion and video sensors who are attempting to conduct an unusual experiment. They are engaged in an exercise that resonates to that possible outcome of The Curve called The Singularity. They are trying to make an entire multi-square-mile environment intelligent.
In the parking lot outside The River, antennas bristle. Truck-mounted robot uplinks scan the skies with their dishes, looking for their satellites like baby birds searching for their mothers. This is the Super Bowl of January 2003, only 16 months after the 9/11 attacks. The worry is what happens if there is an assault on this biggest secular holiday event of the American calendar. It’s no small concern. Some idiot has allowed a gasoline storage depot to operate just uphill from Qualcomm Stadium. If somebody were to fly a plane into that tank farm, flaming petroleum would head right for the stadium. The ravine of the river is heavily shielded with brush. If someone were to infiltrate it with mortars filled with biological, chemical or radiological weapons, it would be an easy lob to the 50-yard line.
The more you look at the festive stadium from the roof of the nearby garage brain center, the more uneasy you become. There are so many ways to attack those happy, innocent football lovers colorfully garbed as pirates for the game between the Raiders and the Buccaneers, teams named for outlaws of three centuries past. It’s almost too perfect. It gives one a shudder. Overhead, Blackhawk helicopters and jet fighters roam. Somehow this is not comfort-inducing.
The Defense Department, of course, is funding much of this work down by The River. Warner has been a DARPA principal investigator. He carries himself with the swagger of a lifeguard (which he used to be), sporting long, rock-star-quality hair, now receding at the forehead. (The blonde carting in supplies who has a smile so big you can see her molars is Janice Robertson, his girlfriend.) He favors fashionable sunglasses and a cell phone in a hip holder that he perpetually twirls like a six-gun. He has an MD and a PhD and variously describes himself as a cultural engineer and a neuroscientist. He has a tiny but powerful light-emitting diode taped to the bill of his baseball cap, demonstrating how often he has to perform surgery on small, dark pieces of gear. He refers to his funders as “DARPA Vader Ville.”
As far as Defense was concerned, what was being demonstrated here was how it might be possible to recognize a weapon of mass destruction and react to mass casualties. The practical result emerging was something quite different.
There is a uniform one comes to recognize at a gathering of those who are inventing the future. At The River, everyone wears black jeans and black sneakers, out of sheer habit. If you arrange to have little else in your wardrobe, getting up in the morning involves two fewer decisions. The ideal topper for such an ensemble is a black T-shirt. As it happens, Warner has provided those for the two dozen stalwarts at the core of this exercise. High up on the chest of these Shadow Bowl staff shirts there is a symbol of a figure that seems to have archangel wings, surrounded by a ring of palm fronds. It gives the staffers the curiously authoritative look of an intergalactic peacekeeping team. Warner gave careful psychological consideration to these symbols. The message: Don’t mess with the guys in the black T-shirts.
Among this hard core, the significant marks of individuality and identity involve the weird stuff hanging around their belts. People pull out of their fanny packs the most impressive things—an entire socket wrench set, or a knife sufficient for gutting a calf, or an Iridium phone that can connect directly to a satellite. My award for the best status display, however, goes to the fellow with a sling on his belt from which hangs a flashlight with a red lens. That object states that you are so adapted to dark rooms illuminated only by computer screens that when you need to search under the desk for misconnected cables, you view it as unthinkable to ruin your batlike night vision with a beam approaching daylight spectrum. It would harsh your mellow.
None of this, however, is to be mistaken for lack of serious purpose. All manifestations of bleeding-edge technology by definition are demonstrations of the just barely possible. Thus they usually appear ragged and unprepossessing. By the time the future has all its wires carefully tucked away in a nice metal box where you can no longer see the gaffer tape, it is no longer the future. If you had been in Steve Jobs’ garage in 1976, looking at the first mock-up of the Apple personal computer, you might have been forgiven for not seeing in it an agent of massive social change. You might not have looked at it and instantly seen e-mail, much less Google, in your personal future.
Just so, you have to squint a little at this ragtag collection of boys and their toys in San Diego to imagine where all this takes us. But for one weekend in January, what happened was that the boys of the Shadow Bowl for the first time in human history made several square miles of the San Diego River smart. They made the water smart, with sensors making it alert to little biological critters meant to do harm. They made the air smart, full of sensors wary of radiation, chemicals and detonations. They made the dirt smart, sensitive to the movement of would-be attackers. Most important, they imaged all this and ported the intelligence into one place. There blossomed unprecedented simultaneous views of everything that was going on in the area, from the parking lots to the drunk tank to the end zones.
They did this in part to imagine how you’d build a superorganism. How might you rebuild the connections between human and machine if you were to adapt the machines to the human nervous system, rather than the other way around? Dave Warner calls this the “last-millimeter” problem—the stubborn and persistent lack of connection between all that our machines can gather and all that our minds can know.
How would you wire all of the senses that humans come equipped with and make them a seamless part of a network in which the distinction between human and machine blurred? How might you feed information directly to your skin so that you would know whether a potential threat was coming from the left or the right? How might you feed information directly to your ears, which can make fine distinctions that eyes cannot? If something small but bad started to happen, you might instantly hear and recognize a discordance as certainly as you could tell which violin suddenly went out of tune in a philharmonic. How might you use your nose to alert you to critical incoming information by overriding all the other senses as if with a sudden burst of ammonia?
In such a world, the superb human ability to recognize patterns would be an element in a loop that roamed far beyond what is now the human ability to sense. If the human so connected were a fighter pilot or an air traffic controller or a pollution monitor, it would allow her to actually feel, hear and smell tens of thousands of cubic miles of space, alert to discord or opportunity in the music of the spheres. In this fashion, the intelligence of millions of little networked agents would enhance human thought. Augmented perception, this is called, extending our senses out past our skin, giving humans mastery of all they survey and beyond. It is meant to be a qualitative change in what it means to be human, to be enhanced in ways beyond the imagination of any previous generation.
Back at the sponsoring university, Vernor Vinge thinks about the implications of all this. The author of True Names and A Fire Upon the Deep, Vinge is a sweet, unassuming 60-something with a fuzzy fringe of grayish white hair and silver-framed aviator bifocals. His day job, before he decided to write best-sellers full time, was at San Diego State, where he still has an office, as a professor in the Department of Mathematics and Computer Science. It shows. He’s the sort of methodical chap who takes notes of ideas that occur to him while listening to you. He wants to be sure to be systematic about sharing everything with you.
Vinge (rhymes with stingy, which he distinctly is not) in 1993 introduced the idea of The Singularity to describe huge but unpredictable social change driven by The Curve. In a seminal academic paper delivered to a NASA colloquium he wrote, “I argue in this paper that we are on the edge of change comparable to the rise of human life on Earth.” He’s anticipating the possibility of greater-than-human intelligence. He’s talking about some form of transcendence.
As a metaphor for mind-boggling social change, The Singularity has been borrowed from math and physics. In those realms, singularities are the points where everything stops making sense. In math, it is a point where, for example, you are dividing by zero. The result is so whacked out as to be meaningless. Physics has its black holes—points in space so dense that even light cannot escape their horrible gravity. If you were to approach one in a spaceship, you would find that even the laws of physics no longer seemed to function. That’s what a Singularity is like. “At this singularity,” writes Stephen Hawking in A Brief History of Time, “the laws of science and our ability to predict the future would break down.” Another borrowed metaphor is “the event horizon,” the point of no return as you approach a black hole. It is the place beyond which you cannot escape. It is also the point beyond which you cannot see.
Some people think we are approaching such a Singularity—a point where our everyday world stops making sense. They think that’s what happens when The Curve goes almost straight up. The sheer magnitude of each doubling becomes unfathomable.
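The arithmetic behind that unfathomability is easy to check. A minimal sketch, treating The Curve as simple doubling from one arbitrary unit of capability (an illustration, not a claim about any particular technology's actual rate):

```python
# Illustrative only: model "The Curve" as repeated doubling
# starting from one arbitrary unit of capability.
capability = 1
for _ in range(30):
    capability *= 2

print(capability)        # 1073741824 -- thirty doublings, a billionfold gain
print(capability // 2)   # 536870912  -- the final doubling alone added this much,
                         # as much as all the growth that came before it combined
```

That last point is the unsettling one: at every step, the single most recent doubling matches the entire accumulated history of the curve.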
To Vinge that’s actually more than a possibility. He’s gone to great effort to imagine scenarios in which it might not occur. Even though he has a multiple Hugo Award–winning imagination, he hasn’t had much luck. If The Singularity is possible, he doubts it can be prevented. He believes some sort of fundamental transcendence will happen soon. “I’ll be surprised if this event occurs before 2005 or after 2030,” he says.
Vinge makes an analogy to the evolution we know. By long ago learning to do what-ifs in our head, we rapidly surpassed natural evolution. We discovered we could solve many problems thousands of times faster than nature could. Now, with our exploding technology, “by creating the means to execute those simulations at much higher speeds, we are entering a regime as radically different from our human past as we humans are from the lower animals,” he writes.
The critical element of his Singularity scenario is that it is fundamentally out of control. When finally we experience it, he believes, it will be like wildfire: “Developments that before were thought might only happen ‘in a million years’ (if ever) will likely happen in the next century.”
Vinge and others see several ways that greater-than-human intelligence might occur in our prospective lifetimes:
• The Curve drives supercomputers, intentionally or unintentionally, to cross the line to greater-than-human intelligence.
• The Curve drives the Net to interconnect so much power that, intentionally or unintentionally, it wakes up as one superorganism.
• Information industry implants into biological humans produce people with greater-than-human intelligence. (As long ago as the early nineties, Vinge notes, a PhD-level human and a decent workstation—not even connected to the Net—probably could have maxed any IQ test.)
• Biological technology, probably through genetic engineering, produces humans with greater-than-human intelligence.
The Singularity would occur this way. Suppose one of the above scenarios were to happen. Suddenly we find ourselves with an ultra-intelligent critter. Making machines is what humans do real well. So what a greater-than-human-intelligence critter naturally does is start making machines vastly better and more intelligent than humans could. And faster. Much, much faster. These vastly better and more intelligent critters then create even more intelligent critters. And the spiral never ends. This would lead to what Vinge describes as “an intelligence explosion.” In fact, that first ultra-intelligent critter might be the last invention humans ever need make. Or ever are allowed to make. It would be nice if “the machine is docile enough to tell us how to keep it under control,” he writes.
Today all serious discussions regarding the social impact of the coming decades of The Curve start with Vinge’s notion of The Singularity. Some wonder if it is in fact inevitable. For example, Marvin Minsky, MIT’s grand old man of artificial intelligence, says that we are so bad at writing software—it is so laden with bugs—that he believes the first ultra-intelligent machines will be leapingly, screamingly insane. Others wonder whether we sadly underestimate how powerful the human brain is and grossly overestimate how soon The Curve will yield hardware and software that approach it. As you will see in the next chapters, some people see the approach of The Singularity as a force for good, in a scenario I call Heaven. Some people imagine this technology getting into the hands of psychopaths, opening the door to supreme evil, in a scenario I call Hell. Perhaps most intriguingly, some people are looking at a future in which we choose to alter initial conditions leading to The Singularity. They dismiss as mechanistic how we chase the number of transistors we can put on the head of a pin. What they see is humans choosing to refocus on how many connections we can make among the qualities of the human spirit. I call this scenario Prevail.
These critiques obviously take off in radically different directions. Nonetheless, they all grapple with the question of if, how and when we might transcend human nature. It says something about the technology we fear and respect in the early 21st century that when they grapple with such questions, they all see as their starting point Vinge’s notion of The Singularity.
Vinge’s office at San Diego State is dingy, with gray metal shelves, yellowed linoleum tiles, askew venetian blinds, glaring fluorescents and institutional-dirty light blue walls. Sitting there, I say to him this whole Singularity business is all very well and good, and perhaps even logical, but it sounds simply incredible. His eyes squeeze almost shut in his full, easy face when he smiles, which he does a lot. Yes, he knows. It’s not the first time he’s heard that objection. Almost by definition, disbelief accompanies any notion that all the rules that humans have known for millennia might soon blow up. “Some economists have been playing with this quite a bit,” Vinge says. He has one paper from a scholar at UCLA that discusses, with equations, the consequences of The Curve going straight up. Even he has trouble buying that. “Anyone who talks about vertical asymptotes in terms of trend-line projections has some hard explaining to do,” he says.
More credible, he thinks, is his version of The Singularity. He doesn’t believe change has to become almost infinite for The Singularity to occur. “Just getting applications that are good enough to support superhuman intelligence” would trigger The Singularity, he believes.
I ask for a copy of the economist’s paper. “Yes, that’s easy for me to find; I just have to go to my extended memory here,” he says, reaching for his computer keyboard, smiling at his little cyborg joke about his superhuman intelligence.
I tell him about Belle, DARPA’s owl monkey with her brain connected to computers. “The first story I ever wrote that sold is about a preliminary attempt at intelligence upgrading and they did it with a chimpanzee,” he replies. He sees fiction as scenarios, written vividly.
Vinge is made of stern stuff. Although the other side of a singularity is theoretically unimaginable, that hasn’t prevented him from trying. He hopes for a “soft takeoff” of The Singularity, since he dreads what a “hard takeoff” might feel like, and doesn’t know whether it would be safe for humans.
“The purest scary version” of the hard takeoff, he says, “is just if you had an arms race to get to The Singularity. A hard takeoff is where the whole transition from a situation where people are still talking like we’re talking, about the plausibility and implausibility of it all—the transition from that to things being incontrovertibly strange is 100 hours. It’s essentially like dealing with an avalanche. It’s not something that you can talk about planning for. It’s not really a plannable thing.
“In a nightmare scenario, it would be part of an arms race. Suppose you have two national forces that are going after this the way the Americans went after the A-bomb. Superhuman intelligence—it’s the ultimate weapon. Now, they might not actually even think of it necessarily that way. What they might want, say, is something that can really monitor the Net. You talk about your intelligence problem.”