“You’re First Sergeant Sims?” she asked.
“Nah, I’m the good-looking one,” said Bunny. “Master Sergeant Harvey Rabbit.”
“I saw that on the letter my captain gave me. Your name is really Harvey Rabbit?”
“Yup. My dad was kind of an asshole.”
“Sounds like it.”
“Call me Bunny,” he said, and offered his hand. She took his hand, gave it a single, firm pump. Bunny pointed to the crab house. “Sims is in there. Call him Top. He thinks he’s in charge, but it’s all me.”
“Why don’t I believe you?”
He grinned, and they began walking toward the restaurant.
That’s when they saw the blood.
And heard the screams.
CHAPTER TWENTY-EIGHT
IN FLIGHT
“I don’t suppose you know anything about this nanotech stuff?” I said to Rudy.
He brooded into the depths of his wineglass for a moment. “It’s not really my field, of course, but I’ve read quite a bit about the medical uses, and I had some conversations with Bill Hu a while back. He was quite fascinated with its potential in espionage and field support. He was far less optimistic about anyone’s ability to regulate it once the technology was perfected and made readily available.”
“Like drones,” I said sourly.
“Like drones. Hu expressed valid concerns about drones even before the Predator One case, and look how that turned out. Part of Hu’s genius was to be able to look at a new or emerging aspect of science and follow certain lines of thought to imagine what dreadful uses it could be put to. In that, he was well suited to the DMS. Call it a kind of strategic paranoia. Predictive rather than merely alarmist.”
“Good thing he was on our side.”
“Very good thing,” Rudy said. “Hu had concerns about nanotech and, pardon the bad pun, infected me with his unease.”
“Really? You don’t seem particularly paranoid, Rude.”
“I hide it well. My glass eye confuses any attempt to read my expression.” Rudy had lost an eye in a helicopter crash in Baltimore, and one of Church’s friends had replaced the damaged one with a superb fake. It didn’t give Rudy telescoping vision or any cool Six Million Dollar Man stuff, but it moved like the real one and was synced so that the fake pupil dilated in harmony. He looked at me with those eyes, both as dark as polished coal. “The science of nanotechnology was introduced by the late Nobel physicist Richard Feynman back in 1959. He was giving a dinner talk and said something to the effect that there was ‘plenty of room at the bottom,’ meaning that industrial and mechanical expansion was moving solidly outward but that there was potential for miniaturization. He proposed using delicate machinery to make even more delicate—and smaller—machines, and those would, in turn, make smaller ones, and so on. It was his belief that machines could be miniaturized down to the atomic level. He saw no practical reason why it couldn’t be done and felt that such a technological path was inevitable. He postulated manipulation of matter at the atomic, molecular, and supramolecular levels.”
I nodded. “Didn’t he make some joke about swallowing a doctor?”
“He did, though Feynman credited that concept to his friend and graduate student Albert Hibbs. It was a reference to the possibility of one day constructing a surgical robot so small that it could be swallowed. The construction of such a thing could not be done by human hands, so it involved having machines build smaller and smaller versions of themselves, all the way down to the molecular level. And, Joe, this idea was actually anticipated by the science-fiction writer Robert Heinlein in his story ‘Waldo,’ which was published in the early forties. Oddly, though, Feynman’s ideas weren’t acted on for quite a long time. Not until the mid-1980s, actually, when MIT engineering graduate K. Eric Drexler published his book Engines of Creation. That really ignited a more intense worldwide interest in nanotechnology. Since then, nanotech and nanites have become words in common usage.”
I grunted. “So this isn’t really something new?”
“In concept? No. In practical application, yes. Knowing that we wanted to build microminiaturized surgical or medical machines is one thing, but having the technology to do it is another. It took three-quarters of a century to bring us to where we are now and, from what Bill Hu told me, we have a long way to go.”
“Apparently not, or we wouldn’t be on this plane.”
Rudy sighed. “That’s what alarms me. About this and so many other things we encounter. We believe that we grasp where science is on a topic and then we’re faced with either an unforeseen practical application or a leap forward in development. Sadly, that’s something we’ll be seeing more and more often, Joe. It’s not only that the products of science are getting smaller but the equipment to develop and manufacture these things has become less cumbersome and more affordable. Laptop computers, portable laboratory setups, even mass production relies on compact, transportable, and affordable hardware.” He lowered his voice. “The DMS has faced as many small, mobile terrorist cells as it has large-scale organizations. With nanotechnology there is no conceptual zero point for miniaturization. Especially with machines that are designed to make smaller machines. That’s a cycle that will continue down to the subatomic level. When you include mass replication as a function, well…”
“You’re scaring me to death,” I told him.
The plane hit an air pocket and dropped so suddenly that everyone yelped. Rudy’s face went from a medium tan to the color of old paste. “Now we’re both scared.” I don’t like skydiving, but Rudy hates to fly at all. The cabin steward must have seen his face, and she materialized with a fresh glass of wine. When in doubt, anesthetize the customers.
Rudy drank about half of it, sighed, drank some more, then nodded and continued where he had left off. “We like to talk about being on the cutting edge of science, Cowboy, but in truth we don’t know where the cutting edge actually is, because it changes every day. We can be at the forefront one minute and then that boundary can be pushed far beyond our current reach in an instant, and that’s as true for viral and bacterial mutations as it is for computer software, robotics, and nanotech.”
I glared down into my whiskey. “You are harshing my buzz.”
“You should watch one of the YouTube lectures by John the Revelator. He’ll sober you up in a heartbeat.”
“Who in the wide blue fuck is John the Revelator? Some kind of evangelist?”
“Of a kind. He talks about the coming technological singularity. You know what that is?”
“Sure. Skynet becomes self-aware and launches the nukes. I’ve seen Terminator, like, eighty times.”
“It’s John’s opinion that movies of that kind are like the nuclear-apocalypse novels of the fifties and sixties, that they’re predictive and cautionary tales. The difference is that John thinks we should embrace this event.”
“Why? Is he an android?”
“No. He’s flesh and blood. I’ve met him. He did a book signing at Mysterious Galaxy books in San Diego. Very well-spoken man, and although we’d never met before, I couldn’t shake the feeling that we had. He has that kind of charisma. He says that, because this event is inevitable, at some point in the not-too-distant future it’s in our best interests to accept it and attempt to curate it.”
“‘Curate’? How? By polishing the boots of the robot overlords?”
“No. Though he has made a joke to that effect on a few occasions.” The plane continued to bump, and Rudy finished his glass. By the time we switched planes in Atlanta, I was going to have to carry him. “John says that we have to develop artificial intelligence, and to allow it to grow at its own pace, which he says will be exponential and dramatic. But, at the same time, we have to build in safeguards that will always allow a select few to control the overall system without interfering with the rate of AI growth.”
“Don’t we already do that?”
“Sadly, no. Hu told me that there are quite a few research groups that haven’t imposed any controls on AI development for fear of limiting or interfering with the development of true computer intelligence.”
I raised my empty glass. “All in favor of not doing that, say ‘Aye.’”
“John would agree. He says that we should always be in control.”
“Not sure how much I like that ‘chosen few’ part of it, though, Rude.”
“Nor I,” admitted Rudy. “Unfortunately, the signing was too crowded to allow me to pursue the topic with him and find out what he meant. I wonder if Dr. Acharya would know. He was the one who first told me about John the Revelator. Did I understand correctly that he’s not available?”
“He’s in Washington State at the DARPA thing,” I said.
“What DARPA thing?”
“They’re testing new versions of BigDog.”
Rudy looked blank, so I explained that there was a robotics testing facility in the middle of no-damn-where in the big timber country of the Pacific Northwest. DARPA had been working on building the next generation of tactical-use robots. BigDog was a dynamically stable quadruped robot developed in 2005 by the guys at Boston Dynamics, along with the NASA Jet Propulsion Lab, and some guys from the Harvard University Concord Field Station. I said, “BigDog doesn’t actually look like a dog. No fake hair or lolling tongue or any of that. Doesn’t beg for treats or pee on the rug. It has four legs and a bulky utilitarian body. Kind of like a big robot mastiff without a head.”
“Charming,” said Rudy sourly. “I seem to remember seeing something about that on YouTube a few years ago. Someone kicked it.”
“Yeah. That was an early prototype. They showed it walking, climbing steps, running. They also showed some guy kicking it to knock it over so they could demonstrate how it regained balance. People freaked, though.”
“Well, they would,” he said, nodding. “We rush to anthropomorphize everything. When the robot dog stumbled, it connected the viewers with our memories of dogs we’ve known being injured. If it was actually destroyed, say by being hit by a car, people would be appalled. They’d grieve as if it was real.” He cut me a look. “How did you react?”
I shrugged. “It’s only a machine and—”
“Joe…”
“Okay, it pissed me off, too.”
“Even though you know it has no consciousness and no feelings?”
“Sure. I never claimed to be less crazy than other people.”
“It’s not an issue of sanity, Cowboy,” he said. “It’s a quality of our nature as compassionate beings who are genetically and culturally hardwired to nurture and protect.”
“Still only a machine,” I said.
“Do you think the people who posted the video were unaware of how the public would react?”
“Jury’s out on that. Scientists can be detached in weird ways. And sometimes they can be malicious little pricks.”
Rudy gave me a fond and tolerant smile, which I chose to ignore.
I said, “The Marines originally wanted BigDog as a pack animal, but the first versions they field-tested were too loud. So DARPA partnered with some new brainiacs to build the next generation of robot dogs, calling them WarDogs. Supposed to be bigger, faster, tougher, and quieter. Some of them will be four-legged portable machine-gun nests, which is a little cool and a lot scary. I mean, think about having something like that come creeping up to your foxhole or into your camp and then opening up with a .50 caliber? Holy shit.”
“That is truly frightening,” Rudy said, and actually shivered.
“It’s all being developed under the project heading of ‘Havoc.’”
“‘Cry Havoc and let slip the dogs of war’?” recited Rudy. “That’s a little on the nose for a project name, isn’t it?”
“Somebody probably got wood in his shorts when he thought it up and couldn’t bring himself to switch to anything less obvious.”
The plane continued to bounce around the ether, and Rudy kept drinking. I cut him a look.
“You’re making a face,” I said.
“A face? No, I’m not. What kind of face?”
“Of disapproval. You have something against robots?”
“On the whole?” said Rudy thoughtfully. “No. Rescue robots are saving lives all over the globe, and exploration robots will push back the boundaries of our knowledge without endangering human lives. But in combat? In that regard, I’m much less enthusiastic.”
“The military’s working on a mobile surgical robot, too,” I said. “To assist medics with emergency field surgery.”
“That’s less frightening,” he said. “Surgical robots have been in use for years. They used a variation of the da Vinci surgical suite on me.” He touched his knee. “Superbly fine-tuned. I highly approve of that kind of use. Are any of them autonomous?”
“To a degree,” I said, and I knew where he was going with this. A couple of years ago we had that whole mess with the drones, autonomous-drive systems, and hacked GPS controls. It left a very sour taste in all our mouths. And it made us very, very afraid of all the ways in which that technology could be misused. “We’ve wandered away from nanotech and we’re both half drunk. Fuck. It’s inconvenient as balls to have Dr. Acharya out of touch. They’re not even allowing Internet or cell phones except during emergencies.”
“Doesn’t this thing in Baltimore qualify?”
“That,” I said, “is what we’re flying home to establish.”
CHAPTER TWENTY-NINE
JOHN THE REVELATOR
GEORGE WARREN BROWN HALL
WASHINGTON UNIVERSITY IN ST. LOUIS
ST. LOUIS, MISSOURI
SIX WEEKS AGO
The lights came up and John glanced at his watch and then addressed the audience.
“I think we have time for a few questions,” he said. “If anyone—”
Dozens of hands shot into the air.
“Ah,” he said, and chuckled warmly. “Well, maybe we won’t have time to answer all your questions, but let’s start with you. Yes, in the third row, with the beard.”
A young man stood up, looking awkward and unkempt in faded jeans and an ancient UCLA sweatshirt. “Thanks, sir,” he said. “I enjoyed your presentation. My question, though, is about the socioeconomic effect of a curated technological singularity. One of the things you’ve talked about—here and online—is how the next generation of machines is already being built to improve farming, manufacturing, and so on. I get that this is an inevitable advancement, but what about the people who currently work in those fields? If an autonomous-drive combine harvester can operate twenty-four hours a day and in all weather conditions, won’t each eight-hour increment remove the potential for a farmer or farmworker to earn a day’s wage?”
Every eye shifted from the young man to the speaker, and he smiled at the audience.
“Yes,” said John, “that’s what it will mean.”
“So what happens to those farmworkers?”
The speaker walked slowly to the edge of the stage and stood looking down at the young man.
“Let me turn that around and ask you that question. What will become of those workers?”
“They’ll be out of a job.”
“Yes. And…?”
“If they can’t find other work, then they’ll have to get government assistance,” said the young man. “Which puts a drain on the system.”
“That is very true.”
“But those people need to have a chance to survive.”
John’s smile never flickered. “Do they?”
The young man blinked at him. “Of course they do.”
“Why?”
“Why not? They’re people. This is America. They deserve to—”
“Slow down a bit,” said John, patting the air as if pushing the outrage back into manageable shape. “Don’t explode; calm down. This is a forum for understanding, so let’s discuss this objectively, okay? Can you do that? Good. Now, give me the pro and con on it. What is the basis for your argument that people deserve survival?”
“Um … because they’re people?”
John shook his head. “No. That’s a subjective view. It’s sentimental.”
“It’s humanist,” countered the young man.
“It’s compassionate, I’ll grant, but how does that factor into the dynamics of evolution?” He held up a hand before the young man could reply. “Darwin only understood part of the process of evolution. He looked at it in the natural world, and correctly judged that survival of the fittest was a biological imperative, and he separated that from human desire and the influence of politics or religion. Scholars and historians have since expanded on Darwin to explain everything from religious evangelism to conquest to eminent domain. However, too often they failed to understand that having conscious self-awareness has created a different kind of evolution. Call it curated evolution. An attempt to force an evolution of a certain element or elements of a group even when biology isn’t necessarily a willing accomplice.”
“Like what, eugenics?” demanded the young man.
“Sure, eugenics is part of it, but so is gerrymandering, so is the spread of religion through attrition, so is ethnic genocide, so are social caste systems and economic class distinctions, so is war, so is crime. I could go on and on. Human history is an index of attempts to curate the human species. We do it in thousands of ways, large and small. Some of it is reprehensible, of course. Hitler attempted a systematic genocide of Jews, Pol Pot and the Khmer Rouge exterminated twenty-five percent of the Cambodian population, there’s the Ba’ath Party’s systematic extermination of the Kurds, the Rwandan slaughter of a million Tutsi by the Hutu, and other atrocities. These acts were not only morally wrong—and here I’m momentarily agreeing with your humanism—but also inefficient. Totalitarian governments such as the People’s Republic of China persecuted intellectuals, killing or imprisoning visionary thinkers, because visionaries cannot abide pervasive oppression. The Chinese wanted obedience without opposition. They don’t want a better way or a brighter future, because that would involve change, and they like their militant-regime structure. It suits small minds. They hide behind the concept of the ‘dictatorship of the masses’ without understanding the long game implied in that.”