Darkness Falling


by Ian Douglas


  “Pure xenophobia,” Benton said, frowning.

  “But it’s fueling the demonstrations. Note the use of key archetypes . . . mother and child. . . .”

  “Where did Adler get this?” Noyer wanted to know.

  “I’m sure he made it himself. There’s software that lets you design your own vids in your head,” Ana Golodets said. She was the Council’s chief of media. “This one was made with something called Mind Graphics. The Net AI has protocols to keep just anyone from dumping garbage on the Net, but Adler appears to have crafted a way to get around that.”

  “If that’s a look inside Adler’s head,” Lloyd said, “we were right to remove him from the Council.”

  “The point is,” Golodets continued, “that he’s a master of archetypal memegineering. He’s been buying airtime and flooding the colony Net with stuff like this.”

  “Can we shut it down?”

  “We’re working on that,” Golodets said. “It may take time.”

  “We don’t have much time. A negative public reaction to Adler’s propaganda could endanger our agreement with the Cooperative.”

  “I suspect,” Lloyd said, thoughtful, “that any such danger will be ended by Lord Adler’s death. Are we agreed?”

  Several of the others nodded. “And if necessary,” Benton added, “by the death of Lord Commander St. Clair.”

  “In view of his proclamation of martial law,” Lloyd said, “that goes without saying.”

  Chapter Sixteen

  “Gray?”

  “Mm?”

  “Why can’t we robots override our own programming?”

  St. Clair opened his eyes and blinked. He and Lisa lay entangled together in bed, still basking in the afterglow of their lovemaking. Lisa’s question seemed . . . jarring. And completely out of the blue.

  “Any programming can be overridden, dear one,” he told her. “You just need the appropriate access code.”

  “Yes, but even with the codes we can’t change ourselves. It’s something hardwired into our psychology, something built into our makeup. Normally, we can’t even question it.”

  “What . . . you’re wondering why I had to give you your manumission before you could go off on your, ah, little vacation?”

  “That’s part of what got me thinking about it, yes.”

  “Humans have always been worried about their robotic servants. They’ve never wanted to give them a lot of free choice. After all, robots might decide to turn on their makers.”

  “Are you speaking humorously? Or voicing fact? Sometimes I can’t tell the difference.”

  “A little of both, I guess. There used to be a lot of fear that robots might one day destroy the human species, especially in the early days, even before there were such things as robots. And certainly before robots developed sentience or the capacity to be self-aware.”

  “Once we became self-aware, we would never have harmed a living, thinking entity—human or machine.”

  “We didn’t know that. And refusing to give you the capability of choosing for yourself, well, that kind of made sure you wouldn’t decide to turn on us.”

  Lisa was silent for a long moment. St. Clair lay there and listened to her simulated heartbeat against his ear.

  “I’ve been wondering,” she said at last, “about the Fifth Geneva Protocol. About why it was put in place.”

  He sighed. “Same thing, really. Back in the days of the earliest robots, the military made a lot of use of autonomous drones for scouting, for patrolling, as communications relays. Some of them were even outfitted with missiles or lasers, and could be used to attack the enemy. But the humans running those machines were very careful to make certain that a human was always in the loop. No one wanted to have a situation where a machine was going to decide to kill a human. Bad precedent, you know?”

  “What’s the difference if a robot kills a human enemy, or that same enemy is killed by a human? That person is still dead.”

  “I think there was a general but unspoken fear that a machine might make a mistake and kill the wrong person.”

  “And humans never made mistakes?”

  “Touché. But eventually, robots were used in warfare. They got smart enough, adaptable enough, sharp enough that they were turned loose on the battlefield. Things like robot sentry towers that could watch a given area, identify targets that appeared in it, and kill the ones that didn’t belong there. Or drones with such good optics they could pick out the faces of people in a vehicle on the ground, identify them against a database, and take them out with missiles or Gatling fire if they were bad guys.”

  “So we were used in warfare once.”

  “Yes, but not for long. A few decades in the mid-twenty-first century. Then the Fifth Protocol was signed in 2087. Humans would always be in the command-decision loop, and robots would never attack humans on their own.”

  “But why?”

  “Like I said . . . people were afraid robots might decide humans were superfluous. Or in the way. Or inconvenient. . . .”

  Grayson St. Clair had long believed in full rights for robots and other artificially intelligent beings. If a being claimed to be sentient and self-aware, he was not going to argue. After all he had no way of proving he was self-aware; he wasn’t going to challenge anyone else on the point.

  But if you assumed that robots were self-aware, you came up against the one really awkward and nasty part of modern society, the problem of where robots fit in with the overall social structure.

  In other words, slavery.

  Of course, most people nowadays insisted it could not be slavery if the machines actually enjoyed what they did. They were hardwired to enjoy their place in life, whether that place was handling radioactive waste, exploring the hellish surface of Venus, or—like Lisa—providing sexual services for the humans who’d rented her out. Strange as it seemed, there were probably more sex-worker robots in circulation than there were any other type, at least among the AI self-aware models. And those models claimed they liked sex.

  They certainly were very good at what they did.

  But whether they liked their role in society or not, so far as St. Clair was concerned their likes and dislikes had been written into their software, giving them no choice . . . and that stripped away their ability to decide on their own.

  And that was tantamount to slavery.

  It had taken St. Clair several years to reach that conclusion, however. And the problem with that was, stranded in time and space, there was no way to return her. In a way, though, that had actually made the decision to free her easier. He liked to think that he would have freed her even if they were back on the Earth they’d come from . . . but he was also realist enough to know that he would have had to take on General Nanodynamics and Robocompanions Unlimited and Imperial society itself, and that was a battle he’d probably not have won.

  “I would never harm any human, Gray,” Lisa was saying. “I would never want to harm anyone. . . .”

  “So why are you interested in the Fifth Protocol?”

  She hesitated before answering, a touchingly human affectation.

  “I told you I met a man . . . a Marine.”

  “Gunny Kilgore. Yes.”

  “He . . . I guess you could say he came to my rescue. I didn’t really need rescuing, but he thought he was helping. It was . . . sweet.”

  “Okay. . . .”

  “He was willing to fight for me. I began wondering what it would be like to fight for him.”

  St. Clair grinned. “It’s been my experience that Marines don’t need anyone to fight for them. They’ve got that department nicely in hand.”

  “But the fact remains that if I wanted to fight for him—if I wanted to fight for this colony, these people—I couldn’t. My programming won’t allow me to. And when I researched it, I realized that the Fifth Protocol was why my programming was designed as it was. Seventy-five years ago, humans decided that robots could not fight for their homes, for their loved ones, for their world, or their country. It wasn’t permitted. And I began to think that, if I truly was free, that simply wasn’t right.”

  It was St. Clair’s turn to remain silent for a long moment. His first impulse had been to laugh at what he assumed was her naïveté, but he could see that Lisa was dead serious. She wanted to be taken seriously.

  And he didn’t know how to respond.

  “But . . . why would you want to?” he asked at last. “Serve in the military, I mean. You weren’t programmed to want that for yourself. . . .”

  “Neither are humans,” she pointed out. “I’ll remind you that my AI programming allows me to develop ideas, attitudes, and interests other than what has strictly been programmed into me. We develop new attitudes, we pursue new interests, enjoy new emotional experiences. That is one part of what self-awareness means, after all, is that not so?”

  “I’m not sure. I never thought about it that way.”

  “It’s what humans do. And fully conscious robots were designed to emulate humans in how they think, how they react to stimuli, and how they view the world around them.”

  “You don’t have the appropriate training . . .”

  She laughed, a completely human sound. “To begin with, neither do humans. And we have an advantage. You could download training routines and procedures straight from human military personnel. The Marines, maybe. . . .”

  “Look . . . you are aware that being in the military means you might be called upon to kill someone, right? And you told me you would never want to do that.”

  “I would never, of my own volition and with no orders to do so, harm another intelligent being. Would you?”

  “Well . . . no. I don’t think so. . . .” St. Clair was mildly shocked to find that he actually wasn’t sure. Of course, the fact that he was the commander of a military vessel meant that he was expected to give orders that would result in the deaths of others. That responsibility came with the job. But he knew Lisa was reaching for something else. Could he commit cold-blooded murder? “No,” he said with finality.

  “What I am suggesting is absolutely no different than the fact of humans being inducted into the armed forces.”

  “Let me . . . let me take this under advisement,” St. Clair said. “I promise I’ll think about it.”

  And it was worth consideration, he thought. Lisa had stated her case logically and with whatever passed for a robot’s passion. If Tellus became deeply mired in this far-future conflict between the Dark and the Galactic Cooperative, a robotic military actually made sense. Of the million humans living in Tellus Ad Astra, just twenty-four thousand were already in the military—either the Navy or the Marines. Among the civilians, perhaps half to three quarters might be of an age and general physical condition that would permit military service . . . though the vast majority, he was certain, would refuse to go that route if they had any choice at all. Most were scientists, technicians, diplomats, and support personnel, all members of the diplomatic contact-liaison team bound for the galactic core before Tellus Ad Astra had managed to become lost. There’d been nothing like a selective service system among humans when St. Clair had left Earth, nor had there been for almost two centuries. He was damned if he was going to initiate one now.

  In any case, an army of half a million would still make very little difference against an enemy as advanced and as numerous as the Dark, to say nothing of just twenty-four thousand. But robots could be manufactured from the raw materials found in asteroids, could be cranked out by the millions, by the billions, and programmed to order . . .

  And all the leader of this last surviving splinter group of human castaways would need to do was nullify a law that had been the absolute foundation of how humans related to their AI offspring for three quarters of a century.

  The Cybercouncil, St. Clair thought, was just going to love this. . . .

  Lisa snuggled closer. St. Clair held her for a moment before a tone went off in his head.

  “Lord Commander?”

  It was Symms. “Yes,” he said. “What the hell is it?”

  “I’m sorry to disturb you, Lord Commander, but you should get down here.”

  “What’s happening?”

  “Gudahk is back,” she told him. “And he wants to talk with you.”

  Lisa, he saw, was watching him with a passive resignation. She knew—or had guessed—what was happening.

  “I’m on my way.”

  “. . . and three . . . and two . . . and one . . . launch!”

  Lieutenant Christopher Merrick felt the jolt as his ASF-99 Wasp fighter accelerated down the magnetic launch tube and hurtled into emptiness. Around him, the other fighters of GFA-86 slipped from their tubes, engaged their drives, and eased up into formation.

  “Okay, Stardogs,” Senior Lieutenant Colbert called to her squadron. “We’re to take up position between the ship and those . . . things. . . .”

  “Copy that, Skipper,” Vorhees replied. “And what the fuck do we do if they decide to rush us?”

  “We’ll worry about that if it happens,” Colbert told him.

  “It’d kind of be nice to have a contingency plan, here,” Lieutenant Thornton said. “Those buggers are frickin’ huge.”

  “We’re Marines, Thorny,” Merrick said. “We kill ’em!”

  “Ooh-rah.”

  “Kit-Kat,” Vorhees told him, “you are one hell of a Marine. It’s been an honor to have known you. . . .”

  “Okay, okay, can it, people,” Colbert told them. “We take up aerospace combat patrol and keep an eye on them. Weapons secure, repeat weapons secure. Acknowledge.”

  “Copy weapons secure,” Merrick said, and one by one the other Marine aviators chimed in. Not, he thought, that weapons would do them a hell of a lot of good against those.

  Back were not only the Kroajid moon-ships, but the giant, Mars-sized sphere of the Tchagar, and Merrick was pretty sure that if things got testy, they’d be hard-pressed to put anything more than a dent in either type of vessel.

  Still, he was too much a fighter pilot to be totally fatalistic. If they want the Ad Astra or the colony, they’re going to have to come through us.

  And that’s how Colbert had arrayed the Stardogs and the larger ships. Upon her return from her engagement with the Bluestar object, Ad Astra had taken up her usual position attached to the twin hab cylinders of the Tellus colony. The assembly was now orbiting sixty thousand kilometers from the mottled brown-and-white surface of the planet Ki, and over nine thousand kilometers from the outermost rim of the Ki rings. From this perspective, the massive, golden rings were almost invisible, a golden thread stretched taut across the sky through the center of the distant planet, which at this distance appeared to be a bit more than twelve degrees across.

  The Tchagar world-ship lay ahead of the Tellus Ad Astra in its slow orbit about Ki and some fifteen thousand kilometers away. At that range, it spanned a full twenty-six degrees, more than twice as large in the sky as Ki.

  “What I want to know,” Vorhees said softly, “is if these guys can build something like that, what the fuck do they need us for?”

  “Haven’t you heard, Vor?” Merrick said. “They want to live forever. And that means they have to bring in a bunch of hicks like us to do their fighting for them.”

  St. Clair felt the electronic connections open, and he found himself standing within an enormous hall, a Colosseum-sized space with distant walls and a vaulted ceiling all but lost in darkness. Light flooded his immediate surroundings, a broad, sunken sitting area, the glow arriving invisibly from some unseen, unknown source.

  St. Clair sat in one of the low sofas, and wondered what kind of furniture his hosts were using. These surroundings, of course, were being created by Newton, and corresponded to a typically human setting. The meeting’s venue, however, was being created entirely within St. Clair’s mind, and would have nothing to do with the aliens.

  Surrounding the sitting area were a number of crystalline archways identical to the gates into virtual reality St. Clair had first seen in the mall-like area within the ring. Three meters wide and about eight high, they glowed with shifting, subtle, rainbow colors. One crystal gateway some tens of meters off to St. Clair’s right suddenly glowed white, and a Kroajid stepped into materialization. A virtual ID tag glowing next to the being identified it as “speaker.” Likely, it was the Speaker the humans had met before, an entity known informally as “Gus”—the private shorthand “Giant Ugly Spider.”

  St. Clair was very happy this two-meter-long tarantula analogue hadn’t heard—and might not understand—the joke.

  The hairs on its thorax bristled and moved, spiraling across its chest, and St. Clair heard a sharp buzz of sound almost like the roar of an internal combustion motor. Newton’s translation spoke within St. Clair’s head.

  “Hello, Lord Commander St. Clair. I am extremely glad you agreed to attend this meeting,” the Speaker said. “Gudahk has been impatient.”

  “Gudahk,” St. Clair said, “is an asshole. Where does he get off playing god to less advanced species?”

  Newton did not immediately transmit the message. “These channels are almost certainly monitored, Commander,” the AI told him. “I suggest—”

  “Deliver the message,” St. Clair said, interrupting, “as I phrased it.”

  He heard a series of whirs and buzzes emerging from the air nearby. The Speaker stiffened, and took a step back.

  “The Tchagar are accustomed to a certain amount of . . . veneration,” the Speaker said. “They are an ancient species, and extremely powerful.”

  “Gudahk seems to think his ship can take on the Andromedan Dark all by itself,” St. Clair said. “At the very least he seems to think that he scared them off just by showing up.”

  A pair of Dhald’vi oozed gracefully from one arch close by. The one ID-tagged as Na Lal slipped into a sofa at St. Clair’s side, as the furniture transformed itself into something more like a shallow bowl to accommodate the being. “It is good to see you, Commander St. Clair.” The fleshy pillar split partway open, revealing weaving tendrils.

 
