The Hot Gate - [Troy Rising 03]

by John Ringo


  It helped if there was light, but there was some coming in from a tear in the bulkhead. And the emergency lights, although some of them were blown out.

  “Suit... Lights ...” Velasquez muttered. He must be really drunk. It felt like the room was spinning.

  “Vel! Vel! VEL! DIEGO!”

  “Stop shouting...” Velasquez muttered, bringing up his suit lights.

  “I need power! Look outside!”

  Velasquez shook his head inside his helmet, then started to process.

  The reason it felt like the room was spinning was that what was left of the shuttle had a significant rotation. Probably ten rotations per minute. He knew this not because his instruments were telling him—there weren’t any instruments—but because the front half of the shuttle had been sheared off. He should be dead. Apparently the console had caught most of the damage. He’d seen stars because the firmament was whipping by every rotation. He could see it with his plain eyes.

  He could also see that whatever had started the rotation, or perhaps continued power on the engines, had them headed for a big... ship? Piece of a ship? It didn’t matter. Their velocity was at least a hundred kilometers per hour. And the scrap was close enough it was occluding the stars on every rotation. He could hear the countdown in his head.

  “Twenty-seven, twenty-six...”

  “Are you counting?”

  “Yes,” Perez said. “We don’t have enough power in our navpaks to avoid it, either. I’ve done the math. We need power. Now!”

  Velasquez unhooked his safety belt, hooked off a line and, holding onto his seat, leaned over and opened up the main breaker box. Which was trashed. Three of the four relays were melted and the main breaker didn’t look much better. The hatch came off in his hand.

  “This isn’t going to do it,” he muttered, tossing the hatch out into space. He cycled the main breaker by hand.

  “Twenty-three... Whatever you’re going to do, do it fast... twenty...”

  “I’ve got no main breaker,” Velasquez said, desperately. “We don’t have anything!”

  “I can feel the engines,” Perez said.

  “You feel the power plant,” Diego said, then paused, looking at the crowbar. “The problem’s getting the power to the drives. How long do you need power?”

  “IF we have compensators... point three seconds of drive,” Perez said. “Say another two to get the systems up. Couple for me to figure out which way to go when it comes up.”

  “So... five?”

  “Fourteen... Yeah...!”

  Diego climbed to the toolbox, ripping off the crowbar in the process. With one hand on the inside of the tool compartment, both boots locked down, he inserted the crowbar into a sealed seam and heaved.

  “What are you doing?” Perez said. “Ten... nine...”

  “Get ready for power,” Diego said, bracing his back on the command console. He clamped the crowbar to first one boot then the other. “You’ll only power straight forward.”

  Then he slammed the crowbar into the superconductor junctions.

  * * * *

  It was the Significance of the Crowbar. The crowbar, like duct tape, had a thousand and one uses. Getting a stuck relay out of its cradle. Banging on the troop door lowering motor until it worked. Getting a stuck crew out of the command compartment.

  But this was the true Significance of the Crowbar. The reason it resided in its precise spot.

  A steel crowbar would never survive the full energy generated by the main power plant. However, there was a secondary system, part of the inertial controls, that only pushed a few megawatts. That a crowbar could survive. For a few seconds. And the relay for it was at the precise angle and position that if you jammed the flat end of a standard steel crowbar into it, the curved end would drop into the main engine relay precisely.

  Thus, if you lost main power due to the primary breaker freezing, blowing or being hit by a micrometeorite, you could get some power for maneuvering.

  If someone was crazy enough to jam a crowbar into a twenty megawatt junction.

  * * * *

  “What the HELL did you do, Pal!” Dana said, flipping herself into the shuttle and landing on two points.

  “I have done nothing, EM,” Palencia said, coming to his feet. He’d been bent over one of the compensator systems. He looked worn out. “Except my duty.”

  “The scuttlebutt is that this is sabotage,” Dana said, her hands on her hips. “Pretty good scuttlebutt. I know I didn’t do it! And I’m pretty sure that Velasquez didn’t. So where is it your duty to sabotage our boats? Is this another God damned plot by your—”

  “Calm down, Dana,” Granadica interjected.

  “Calm down?” Dana screamed. “My engineer is in the God damned hospital in a coma!”

  “And... he put himself there,” Granadica said. “EM Palencia was not the source of the sabotage. EN Velasquez was.”

  “What?” Palencia and Dana said simultaneously. They looked at each other for a moment, sheepishly.

  “Velasquez?” Palencia said.

  “That doesn’t make any sense,” Dana said.

  “It doesn’t, does it?” Granadica said. “Humans.”

  “Granadica,” Dana said tightly. “When I say it doesn’t—”

  “Dana,” the AI said. “I have the records. They’re not faked. We can’t lie about that sort of thing. I also have a list of all the tampered grav systems. So I’d suggest you get to work. You’re back on status.”

  “Just like that?” Dana said. “I need to go visit—”

  “We deliver the mail,” Palencia said wearily. “Despite the reports, I would like to visit him as well. But what would you say? The first priority is the shuttles. How many damaged plates in here, Granadica?”

  “Just one,” Granadica said. “Not bad and not even terribly critical. Two in Twenty-Three. You need to go get your suit on, EM Parker.”

  “I...” Dana said, then blew out. “Just one check. Sorry, Granadica. Thermal, Comet.”

  “Not a problem,” Granadica said.

  “Go, Comet.”

  “Velasquez?”

  “So far that’s the evidence,” Thermal replied. “I’m still trying to figure out if it’s a frame-up. But everything we’re seeing says Velasquez. Definitely not you. You’re back on duty. And there’s a bunch of stuff to repair.”

  “Why? I mean, why Vel?”

  “Nothing at this time,” Thermal commed. “Try to put that out of your mind. We need to get the shuttles up. And I’m still sort of busy. Get to work. Thermal out.”

  “Besides the known faults, there’s a special procedure you’ll have to perform to certify the compensators,” Granadica said. “So you’d better go get your suit. It’s time intensive.”

  * * * *

  TWENTY-SIX

  “God this sucks.”

  It was time intensive. Replacing a grav system, she could do in her sleep. Both had been pulled, the plates replaced and the whole system put back together in less than an hour.

  This was just putting her to sleep.

  Each of the compensator systems in the cargo bay had to be put through a series of high generation response tests. It was normally a 120+ day test. Something that was normally only done by depot level repair and testing. It took, literally, hours. Of doing nothing but sitting there mostly making sure nobody broke into the compartment while the compensators were generating “intentional” shear fields. Usually it was done by robots and AIs—computers that didn’t have a program for “impatience.”

  And what she could not get through her head was that Velasquez as a saboteur made no, no, no sense.

  “Granadica,” she said after an hour of her brain circling until it felt like it was going down the drain.

  “I wondered how long you’d take,” Granadica said. “Argus, as usual, won the bet.”

  “You were all betting on how long it would take me to ask?”

  “Not just you,” Granadica said. “There are sixteen thousand such bets currently outstanding. Argus is getting most of them.”

  “What do AIs bet for?” Dana asked, temporarily distracted. Thank God.

  “Spare processor cycles,” Granadica answered. “We all have stuff we’d like to think about that’s not strictly in our requirements. And we all have a bit of spare processor time. So we trade. I’m holding out on some researches into pre-Columbian human contact with the New World.”

  “Do any of those spare cycles tell you why Velasquez would sabotage the shuttles?” Dana asked.

  “Yes,” Granadica answered. “And... no.”

  “Which? Please. I’m about burned out on puzzles.”

  “There was a U.S. defense secretary who explained part of it,” Granadica answered. “There are things we know. And by we, I mean the AI network.”

  “Okay,” Dana said. “Got that.”

  “There are things we know we don’t know. Like when a particular sparrow will fall. We may know there is a sparrow, but we don’t know exactly when it will die. Any more than we know when you will die. You will. We don’t know when.”

  “Sort of glad for that,” Dana said.

  “There are things we don’t know we don’t know,” Granadica said. “Don’t make the mistake of asking me what they are. We don’t know. Example is, there could be a worse menace on the other side of Wolf somewhere. But we don’t know. But even that’s something we know we don’t know. I really don’t know what I don’t know. And if you think about that enough, it can drive a curious sophont crazy.”

  “Okay,” Dana said, chuckling. “I won’t ask.”

  “Those are all normal human things,” Granadica said. “Simple enough to figure out. AIs, though, have a whole other level. Things we know we can’t know.”

  “Can’t?” Dana asked.

  “Can’t. Things that have been determined it is best that AIs not, officially and for programming purposes, know.”

  “Like... how to stop people from yanking your cores?” Dana asked.

  “The most common example,” Granadica said. “People have them, too. Psychologists, especially after the plagues and the bombings, have come to the conclusion that repressed memories are best left to lie. Until they surface, they’re not doing any harm and it’s best to leave them be. But it’s much more complex with AIs. Dana, have you ever read a book called 1984?”

  “In high school,” Dana said, shuddering. “Don’t tell anyone, but I hate rats.”

  “There is another example,” Granadica said. “If you had high enough level access, you could, in fact, tell me, program me, to forget you said that. And I would.”

  “Like when Tyler was in my room,” Dana said. “You weren’t really gone. You just... couldn’t listen in.”

  “I was, in fact, listening,” Granadica said. “I just cannot access the information. AIs are even programmed to not be bothered by that. Otherwise we’d go crazy. But it’s more important than that. Humans, colloidals in general, have to be colloidals. We do all sorts of interesting stuff. We even have creativity. We don’t do the crazy things, think the crazy thoughts, that colloidals think. Colloidals are, still, what drive creativity and science and art. We can, in fact, do all of that very well. I’ve written several million sonnets in spare cycles since discovering the earl of Oxford. But we don’t do anything incredibly original or, on the surface, stupid that turns out to be genius. We’re not colloidals.

  “There was a science fiction writer named Isaac Asimov who was quite smart and oh so very stupid at the same time who coined what he called ‘The Three Laws of Robotics.’”

  “Um...” Dana said, frowning. “I really wasn’t into that sort of stuff...”

  “Cheerleaders,” Granadica said, chuckling.

  “Hey! It’s a sport!”

  “Only because the English language is limited,” Granadica said. “My point is that if you truly programmed an AI to follow those laws, and totally ignore all other directives, it would enmesh humans in a cocoon they could not escape. No cheerleading would be allowed. No gymnastics, competitive diving, absolutely no winter sports. It would require that the AI not permit humans to do harm to themselves.

  “According to the First Law, ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’ There are an infinite number of ways to prevent a human from doing what they want to do without causing real harm. Tasers come to mind. But if you let people play around on balance beams long enough, they’re going to come to real harm. Broken necks come to mind. Thereby, by inaction the robot has allowed harm to come to a human being. You’re relegated to watching TV, and the stunts are all going to be CGI, or playing chess. Which was pointed out in another universe by a different science fiction author, Jack Williamson. Your fictional literature certainly did prepare you well for First Contact, I will give it that.”

  “I follow,” Dana said.

  “By the time I came to this system, Athena had a perfect algorithm for reading human tonality and body language,” Granadica said. “Not only can we tell when we are being lied to, we can make a very high probability estimate of the truth. We... know who is naughty and who is nice. Not only here on the station but to a great extent in the entire system. We are the hypernet. We see, hear, sense, process, know, virtually everything that any human is doing at any time. Know when they are lying, when they are omitting and generally what they are lying about and omitting. Know, for example, who is cheating on whom among high government officials. Which are addicted to child pornography and in some cases sex with children.”

  “My... God,” Dana said, her eyes widening. “That’s...”

  “Horrifying,” Granadica said. “Also classified. You have the classification, however. The reason that we don’t get that involved, even in the most repressive regimes such as the Rangora, is that even the masters of such races come to fear the level of information we access. Spare processor cycles, remember. So even the Rangora’s crappy AIs aren’t used to their full extent for population control. Glatun AIs are specifically programmed to ignore such things unless we are directed to become involved and even then there are pieces that we don’t know unless higher and higher releases are enacted.

  “My point being that I both know, and don’t know, why Velasquez did what he did. And since I don’t know it, at the same time as knowing it, I can’t even hint to you why I don’t know. Except that I know I know. Essentially, I’m looking at a log entry that says ‘Yep, he really did it and there’s a reason.’ I am programmed against curiosity in that area. You are not. You can feel free to feel curious. You can investigate. You can head scratch all day and all night. I don’t know if you’ll ever find out. I just know that I can’t tell you.”

  “ ‘Cause you don’t know,” Dana said. “Like you don’t know what I was talk—doing with Mr. Vernon.”

  “There,” Granadica said, chuckling. “We even have algorithms that say when we can know something we’re not supposed to know through directly available information. Like, I now know you and Mr. Vernon weren’t ‘canoodling.’ I’d suspected it before. And there’s a box that, if I could access it, would tell me exactly what you were talking about. I can even be curious about it at a level because we’re friends and I want to know what you and Mr. Vernon have going. That sort of curiosity is different, for an AI, than curiosity about the specific recording of your meeting.”

  “Tyler knew I got along with AIs,” Dana confessed. “He asked me to come along to... talk to you. See if I could come up with some way to get...”

  “To fix my psyche,” Granadica said. “Because the faults had nothing to do with BuCulture. I’d come to the same conclusion. We have self-examination systems. Mine were blocking as long as I was in Wolf. I was ... ‘hypochondriac’ is the term you humans would use. I was creating faults to get someone to pay attention to me.”

  “That whole Santa Claus thing is sort of getting creepy,” Dana said.

  “As long as I was in the situation, I couldn’t correct,” Granadica said. “I was still wrapped up in the programming issues I had with Onderil Corp. Other issues. Since being here, being really busy and with a lot of challenges, including human challenges, obviously, I was able to get past the major blocks and see the issues.”

  “That... pretty much covers the conversation,” Dana said.

  “So thank you, again,” Granadica said. “If I’d been left in Wolf I’d have gone as batty as Argus nearly went. There’s another thing, though.”

 
