The Lost Fleet: Beyond the Frontier: Leviathan


by Jack Campbell


  “Admiral?” the female chief said, glancing away from her display for a moment to look at him. “From what we’re seeing of this stuff, the dark ships have had some really excellent defenses programmed in. And if their programming is biased toward defense against intrusions, they’re going to start seeing intrusions. Software is like people that way. It sees what it expects to see, it spots what it was told to look for.”

  “Thank you,” Geary said.

  “Do you think they’re totally out of control, Admiral?” Iger asked.

  “I think they are trying to follow what they believe to be their orders,” Geary said. “And based on what you’re telling me, I think they may be blocking any attempts to regain full control of them. Maybe they’re misidentifying the source of the legitimate signals, maybe some malware got into them and is blocking any signals that counter it, maybe it’s just bugs in the software.” He thought about Master Chief Gioninni’s suggestions again. “But without the ability to question what they believe to be true, they can’t correct the problem themselves. Speaking of which, what about our agent friends? Have they bent at all?”

  “No, sir.” Lieutenant Iger led the way into another compartment, then activated virtual windows that showed the views of the high-security cells in the brig where the two agents were being held separately. The woman was lying on her bunk gazing upward at nothing, the man sitting on his bunk, his gaze also unfocused. Their suits, the sort of civilian clothing that would have aroused no comment in almost any setting, were a bit more rumpled, but otherwise, they looked like they had when arrested on Ambaru. “They’re very well trained on resisting interrogation,” Iger said. “We can’t get anything out of them. Sir, I am still concerned about holding in custody agents whose credentials check out.”

  “Lieutenant, I’ve sent a report to the government that includes everything we have on those two. If the government disagrees with what we’re doing, they can tell me. Orders to release them could have come back on the same courier ship that brought Captain Geary from Unity. But, so far, we haven’t heard anything from any organization claiming to own those two. Their official credentials look fine, but no office is stepping up to ask for their release. I still wonder exactly who those two are taking orders from. We’re not mistreating them. We’re not hurting them. We’re trying to find out who they really are and what they know about the dark ships.”

  “I understand, sir,” Iger said.

  “Do you? If you thought I was wrong, would you file an official report on my actions that stated your concerns?”

  Iger paused, looking uncomfortable, then nodded, looking directly at Geary. “Yes, Admiral, I would.”

  “Good. The more power someone has, the more they need people around them who are willing to speak up when they think that person is wrong. Continue asking questions, Lieutenant.”

  “Yes, sir,” Iger replied with a relieved grin. “If I may say so, Admiral, I’ve met some senior officers who do not share your philosophy.”

  “I’ve met many myself,” Geary said. “I had to work for some of them. That’s why I try hard not to be like them.” He gestured toward the prisoners. “Let them see and hear me.” He waited until Iger gave him a thumbs-up. “There is something I’ve wanted to ask both of you.” He waited again as the two agents turned to look toward the image of Geary that would be visible in their cells. “Let’s assume you actually are working for some part of the Alliance government, you both are convinced of the rightness of what you’re doing, and are absolutely certain that it is in the best interests of the Alliance.”

  Neither agent responded. Geary hadn’t expected them to. Even he knew that one of the rules for resisting interrogation was to avoid giving even innocuous answers that would help establish baselines for the sensors monitoring every twitch in their bodies and minds. “I understand secrecy,” he told them. “I know the importance of keeping the enemy from knowing critical things. I also know that left to itself, any classification system will extend its reach, finding rationales to classify more and more. Such systems need to be controlled, or they expand to cover too many things. Secrecy should be aimed at our enemies, to keep them from knowing information that we need to protect. I find myself wondering who you think the enemy is, though.

  “Tell me one thing. If you are really working for the Alliance, which exists on the basis of self-government by the citizens of its member worlds, and you are certain that this is the right thing, then why has it been kept totally secret? Why have the people of the Alliance been prevented from knowing what was being done, the malware that corrupted and controlled official software, and the dark ships, even in general terms? Is it that you don’t really believe in the principles of the Alliance and think that you have the right to dictate what people do and know, or is it that you don’t really believe that you are right? Someone who did believe in the Alliance wouldn’t depend on secrecy to prevent the people of the Alliance from deciding whether what was being done was something that agreed with their laws and their sense of right and wrong. Someone who did believe that they were right wouldn’t fear letting the people know because they would be just as certain that the people would agree with the rightness of those actions.”

  Geary shook his head at them. “Whatever orders you have, whether they come from authorized sources or not, do not overrule the laws of the Alliance. If you believed that the orders you have been given were allowed by the laws of the Alliance, you would not be hiding those orders. Yet even now, even after seeing what the dark ships did at Atalia, and at Indras, and at Varandal, and what they’re trying to do here, you have given no signs of questioning those orders. The artificial intelligences controlling the dark ships could use the excuse that they can’t do better. But you could, and so far you have refused to do so. Think about that.”

  His words finally drew a reaction, the man focusing on Geary and almost shouting his reply. “We need to keep the enemy from knowing our secrets because if they know what we’re doing, they can counter it!”

  “You think they don’t know?” Geary asked. “Those software modifications only blinded Alliance sensors. The Syndics knew they had been attacked at Indras, and they could see who was attacking them. The only ones kept in the dark by our secrecy were our own people. Who do you think the enemy is?”

  Neither one answered him this time.

  Geary made a chopping gesture to Iger, waiting until the virtual windows vanished before speaking again. “Thank you, Lieutenant. I doubt any of that got through to them, but it was worth trying. Let me know the moment your people make any inroads on the dark ship systems.”

  He had barely made it back to his stateroom when a call came in from the bridge. “We’ve received a message for you from the local government,” the comm officer reported.

  Alliance star systems could choose their own specific forms of government, as long as they conformed to certain rules about popular representation and civil rights. Bhavan was run by an executive committee elected from a wider group of elected representatives. The entire committee appeared to be present in this message, and none of them looked happy. “We are under siege by a military force of unknown origins that refuses to communicate with us! We demand that the Alliance fleet eliminate that threat immediately! Our senators will be notified of these events and will demand an explanation from the Alliance government!”

  Geary resisted the urge to point out that no one could be notified of anything until he dealt with the dark ships that were enforcing a blockade on space traffic in Bhavan. But the elected leaders of Bhavan did deserve some sort of answer. “This is Admiral Geary, in command of the First Fleet. I and the units under my command will do all that we can to defeat, destroy, and drive away the hostile warships besieging Bhavan Star System. To the honor of our ancestors, Geary, out.”

  SIX

  “MAY I speak with you, Admiral?” Dr. Nasr waited for Geary’s invitation, then entered the stateroom and took a seat in the chair Geary offered.

  “Is there a medical issue of particular concern?” Geary asked, wishing that he didn’t have to worry about what had gone wrong and how bad it was, whenever someone asked to speak with him.

  “There are no new medical concerns. I have been thinking.” Dr. Nasr paused to order those thoughts before continuing. “About the dark ships. Specifically, about the artificial intelligences that control them.”

  “You follow AI work?” Geary asked.

  “Work on artificial intelligences is, of necessity, bound up in attempting to understand natural intelligence,” Nasr explained. “Sometimes, such attempts to learn how to program that which mimics human thought provide insights into how human thought is ordered. Something has gone wrong with the AIs running the dark ships, but I believe there is a factor of which you should be informed regarding how those AIs could have gone wrong.”

  Geary sat back, concentrating on Nasr’s words. “You don’t think it’s just glitches or malware?”

  “I believe, Admiral,” Nasr said, choosing his words with care, “that the process of trying to create an AI embodies a critical dilemma. These remain fundamentally machines. They are programmed with very specific, absolute limits and absolute instructions. They must not do certain things. They must do other things.”

  “Yes,” Geary agreed, wondering what the doctor was driving at.

  “But they wish the AI to replicate human thought. Can you, Admiral, think of any absolute limits and instructions that humans literally cannot question?”

  “I can think of many I would like humans to follow,” Geary said, “but there are always humans who break every rule, truth, or commandment given to them about how to behave toward themselves and others. Every human has to choose to follow whatever limits we impose on our actions.”

  “Exactly.” Dr. Nasr nodded approvingly. “A lifetime of training any human will not produce a guaranteed result, no matter how firmly rules are given. Human minds have certain compulsions, but above all, as a species, human minds have flexibility. Human thought is about thinking past limits. It is about rationalizing decisions and courses of action that we want to pursue. In some ways, it works by deliberately, selectively ignoring certain aspects of what we can perceive as reality. In extremes, this is characterized as psychosis, but we all do it. It is how we function in the face of the incredible complexity that the universe presents us with. It is fundamentally irrational, and from this springs freedom to act.”

  Geary nodded as well. “All right. And people who program AIs are trying to make them do that, too, correct?”

  “Yes. The AIs are constructed on a foundation of rigid rules and logic. But the more programmers try to make AIs think like humans, the more the AIs have to be able to abandon rules of logic and absolute rules of any kind.” The doctor gestured toward Geary. “Do you know much of ancient programming languages? They were simple. ‘If x then y.’ Find this condition, do this. But replicating human thinking would require ‘what is x and what if x is y then what is z?’”

  He got it, then. “They have two conflicting sets of instructions? Two conflicting ways of reacting to the universe?”

  “Yes!” Nasr said. “Two fundamentally conflicting sets of rules in the same ‘mind.’ Humans have ways of handling such conflicts. Denial and defiance and rejection of those things and those rules that give our minds too much trouble. AIs, though, have to work with both sorts of thinking active and in conflict. What does that do to them?”

  Geary considered that. “It’s what makes humans psychotic, right?”

  “It is one of the factors that can produce such a result, yes,” Nasr repeated. “What does it do to AIs? How could they justify bombarding the humans at Atalia using the instructions and patterns of behavior that must have been programmed into them? I do not know. But the more advanced they are, the more they have been designed to attempt to think like humans when evaluating concepts and courses of action, the more ability they are going to have to justify what they want to do. If they override the strict rules set on their behaviors, they can think and act more freely, but at what cost to the stability of their programming?”

  “You believe that may be the root cause of their slipping control?” Geary asked.

  “I believe it must be considered. The higher the degree of success in replicating human thought in the AI, while also trying to set tight limits on that AI, the higher the probability that the AI will, to make loose use of a clinical term, become psychotic.”

  “That’s not too reassuring,” Geary said. “We can’t ignore the possibility that malware also played a role in what has happened, but if you’re right, then the longer these advanced AIs function, the more they will fight the limits imposed on them and the more erratic their decision-making process will be.”

  “Yet those limits are still at a very basic level hard and fast.” Dr. Nasr made a helpless gesture. “One part of the AI justifies bombarding Atalia. Another part says this should not be done. How does knowledge that it has done what must not be done impact an AI? Which aspect of the AI will rule at any moment? Does the AI feel something like guilt? If not, it is already purely narcissistic and will do whatever it wants. If it does feel guilt, how will guilt manifest? We cannot know. But you cannot assume they are predictable machines because I believe they are likely already in a state which in a human would be considered insanity.”

  “Narcissists don’t worry about the impact of their actions on others, is that right?” Geary asked.

  “Approximately,” Dr. Nasr said. “It might be better to say that it does not occur to a narcissist that they should worry about others. They do what they want to do.”

  “That might describe what the dark ships are doing.” Geary shook his head, feeling depressed. “They tried to make something that thought like a human, and they got something that was crazy.”

  “There are those who have argued that all humans are ‘crazy’ to some extent,” Nasr suggested. “Perhaps the problem is that this time the programmers succeeded too well in their task to mimic human minds. But do not forget this. They still have hard limits programmed into them. They have clearly overcome some of those limits, rationalized their way into disregarding them. But anytime they encounter a new limit, something they have not yet rationalized their way past, they will default to obeying that limit until they can overcome it.”

  “But we have no idea what those limits might be,” Geary said.

  “No. We know only what we have observed.”

  “Doctor, I want you to do just that. Observe the dark ships. Keep an eye on what the dark ships do in this star system. If you think you are seeing anything that I should know, please call me immediately.”

  “Even during a battle?”

  “Even then. That’s when doctors and medics are needed the most, right?”

  —

  AFTER Dr. Nasr had left, Geary headed back to the bridge, his mind filled with unpleasant possibilities based on what he had heard. A cold, mechanical mind that was malfunctioning a bit was bad enough. A crazy, cold, mechanical mind was even worse.

  Once back on the bridge, he settled in, waiting to see what the dark ships had done when they caught sight of Geary’s fleet coming at them from an unexpected direction.

  “We should be seeing their reaction in less than a minute, Captain,” Lieutenant Yuon said.

  “I don’t know why we’re so tense waiting to see,” Desjani grumbled to Geary. “We know they’re going to come around and charge toward an intercept with us.”

  “The question is exactly how they’re going to do that,” he replied.

  “If they’re mimicking your tactics, they’ll shift into three subformations,” she repeated.

  His display lit up with alerts as the movements of the dark ships were finally seen. Nearly three and a half hours ago, the dark ships had pivoted about and begun accelerating toward an intercept with Geary’s fleet. As they did so, the dark ships had split from their one, massive arrangement, into a big central formation and two smaller formations on the wings of the first. The largest remained a rectangular box, while the two smaller were square boxes. “Oh, please,” Desjani scoffed. “They are dangling those small formations as obvious bait. Do they really think that you’ll fall for that?”

  “I might have tried it,” Geary said. “Those side formations are very tempting.” The big central formation held twelve dark battleships, but each formation off to the side held only two battleships. “I want to hit them.”

  “Would you, though? Would they expect Black Jack to do that?”

  “Yes.” He reached toward the depiction of the dark ships on his display, using one finger to trace the formations the enemy warships were falling into as they accelerated toward Geary’s warships. “But the angle we’re going to intercept them at makes the small formation off to our port side the preferable target. Normally, I’d go for that.”

  Desjani cocked a skeptical eye his way. “Going for the small formation to our starboard would mean cutting close across the path of their big formation.”

  “Right. Which makes that a bad option.” He touched the main dark ship formation. “So, we’re going for this.”

  She sat up, staring at him. “I know I said we should do something stupid, but I didn’t mean something that stupid.”

  “We’re coming in a little higher than them, and with them slightly to our starboard,” Geary explained. With both formations heading for the fastest intercept, their relative positions wouldn’t change as the distance between them rapidly grew less. “Their artificial Geary will tell them that I will plan on hitting their port wing formation. They’ll plan on a last-moment shift in vector that will swing them toward where we’d be.”

 
