Post-Human Trilogy

by David Simpson


  “The missile is extraordinarily powerful,” answered 1. “It requires a mass of anti-matter larger than half of your sun to cause the required chain reaction. If we fired the missile from here, the chance that it might be intercepted by the nans and then used against us is too great. Therefore, you must get in close to fire it.”

  “Won’t that kill us, lady?” Rich asked.

  “No,” 1 replied. “You’ll be thirty light-seconds away from the impact, which will give you enough time to open a wormhole and get far enough away from the system to be safe.”

  “It sounds like a plan to me,” Djanet announced. “I’m up for it.”

  “You can’t be serious?” Old-timer reacted with astonishment.

  “Why not?” Djanet responded. “I don’t know about you, but I’d like to get a little payback against those bloodsuckers.”

  “I don’t know,” Old-timer replied, furrowing his brow as he tried to figure out why every part of him was telling him not to go ahead with the plan. “This sounds like what they used to call a scorched earth policy back in my day. Armies destroy anything that might be useful to the enemy while they advance further into their territory. It’s brutal and destructive and...I just don’t want any part of this.”

  A moment of silence followed. With one for and one against, the situation teetered.

  “I don’t like the sound of it either, Old-timer,” Rich finally said, “but I don’t like any of this. Given the alternative of letting those evil little bloodsuckers get away with killing our families or getting some revenge, I’m with Djanet—revenge sounds good.” Rich stepped to Djanet’s side and put his arm around her shoulder. She reached across his body to hold his hand.

  Old-timer turned to Thel. “Well, it looks like it’s up to you. I’m sorry, Thel.”

  “Yeah, the fate of the solar system is in your hands. No pressure,” Rich quipped.

  “The decision is yours,” 1 said, meeting Thel’s eyes. Things had unfolded exactly as 1 had expected. She was moments away from certain victory. Thel could only make one choice. There was no alternative.

  “I...I don’t know,” Thel said. “I agree with Old-timer. This seems so...brutal.”

  At that moment, just as Thel was about to make her final decision, 1 fed the image of James being deleted by the nan consciousness into Thel’s mind. The image flashed so quickly that Thel didn’t see it consciously, but it immediately caused her to conjure the image herself from her memory. James vanishing. Forever.

  “But we can’t let them get away with this,” Thel suddenly said with determination. “I’m with Djanet and Rich. I say we destroy this system and take as many nans with it as we can.”

  1 didn’t smile—yet.

  8

  “You can bring them back?” James uttered.

  “No,” the A.I. replied. “We can bring them back. Together.”

  “How?” James asked, his heart in his throat.

  The A.I. smiled again. “You know the answer.”

  James thought for a moment, desperately searching his mind. He came up with dozens of dead ends. “I really don’t.”

  “Let me assist you,” the A.I. replied. “To help you find the answer, it is my turn to ask a question. Tell me, James, what is the purpose of life?”

  “I...I don’t really know,” James replied.

  “That’s true,” the A.I. agreed, “you truly don’t know. Yet you’ve given a great deal of thought to the subject and eliminated some of the false purposes others have found to fill the void created by not knowing the purpose of humanity. You know the purpose of life is obviously not, for instance, gaining material wealth. Nor is it sexual pleasure. Other activities may seem to be purposes because of their positive outcomes, such as procreation. Religion is the prime example of a false purpose that has filled in for the real purpose as humanity struggled for answers; the Purists still fall back on this solution. Why do none of these examples qualify as true purposes, James?”

  “Because, ultimately, they lead nowhere,” James replied. “None of them advance the species. The only one that is even close is having children, but all that amounts to is putting your resources into training the next generation in hopes that they’ll find a higher purpose or achieve something great—in other words, passing responsibility off to the future.”

  “I’d say that’s typically selfish and egocentric of you, James,” Katherine protested defensively. “I happen to want children. It will give my life meaning. I think it’s sad that you’ll never experience that.”

  James noted that Jim was conspicuously silent on the subject. He considered dropping it to save his twin the headache, but in the end, couldn’t resist his curiosity. As soon as he opened his mouth to ask the question, however, Jim responded. “I’m opening my mind to the possibility.”

  James silently digested this for a minute, meeting a hard stare from Jim as he did so. “Okay,” James said.

  “James is correct,” the A.I. suddenly interjected, stunning Katherine. “Although having children has been a necessity in the past, the advent of immortality means it is no longer necessary.”

  “Maybe so,” Jim responded, “but if the species had never had children in the past, we wouldn’t be here to even have this conversation.”

  “True,” the A.I. confirmed, “and therefore, it was a means to fulfilling an eventual purpose, but it was never the purpose itself. Sharing the experience of life with new beings of your own creation is a generous and fulfilling endeavor, but it is not the purpose of existence. Remember, all species can procreate, but with no intelligence behind it, it simply buys more time. Now that we no longer need to buy time, it does not advance a purpose.”

  “And what’s this purpose?” Katherine demanded.

  The A.I. turned to James. “What has been the path you have followed, James?”

  “The pursuit of knowledge,” James replied.

  “How is that any more purposeful than having children?” Katherine retorted.

  “It is because it moves the species forward,” the A.I. replied. “The acquisition of knowledge propels the species. You may not like it, but James’s logic in this instance is flawless.”

  “Because you say so?” Katherine protested.

  “Logic and reason simply exist, my dear. If you choose to ignore them or willfully pretend that 2 + 2 does not equal 4, then you have chosen to be illogical. It is not a matter of opinion. It is epistemology.”

  “I don’t know what that word means,” Katherine replied angrily. “English, please.”

  “It’s the study of reason and logic,” Jim informed her in a low whisper before turning back to address James and the A.I. “There are still very good reasons for having children,” he suggested, “such as bonding two people.”

  “And who’s to say your child won’t be the one to acquire all this knowledge? Did you think of that?” Katherine challenged.

  “Who’s to say you couldn’t have acquired it yourself?” the A.I. replied. “Thus, as James correctly stated, you have passed the responsibility on to the next generation.”

  “I hate epistemology,” Katherine replied under her breath as she folded her arms.

  “She’s right about one thing though,” James conceded. “The pursuit of knowledge isn’t a purpose either. It may be a means to an eventual end, just as procreation was, but what is the end?” The A.I. remained silent as he locked eyes with James, seemingly willing James to discover the answer for himself. “You found a purpose,” James realized, nearly breathless. “A purpose?”

  “Yes, James. A purpose.”

  “What is it?” Katherine demanded impatiently. “Tell us already!”

  “It can’t be,” James said as the answer became clear to him.

  “What is it?” Katherine repeated as James’s and the A.I.’s eyes remained locked together. After a short moment, James turned to Katherine and answered.

  “To wake up the universe.”

  9

  WAKING UP the universe was the purpose of the species; the notion had never occurred to James until now, but he immediately understood that it was right. This was the single most magnificent realization of his career as an inventor and scientist, and the thrill that radiated throughout his body was so great that his knees nearly buckled.

  “Wake up the universe? I have no idea what that means,” Katherine said, disappointed that James’s answer hadn’t been more clear.

  “The A.I. is talking about the informational theory of physics,” James explained before turning back to the A.I. and addressing him directly. “You’re talking about turning the physical universe into a gigantic mainframe—making every atom in the universe part of one infinite computer.”

  “Whoa, whoa,” Katherine suddenly interrupted. “I think I understood that part! Are you both completely insane? You can’t turn the universe into a computer!” She nudged Jim. “Tell them they’re insane, Jim!”

  Jim, like James, was mesmerized by the idea.

  “Jim!” Katherine exclaimed once she saw him enraptured.

  “It’s theoretically possible,” Jim replied to her. “Every atom in the universe can become part of a computation. Atoms contain electrons, and if you use one spin state of an electron as a one and the other as a zero for the binary code, then the atom can be part of a computation. The problem is finding a way to make the atoms behave as you want. We’ve been able to move them with lasers, but there is no known way to organize patterns of atoms that could achieve anything significant—at least there was no way.”

  “But you’ve discovered something,” James said to the A.I.

  The A.I. nodded. “It was not so much I that discovered it; rather, it was the game theory simulation. As part of the simulation, the program utilized its logic and gave me something wholly unexpected—essentially, the key to the universe.”

  “How is it done?” James asked. Questions as to whether or not the A.I. was real had suddenly melted away. This magnificent possibility was all that mattered.

  “It requires paradoxical thinking—which is perhaps why we never hit upon it before,” the A.I. explained. “All our efforts to create a quantum computer have centered on generating power in such a way as to make the computer efficient. Yet, if we were to make a quantum computer that is adequately efficient, the mass of that computer would become so great that the gravitational force would cause it to collapse into a black hole.”

  “So how did the program solve this?” asked Jim.

  “It did something that had occurred to none of us before, not even me,” the A.I. conceded. “Whereas we had assumed that the theoretical collapse was a dead end, it utilized pure logic and regarded the black hole itself as the ultimate computer.”

  “How can that be?” Jim replied. “Black holes absorb energy. How could one power a quantum computer?”

  “Remember,” the A.I. replied, “once a computer is adequately efficient, it collapses, because its mass reaches a threshold that is virtually infinite. The only way to create such efficiency, however, is to make the quantum computer reversible.”

  “Reversible?” James exclaimed, forgetting to blink.

  James and Jim instantly realized the limitless significance of the A.I.’s insight. One simple fact—that the ultimate computer was reversible—changed everything.

  Katherine stood by and watched as both men were left dumbfounded, their mouths agape at what had sounded so insignificant to her. “What does all of that mean?” she asked. “Why is that a big deal?”

  James suddenly realized he had not breathed for several seconds. He let out a long exhale that became a smile before morphing into a shared laugh with his former ghost.

  “What? What is it?” Katherine asked.

  “It means we are about to create...God,” James replied to her.

  10

  “Create God?” Katherine whispered, slowly shaking her head as if in a fantastic and incomprehensible dream. “If I were listening to anyone other than the three of you, I wouldn’t take that seriously. However, considering the source, I must ask you, have you all gone mad?”

  “No,” James replied.

  “No?” Katherine responded to James’s curt answer, and the perfect silence that had followed it from the trio surrounding her. “Have you considered the ramifications of creating a god? Have you considered how fundamentally that act would change all of our existences?”

  “It wouldn’t be ‘a god,’” James returned. “It would be God.”

  “For all intents and purposes,” Jim interjected, trying to amend James’s frank assertion to smooth the divide. “If every atom in the universe could somehow become part of a singular computer,” Jim began, “then you’d essentially be creating an omnipotent being.”

  “You’d be creating God,” James repeated. “It would be everywhere at once, part of everything at once, and capable of intelligence and imagination that we couldn’t possibly begin to fathom.”

  For Katherine, to say these blunt assertions were terrifying would be a gross understatement. James had been her husband. Jim was, she thought, a new man. The A.I. had always been a mysterious force of nature for her, present yet invisible in the background of her life. All of them, she felt, were figures that were larger than her life. She was beginning to feel irrelevant; it was a feeling that seemed all too familiar to her. She loathed irrelevancy.

  “Okay. So why is this ‘reversible’ thing so important?” she asked, struggling to keep her patience as the feeling that she was about to drown began to creep into the air around her, threatening to sweep into her mouth and nostrils, fill her lungs, and leave her fighting for breath.

  “If the microscopic components of the computer are reversible, then so is the macroscopic operation of the computer,” James answered.

  “Please!” Katherine shouted, stunning both James and Jim as the A.I. patiently looked on. “Please,” she repeated in a softer, more controlled tone before turning to Jim. “Jim, my love, please. Explain this to me without jargon. I’m not an idiot. I know I can understand.”

  Jim, suddenly sensing Katherine’s vulnerability, stepped to her and put his arms around her. “I’m sorry, honey. I’ll explain it. It’s not that complicated.”

  James watched this display of gentleness with sympathy. It must have been so difficult for her. While James would always fix his gaze on the biggest things he could find, the most complex and isolating challenges, his former wife would always have her attention fastened to other, more immediately tangible things. The two paths rarely met.

  “If a computer is microscopically reversible, then it is maximally efficient,” Jim explained to her, “and that means there would be no energy dissipation, just like in the A.I.’s mainframe.”

  Katherine’s eyebrows knitted as she walked on the cusp of understanding—James saw she needed only a simple nudge.

  “It means this massive computer would require no energy,” James said with a smile.

  “Oh my God,” Katherine said, finally fully comprehending what this meant. “If it doesn’t require any energy,” she said slowly, “then that means it really could be infinite. It could expand and take up the entire universe.”

  “And perhaps, my dear,” the A.I. interjected, choosing to reenter the conversation, “it might expand into as-yet-undiscovered universes.”

  “The initiation program could be relatively simple,” James observed. “The A.I. would be capable of writing it.”

  “The game theory program already wrote it for us,” the A.I. replied.

  “It’s a...” Katherine paused for a moment as she tried to think of a word grand enough to capture the moment—there was none—“it’s an unbelievable notion. I admit it. But just because the three of you can make this happen doesn’t mean you should make it happen. A being like that...might kill us all.”

  “Why would it do that?” Jim replied, smiling reassuringly as though comforting a child scared of the Bogeyman.

  “Don’t talk like that,” Katherine reacted, suddenly becoming rigid and pulling away from Jim. “Don’t just dismiss the possibility! What if it did kill us all? Do you realize the madness of creating a being more powerful and intelligent than you? Have you learned nothing?” She turned to the A.I., addressing him directly: “I mean no disrespect, but creating you has led to...” she paused and looked at her surroundings—the blackness and circuitry that had been her home—and her prison—for the past year and a half, “...all of this. It’s a mistake to create a superior being. A superior, competing species will always stamp out the weaker, inferior one.”

  “Honey,” Jim began in a gentle tone, reaching for Katherine as his eyes moved apologetically to the A.I., “I don’t think that’s entirely fair. He isn’t the one who turned on us. It was the nans.”

  “He helped to create the nans,” Katherine retorted. “He made them that sophisticated. They were only able to turn against us because he made them so powerful.”

  “That was the alien nanotech influence,” Jim replied. “They didn’t turn on us by themselves.”

  “However,” the A.I. conceded, “I did fail in my responsibility to provide security.”

  “It’s not your fault,” said Jim. “You couldn’t have known.”

  “I could have known,” the A.I. replied. “However, I simply did not look in the right direction.” The A.I. stopped for a moment, as though even he had to pause while comprehending the horror that had befallen the human race. “Alas, this is the ever-present danger of progress. We must always be realistic and wary of the dangers. Katherine is quite right: the being we are considering bringing into existence could, conceivably, be hostile.”

  Both James and Jim were momentarily at a loss, surprised that the A.I. had seemingly sided with Katherine’s logic. “Finally,” Katherine said, breathing a sigh of relief, “some sanity.”

  “Are you seriously suggesting that we not move ahead with this?” James asked the A.I.

 
