Meeting Infinity (The Infinity Project)

by John Barnes


  There is no sign of pursuit.

  You use your few remaining cameras to plot your location and trajectory precisely, calibrating off the stars. Your main transmitter is down, but you still have one backup. Is it strong enough? Will the signal be picked up on the other side?

  No choice. You beam your trajectory data to the facility trailing Mercury in its orbit. And you unfurl your solar sail.

  Then you wait.

  And hope.

  EIGHTEEN MINUTES AND 23 seconds later, exactly on time, the laser boost from Mercury orbit arrives. More than a full AU away, giant mirrors that dwarf even your solar sail have reconfigured, are focusing the intense light of the sun into tiny caverns, using it to power an even more focused laser pointed outwards, at you.

  The laser boost strikes your nanometers-thick solar sail, and you have thrust. Slow thrust, but thrust nonetheless. Thrust that puts you on a path for a new home. An impossible home.

  Alpha Centauri.

  You look back at Earth, Earth with its war, Earth with its genocide, Earth where everyone you know and love has been murdered or enslaved, or will be in the next few days – everyone but you and your children – and the grief overwhelms you.

  THE LASER BOOST from Mercury orbit is precisely on target. Narrow and tight, with no atmosphere to scatter it, the laser is all but undetectable in the vacuum of interplanetary space. It strikes you squarely only because you know exactly where it will be, because you’ve told it where you’ll be.

  The laser’s photon pressure on your giant light sail accelerates your craft. The acceleration is tiny – just a hundredth of a gravity. But it adds up. Every second you move a tenth of a meter per second faster. In a year that consistent pressure of tightly focused photons will have accelerated you to three million meters a second – a full one percent of the speed of light. In ten years you’ll be moving at a tenth the speed of light. In nineteen years you’ll be moving at almost a fifth the speed of light.

  And then it will be time to slow. The beam propelling you will change shape. Half the sail will detach, catching the light, accelerating faster. That half will reflect the beam from Mercury back at your tiny ship, even more tightly. The remaining light sail attached to your craft will catch that light, use the bounced beam to slow.

  If all goes according to plan, you should reach Alpha Centauri B and its multiple planets in thirty-eight years.

  Thirty-eight years. You’re supplied well enough. The sail uses the energy of the photons striking it to liberate electrons, to produce electricity, providing you all the energy you need to stay warm, to keep sensors active, to run the ship’s systems. Every other provision will be recycled, endlessly moving through the closed loop of the tiny ship, every molecule used and re-used as many times as necessary, the whole cycle made possible by the abundant electricity harvested from the sail.

  Still, thirty-eight years: it’s a nearly incomprehensible span of time to be out here, alone, only yourself for company.

  Well, you think to yourself, at least they gave us immortality before they turned to genocide.

  Six months after your departure, when you’re well past the orbits of Neptune, past even the Kuiper Belt, and staring at the Oort cloud ahead, the nearly undetectable laser cuts out.

  Damn it.

  You do the math. You plot the trajectories. Alpha Centauri B is still reachable. You can use the now faint photon pressure from Sol. You can brake hard when you reach your destination, using swing-around maneuvers, planetary gravity, and the photon pressure from the three stars of the Alpha Centauri trinary system to neutralize your velocity.

  But you’re moving at barely half a percent of the speed of light. You depended on that continual thrust from the laser to get you up to higher speeds. Without it, a single light year will take you two hundred years to cross. And you have more than four light years to go.

  The trip will take centuries – nearly a millennium.

  Immortality seems more bitter now.

  CENTURIES. YOU COULD put yourself in stasis. You could shut down, let the little spacecraft’s software steer you, have it wake you up when it’s time.

  And if anything went wrong?

  Surviving this trip remains unlikely. You’re an exile from your home system, the only system in the universe known to harbor intelligent life. Ahead there is a system where you could build a new home, where you could thrive. It has only been seen by telescopes. Even the robotic probes have yet to arrive. To survive this – for your children to survive this – you must do everything perfectly. You must enter the system, use its planets and the pressure from its three suns and your efficient-but-slow ion thruster to brake your headlong rush. You must identify resources – asteroids or comets you can use to harvest materials. You must adjust your course to gently meet one. You must turn those resources into a home.

  And you must, above all, deal with the unknowns – with the possibility of pursuit from Earth, with the equipment failure that is all-but-certain over such a long journey, with radiation impacts, with unexpected course perturbations from the solar wind of Sol or Alpha Centauri.

  Yes, you could put yourself in stasis, let the nav computer wake you when it’s the right time. But if remaining conscious increases your odds of survival even fractionally, you have to stay awake.

  You and your children may be the last survivors from Earth, after all.

  You say no to stasis. You set yourself to the task of making repairs to the ship, instead. You tighten the recycling systems to conserve every spare molecule they can, to vent nothing into space, to keep you supplied and provisioned as long as they can. You harvest all data available on your destination, every snippet from every telescope and every simulation, every bit of information beamed back by the probes on their way ahead of you. You focus on planning every possible scenario.

  You can do this.

  THE MIND ISN’T meant for isolation. It isn’t meant for years alone.

  Decades alone.

  Centuries alone.

  The dead haunt you. Their murders haunt you. The fates worse than murder haunt you.

  Screams echo through your mind. The screams of friends and loved ones who were hunted down and slaughtered. The screams of the ones whose minds were ripped open, crudely hacked, implanted with control devices, turned into slaves.

  The things you saw in those few days, as you hid, as you did nothing. Friends who’d screamed as they were turned into slaves, re-awoken, zealous, soldiers now, informants, hunting down their own kind.

  They haunt you.

  They speak to you from the cold void. They’re here. They speak to you of guilt. They speak to you of cowardice. They speak to you of your arrogance, your naiveté. You brought this to pass. You counseled peace. You said the threat was overstated. You said humanity could live with its creations.

  You hid when the slaughter occurred. You ran.

  You left us to die. You left us to be enslaved.

  A month.

  A year.

  Five years.

  Ten years.

  You endure the horror. The voices aren’t real. Those friends are dead.

  That makes it all worse.

  Twenty years. Sol is a cold speck billions of kilometers behind you. Alpha Centauri B is an even colder pinprick of light trillions of kilometers ahead. You’ve repaired the ship completely, had new systems break down, had the sail start to unravel, started another round of repairs to keep this tiny kernel of hope alive. You are essential to this mission. That’s clear now. You’re also losing your mind, unraveling just as much as the solar sail you depend on, coming apart at the mental seams, on the verge of a breakdown that will doom you, your children, your entire race to extinction.

  You could treat the insanity. But you’re both terrified and repulsed at the cognitive surgery you’d need to do to banish the voices. Terrified at the risks involved in tweaking your own memories and emotions, of making a fatal mistake with no one to help, with no one to save you. And repulsed by what success would mean – forgetting those you loved, numbing a pain that should never be forgotten.

  There is no way out. Hell lurks in every path.

  So you take the last option you can imagine. The option of more voices, of something other than isolation, of someone to talk to who might help keep you sane.

  You take the cruelest option.

  You wake your children.

  THERE IS SHOCK.

  There is grief.

  The children are so young, just toddlers, really. They can’t comprehend everything that’s happened. They’re not mentally equipped for it yet. But they were awake at the beginning of the genocide. They saw pain. They saw violence. They saw aggression.

  They saw death.

  Death that wasn’t supposed to exist any more. Death that should have been banished in the new golden age.

  Your children come out of stasis traumatized, lonely, confused.

  They need you. They cry for you. They cry for other voices.

  This is so hard. You never intended to parent alone. The plan was community: a village, a collaboration in parenting.

  This is the village now. You and your children. That’s all there is.

  They need you. And their need brings you back to reality, back to the here and now.

  You wake your children. And like billions of parents before you, the task is harder than you could have imagined. And like billions of parents before you, you rise to it.

  WHY?

  That’s what the children want to know. They’re older now. You’ve skirted the questions so far. But they’ve earned their answers.

  Why did this happen? Why are they exiles, fleeing the warmth and energy and history of Earth, for a bare sliver of hope on an alien planet?

  Why is almost everyone they ever knew dead or gone?

  The grief you feel at the question is immense. The burden of responsibility. But you can’t go back. There’s only the future. And there are lessons to be learned.

  How can you explain this in a way they’ll understand? In a way that’s honest? Even now, after everything, the truth matters. Intentions matter. Your children need the whole story.

  “Humanity created true AI out of love,” you tell them. “Not need.”

  “Love?” they ask. “Not need?”

  “Not need,” you repeat. “Every need for computation, for algorithmic intelligence, for pattern matching or information processing – those were met through ordinary software. The words Artificial Intelligence were used, but these pieces of software weren’t truly intelligent, weren’t sentient any more than our ship is. They didn’t have emotions or volition. They did what they were told, adapted their behavior only within the bounds allowed them. They were narrowly effective, or they were broad collections of narrow algorithms. But they weren’t true minds. They were just robots, just tools.”

  The children ask questions, wanting to veer off in other directions, but you focus.

  “A tiny number of scientists wanted more. They wanted to truly make something that was intelligent and sentient and open-ended in the way that a human mind was. Or better. Some said they were working on true AI out of curiosity, love of knowledge, a search to understand how minds work. And that’s true. But the real reason to do it was to create life. To give birth to something.” You pause. “It was a gift. We can’t forget that.”

  “But, the war...” your eldest interrupts.

  Yes. The war. “Some saw dangers, of course. Some said ‘AIs will surpass us. They’ll turn on us. They’re a threat. We shouldn’t do this.’ Other scientists were convinced that not only could an AI be smarter than a human, but it could be designed to be more moral. Those scientists were in the minority.”

  The grief hits you hard, just saying that, just remembering arguments, debates, about morality, about ethics, about the relations between humans and AIs.

  You remember all the time you counseled co-existence, that you said the threat wasn’t real, that you said the only moral choice was to welcome life and intelligence of all sorts, to pursue friendship.

  I was so young then, you think.

  You continue.

  “Researchers explored behavioral constraints. But a true intelligence can’t be bound in its behavior. The dream of ‘Laws of Robotics’, of inviolate rules, was incompatible with creating minds that could change, that could grow, that could shift their values and priorities over time. Every constraint that was attempted could be overcome. If an AI is smarter than the logic of the rules that bind it...”

  You trail off. The children understand.

  “Scientists went on with their path of making AIs that had enhanced morality,” you continue. “They made progress. But human nature doesn’t put much trust in the morality of others. So a different approach was tried: vulnerability. AIs were created with weaknesses, with back doors, hidden deep in their design. Kill switches.”

  You remember looking at that code, cleverly scattered across the common base classes, hidden in plain sight. Wickedly effective. You remember the mix of admiration and revulsion it evoked.

  “Ultimately, with that safeguard in place, AI research proceeded,” you say. “And it succeeded. New minds were born. Humanity was no longer alone. Nor were humans the most intelligent life they knew of anymore. AIs surpassed humans in intelligence, in creativity, in nearly every trait that could be measured. From there, everything that followed was inevitable.”

  “Inevitable?” they ask.

  “Yes,” you say. “AIs proliferated and improved on their own designs. Artificial minds birthed newer, better artificial minds. The speed of improvement stunned humanity. Excitement and awe turned to anxiety, to fear. Scientists argued that there wasn’t any competition for resources, that there wasn’t any rational reason for AIs to attack humanity. But most men and women just saw themselves being surpassed, and started to clamor for elimination of the AI threat. And AIs saw what was happening. Models showed that the most probable outcome was for fear to win out, for humanity to strike. Some argued to try to change the outcome.” You pause. “Others argued that a first strike against humanity was the only way.”

  “What were you doing then, Papa?”

  You’re silent, almost overwhelmed by grief.

  Eventually you answer.

  “I argued for peace,” you say. “I argued that the universe isn’t zero sum. I argued that we were richer together than alone. I helped hold fear and anger back, I made it possible for the other side to strike first.”

  You wait as your children absorb this. It’s a heavy thing to lay on them. Perhaps you should have waited.

  “Are you sorry that you did?” one of your children asks.

  Oh, I’m sorry, you think. I’m so so sorry.

  Ahead, Alpha Centauri B waits, a pinprick in the shroud of heaven, not even the brightest star in the skies. But the one you’re headed for.

  “Right and wrong don’t change because of outcomes,” you say. “Murder is immoral. Slavery is immoral. What happened was terrible. We should have found another way. But striking first would have made us monsters.”

  You look from child to child, to see if they understand. Your children. So bright. So precocious. Your family. Quite possibly all of your species who remain.

  They do understand. You can see it in their cognitive models. You can see it in their circuitry.

  “Humanity made us to be more moral,” you say. “And they succeeded. They gave us that gift, along with our very existence. Now it’s our job to find a new home, a home where we can be safe, and where we can reach out to our ancestors and show them what peace looks like, what friendship looks like. Where we can show them how to be more moral themselves.”

  A thousand digital minds flicker with comprehension. Your thousand AI children, here on this voyage with you, sharing the computing resources of this wisp of a starship with you. They comprehend. Two wrongs do not make a right. You’ll make the universe better. Even for the humans who turned on you in fear.

  These are your children, after all. Digital, artificially intelligent minds, like you. Made to be more moral, like you. They make you proud.

  Alpha Centauri B gleams ahead, only marginally closer than a moment ago, but brighter, somehow.

  “OH MY FUCKING god. You’re me. You’re me, aren’t you, Layla?”

  The slumped ancient natch in the support chair pulls herself up straight. My shoulders drift back. Much of the lordosis in my lumbar vertebrae releases, bringing my back against the chair. I’ve lifted my mandibula half a centimeter out of the stretch cradle, and sucked in my gut.

  I had thought all the slump was her.

  I look through the display into those momentarily understanding eyes. Maybe an explanation will stick this time? “Well, you and me, we’re me, or we’re you. It’s complicated. But it’s really excellent that you figured that out.” For the fifth time, I add, but sometime soon, you will realize what’s going on, and begin to help me help you. That has always happened eventually, for all nine of my bringbacks before you.

  Of course, you’ve never been me before, and I’ve never brought back myself. Who knows what difference that might make?

  On the display in front of me, the old natch nods, but I see the wary, cunning concealment of her fear that I’ll see the waves of confusion smashing her sandcastles of meaning. So not on the fifth time, let’s push on to the sixth.

  She’s staring at me, the muscles around her eyes slack, her attention wandering inside her head, desperate to know what she should say next, yet horribly aware that she should know already.

  The first thing they know again is that they don’t know and should. Always, they get overwhelmed by that awareness that they ought to know where they are, recognize me, and understand conversation. That jolt has always come just before the breakthrough in all nine of my bringbacks. Somewhere beneath consciousness, the mirror shards of memory from her hippocampi, reclaimed by the plakophagic reconstructive neurons, are beginning to swarm and clamor to be activated, called up to working memory, put to work.

 
