She took a deep breath. “So let’s move forward. You were saying your goal was to only complete a transfer at the moment of death. David had asked if this meant you had ruled out creating two or more copies of the same person. Have you?”
Jordan gathered himself. “Yes,” he said finally, obviously finding it hard to return to the conversation after his heartfelt confession. “Absolutely. I’ve opened enough ethical and philosophical cans of worms already. This would be perhaps the biggest of them all.”
“It would be the ultimate mess,” agreed Riley, who seemed to be relieved to get back to wrestling with abstract concepts rather than with messy emotions she couldn’t begin to understand. “If there were ten of me,” she continued, “each would want to be with David Bram, and I’m sure I wouldn’t want to share—even with myself. The legal system would be in a shambles. Who would get ownership of the original’s assets? We could talk about the complexities of this all day long.”
“Which is the very reason my goal is to keep this as simple as possible,” said Jordan. “I’ve instituted two simple rules. One, no transfer of consciousness without death, ensuring that each of us gains immortality, not multiplicity. And two, identical copies only. No one gets a version two point oh. Again, right now it’s about preservation, not improvement.”
Carr rubbed the back of his neck. “But haven’t you already made copies of more than a thousand people for your experiments?” he asked. “So you’ve already created a thorny situation. You can’t return both a volunteer and a copy back to the volunteer’s old life. So how have you handled this?”
Jordan winced. “Good question,” he said unhappily.
39
Riley looked ill. “Did you . . . kill them?” she asked, her expression indicating she was afraid of the answer she might get.
“Kill?” said her father, arching an eyebrow. “Does this mean you’re beginning to see the duplicates as human? If you thought of them as nothing but lifeless computers, you’d have used the word destroy.” He paused. “And if I did kill them, would you see this as murder?”
“You tell me you’ve created a quantum computer that houses human consciousness,” said Riley. “Having no experience with copies, I can’t judge if they’re human or not. But I know you think they are. Just minutes ago you made the argument that if a copy and original react the same way to every possible situation, are indistinguishable based on personality and behavior, then they’re the same. I believe your exact words were on the order of, ‘If one is conscious and human, how can you not call the other conscious and human?’”
“Even so,” said Jordan, “this doesn’t necessarily mean that killing the copy is murder. It’s another complex situation that nothing in human history has prepared us to deal with. One could argue that nature intended that our consciousness be unique. So if I ended a unique consciousness, this would be murder. No question about it.
“But suppose I had a copy of David’s neural patterns and sparked his consciousness to life within a duplicate brain and body, with all of his memories. Suppose I then destroyed this duplicate David, a millionth of a second after I created it. So the birth and death of this new David was nearly simultaneous. You could argue I killed a living thing and this is murder. But you might also argue that it isn’t. I created this copy, and since its destruction hasn’t affected David in any way, his unique pattern hasn’t been lost to the universe.”
Jordan paused. “Or think of it this way: If you wrote a novel and I deleted it, I murdered your novel. But if I made a copy and then deleted the copy, your novel still lives.”
“By that measure,” said Bram, “if you made a copy and deleted the original, the same would apply. You still haven’t murdered my novel.”
“Yes,” said Jordan. “Either way. As long as one remains, you could argue that eliminating redundant copies isn’t murder.”
“I’m not sure the redundant copies would agree with that,” said Bram.
“Even so, it’s potentially a valid argument. I’m not saying it’s one I necessarily support, but it is one that can be made.”
“One that you’d need to make if you were trying to justify killing the copies,” said Riley. “And despite your clever mental gymnastics, you still haven’t answered my question. Did you kill them or not?”
“Not exactly,” said Jordan.
“Not exactly?” repeated Riley.
“At the end of one year of testing, I did have my people kill their bodies. Bodies that I can create again at any time. But I didn’t destroy the quantum computer embedded with their patterns, what I call their E-brain, or emulated brain. E-brains are designed to retain the pattern, the emulation, even when disconnected from a body. But the pattern, the consciousness, becomes frozen in time. Think of it as entering sleep mode. I can bring this consciousness back to life at any time by reinserting it into a body.”
Riley shook her head. “Freezing their brains is the same thing as killing them.”
“Not at all,” said Jordan emphatically. “Do you die when you’re asleep? After all, you’re no longer conscious. In fact, that’s one way to think of the transference. You’re used to your consciousness dying each night, and then springing back to life each morning when you awaken. With transference, the same thing happens, only you reawaken in a new body.”
“But you have no intention of ever reawakening them,” said Riley. “Are you honestly trying to claim there’s a meaningful difference between a sleep that you never awaken from and death?”
Carr was intrigued by the intellectual clashes between Riley and her father. It wasn’t surprising that she continued to challenge him, not given the hatred she had built up over the past eight years. What Carr found astonishing, however, was that she was proving to be the man’s equal.
“Yeah,” said Jordan sheepishly, “I guess I should have made myself clearer. I actually do plan to awaken them. I have some other goals I’m working toward, and they could fit in beautifully.”
“Other goals?” said Carr dryly. “I’m almost afraid to ask.”
“I’m devoting my life to protecting the future of the human race. The key ingredient in this task is to make sure we spread well beyond the solar system.”
“In case Earth gets hit by a meteor?” said Carr.
“Yes, but extinction through natural disaster is the least of our worries. Self-destruction is the greater possibility. We’re our own worst enemy. We’ve made a start at surviving an extinction-level event on Earth by colonizing the Moon and Mars, but the solar system is still a tiny island in an infinite sea of empty space. We need to cross that sea.”
He paused. “Are any of you familiar with something called the Fermi Paradox?”
Bram shook his head no and Riley didn’t respond. Carr had a suspicion that she knew what this was, but didn’t want to show up her boyfriend too many times. “I guess not,” replied Carr, speaking for the three of them.
“It’s named after the Nobel Prize-winning physicist Enrico Fermi,” explained Jordan. “A man so brilliant he was almost in a class by himself. One day, in the early nineteen fifties, he and some colleagues were discussing the likelihood of intelligent alien life. Fermi made a simple but profound argument. If intelligent life existed, he asked, then where was everyone?”
Carr shrugged. “The universe is a big place, as you said. Isn’t this a needle in a haystack problem?”
Jordan smiled. “It is,” he replied. “But it shouldn’t be. The universe has been around for fourteen billion years. In the grand scheme of things, we’ve been around for a millisecond. If intelligent life was common, it should have arisen all throughout the history of the universe, on planets that came into existence long before Earth. It should have arisen billions of years ago. And once it arose, technology would arise with it, in just another millisecond on the cosmological time scale.
“Homo sapiens have been around for only about two hundred thousand years,” continued Jordan, “and look at the technology we’ve developed. Even assuming a species could only spread outward from their home planet at a tiny fraction of the speed of light, advanced alien civilizations should have arisen so long ago that the Milky Way should be teeming with intelligent life. Our local neighborhood should have been extensively colonized already by one or more species. Cosmological evidence of their existence should be everywhere. This shouldn’t be like finding a needle in a haystack. It should be like finding evidence of human life in Times Square.”
“Unless the emergence of intelligent life on Earth was a unique event in the cosmos, after all,” said Bram.
“That’s one possibility,” said Jordan. “Entire books of possible explanations have been written. But another obvious one is that sentience has, indeed, arisen on countless planets. For a sentient species to emerge, however, it must have evolved in a highly dangerous, competitive environment, one harboring such fierce predators that keen intelligence and ruthlessness were required for survival. So all sentient species turn out more or less alike. Aggressive, tribal, and warlike. And they all reach a stage at which they can destroy themselves with their technology. And they do. Every time. Like children who’ve been given loaded guns to play with.”
“With one of these guns being ASI,” said Riley.
“That’s right. Not that there aren’t many others to choose from. Bio agents, nuclear warfare, and so on. But given my own personal experience, ASI seems to be the biggest threat of all. By creating Savant, I, myself, could have brought about humanity’s extinction. If this had happened, a future civilization arising on a planet a thousand light years away might develop their own version of the Fermi Paradox, wondering why there were no signs of intelligent life coming from our neck of the woods.”
“Having no idea that we had come and gone,” said Bram.
“Like possibly thousands or millions of other civilizations throughout the galaxy,” said Jordan.
“But if ASI is a fairly common cause for the destruction of biological civilizations,” said Bram, “shouldn’t we see evidence of scores of these machine intelligences?”
“Yes,” said Jordan. “But, as I know better than anyone else, they don’t necessarily think like us. They may not be interested in territorial expansion. They may be operating on another plane we can’t perceive. Or maybe we’re searching in all the wrong places. We’re looking for environments conducive to biological life. But maybe ASIs like to hang out in more energy-rich environments. Near neutron stars or black holes. In the very center of the galaxy, which is jam-packed with stars, many, many times more than are in underpopulated arms like ours.
“Regardless of the explanation, we could be in a position to be the first species to spread ourselves across the galaxy before we self-destruct. We can ensure that intelligent biological life never goes extinct, and ultimately fills the universe. Even if it takes millions or billions of years.”
“And this ties in to your vision of what to do with your many sleeping human emulations?” asked Bram.
“It does,” replied Jordan. “They can be at the vanguard of this effort. I’m setting up an underground facility to manufacture a fleet of R-Drive capable ships. Much smaller and lighter versions of the ones I originally developed, which also means much faster.”
“Let me guess,” said Riley, “you plan to load a number of E-brains, in sleep mode, onto each of these ships and send them on a hundred-, thousand-, or million-year journey to seek out habitable planets.”
“Yes!” gushed Jordan enthusiastically. “Exactly. Guided by supercomputers that are loaded with all human knowledge, which they can later draw upon. When such a planet is found, the craft lands, and an automated system activates a 3D body printer, printing the bodies that go with each E-brain. Automated systems reunite the brains and bodies. The ships would have extensive raw materials in their cargo bays, along with a diverse collection of human sperm and eggs, for in-vitro fertilization. The reconstituted crew would go on to become caretakers of a fledgling generation of purely biological humanity.”
“And when these colonists grow old and die,” said Riley, “the technology will be in place to transfer their consciousness to new bodies, ensuring the colony continues to grow and thrive.”
“Yes. We would include a seed bank for crops,” continued Jordan. “And, of course, we wouldn’t neglect man’s best friend,” he added, smiling lovingly at his daughter. “Banks of canine sperm and eggs from every breed would be included as well.”
Riley nodded, but didn’t comment.
“I’m setting all of this in motion from off stage,” said Jordan. “Currently, I’m working toward automating the R-Drive factory. I anticipate I’ll be able to build ten or fifteen thousand of these seed-ships within a decade.”
“But you only have twelve hundred E-brains,” pointed out Carr.
“He can make as many additional copies of each as he’d like,” said Riley. “His prohibition against multiple copies doesn’t apply when they’re separated by distances that would take thousands of years to cross.”
Carr nodded. He continued to feel stupid on a regular basis.
“Thirty-five of the twelve hundred subjects we’ve tested are especially well-suited for this mission,” said Jordan. “Great natural leaders who have shown themselves to be especially intelligent, ethical, brave, decisive, and so on. Some combination of these thirty-five will be on every ship. They will be copied multiple times. About seven hundred of the remaining eleven hundred sixty-five subjects will be passengers, one on each of seven hundred ships.”
“Why not all of the subjects?” asked Carr.
“If you knew the results of our testing, you wouldn’t have to ask. We didn’t just select subjects we thought were angels. We chose a broad and diverse sample. Almost a third were selected because they had exhibited qualities that could best be termed as . . . antisocial. Cruelty, deceit, violence, hostility, crime. You don’t study the human condition by only selecting exemplary humans.”
“I see,” said Carr. “Not the kind of minds you want to spread to other planets.”
“That’s right,” said Jordan. “But these E-brains won’t be . . . killed, either. I’ll elaborate on that in a minute. But when a ship finds a habitable planet, it will land, and the crews will be revived as already described. Remember, even if a million years have passed since the ship left Earth, they’ll awaken as if no time has passed at all. Then, a computer will explain their situation, and the historic contributions they’ll be making to ensure human immortality.”
“What if this is too much for them to absorb?” said Bram.
“It won’t be,” said Jordan. “I tested out various approaches to bringing these thirty-five subjects up to speed in a virtual reality simulation. I got to gauge how they would react, erase their memory, and try again. Now I know the perfect way to proceed.”
Carr was reminded of an old movie he had once seen called Groundhog Day, in which a man relived the same day over and over again until he managed to get it exactly right.
“Did your future crew members have a choice in the matter?” asked Riley. “Or were they just manipulated beyond their wildest imaginings in your simulations?”
“I disclosed everything to them in virtual reality and they each agreed. Not that they’ll remember. As for the minor manipulation needed to lessen the negative psychological impact they’ll experience after awakening on a distant planet, I make no apologies for this. These thirty-five will be the most important people to ever live, instrumental in seeding the galaxy with biological human life, forever ensuring human existence.”
Jordan paused, looking exhausted from reliving Turlock and his extended revelations, and also somewhat relieved to finally be through them. “So that’s my plan for humanity to conquer the galaxy,” he said. “What do you think?”
Riley studied her father for several long seconds. “I know I’ve been hard on you,” she said finally. “And while some of this seems horrific to me, I have to admit that your vision does have a certain appeal.”
“Sure it does,” said Bram, rolling his eyes. “But mostly because of the dog part, right?”
Riley grinned. “Well, yeah. At least there would be one species on the ships worthy of spreading throughout the galaxy.”
40
Jordan took a deep breath. “So that’s everything,” he said. “I’ve laid all of my cards on the table. I’ve made horrific mistakes, and I’ve played God, as Riley would say. I’ve caused incalculable misery, not the least of which was suffered by my own daughter. I just hope you can understand why events happened the way they did, and agree with me that ASI is too dangerous to ever let loose.”
“One last thing,” said Carr. “What about the other test subjects whose bodies you killed? The psychopaths you don’t want to send into the galaxy? You said you had a plan for their E-brains that didn’t involve eternal sleep.”
“Yes. We’ll awaken them in a virtual world. Which I expect to be fully perfected within ten years.”
“Isn’t the going theory that we’re already in a virtual world?” said Bram. “A simulation constructed by some advanced future humanity?”
“That is the most likely possibility,” said Jordan. “But I think we have no choice but to live our lives as though this isn’t the case. Strive like it isn’t the case.”
He stared off at the waterfall for several seconds, as though taking continued inspiration from it. “With any luck,” he said, “humanity will conquer true interstellar travel over the next few hundred years, faster-than-light travel, rendering these seed ships obsolete. Regardless, in a few million years, at most, humanity will have conquered every habitable planet in the Milky Way and number in the trillions.”