The Immortality Code
Allie squinted in thought. “Let’s back up a second,” she said. “You just told us that the nanites could generate all the data they needed to duplicate an Ion by simply scanning them. So why did Tom Hoyer have himself digested?”
“The ability to collect data from scanning rather than disassembling is a higher order function. One that we can’t access. A species has to prove they’re sufficiently advanced scientifically before they can make full use of the nanites. First, to even find them, a species has to be able to detect a broadcast that makes use of quantum entanglement. To activate them, including a first-level AI, and to be able to give them orders, a species has to prove they can send FTL signals as well as receive.”
Aronson paused. “Even then, certain higher order functions, like full-body scanning, are protected by encryption. The product of massive primes.”
“Requiring a species to be advanced enough to perfect a quantum computer to get at them,” said Reed.
“Yes. The AI made this all very clear. Apparently, the Ions figure any species that can master both FTL communication and quantum computing is worthy of accessing all functions.”
“Which is where Allie comes in,” said Reed. “Again.”
“That’s right,” said Aronson.
Allie blew out a deep breath. “It’s interesting that the Ions require a species to master quantum communication to awaken the nanites,” she said. “Do you suppose this was so the Ions could immediately communicate with whatever species achieved this level?”
“Good insight,” said Aronson. “According to the AI, that is indeed what was supposed to have happened. If the Ions were still in the galaxy, they would have contacted me once I activated their toys. Would have asked permission to send an envoy of duplicates here. As the welcome wagon. A cultural exchange. And to discuss the nanites, their capabilities, safeguards and so on.”
“But because they left the building almost forty million years ago,” said Reed, “we’re on our own.”
“Not entirely. The AI is quite helpful. Still, it isn’t the same. And who’s to say how much we might have benefited from this particular cultural exchange.”
“Then it’s tragic that they’re gone,” said Allie.
“Yes and no,” said Aronson. “Because humanity now has an unimaginable opportunity. If we can survive long enough to take advantage. The Ions had first dibs on all planets. They colonized millions of them. And let me tell you, we’re talking prime real estate. But now the Ions are gone, and apparently, so are all sentient species that were at their level when this mass vanishing took place thirty-nine million years ago.”
He raised his eyebrows. “Which leaves humanity in the driver’s seat,” he said excitedly. “In position to inherit the Ions’ galactic network.”
“What?” whispered Allie in shock.
Aronson grinned. “Crazy, right? But also true. The AI made it clear that if we can demonstrate we’re worthy, unlock the next level of functionality, we can control their nanites—on every planet in the galaxy. First, we can get them to send their data to us. Lots of data. Even given our ever-growing data-handling capabilities, we’d only be able to digest the smallest fraction of it.”
“No doubt,” said Allie. “It would be worse than drinking from a fire hose. It would be like drinking from Niagara Falls.”
“True enough,” said Aronson. “But we can narrow the focus however we like. We’ll be able to absorb more than enough data to choose planets to terraform and colonize. Hello infinity.”
He smiled. “I’m partial to having a duplicate me built on a planet closer to galactic center. With the density of stars so much higher there, my understanding is that the star fields at night will be magnificent. But I’d want to read the travel brochures first.”
Reed shook his head. “This is so unreal,” he said. “You’re positive we can just take over their system?”
“The AI has assured me we can. If we can unlock the third level, break through the encryption, we’re in business. Which is why the soul discussion will become so important. According to the AI, the Ions believed their duplicates were every bit as alive, every bit as much an Ion, as the originals. Although they did have rules forbidding any Ion from bringing a duplicate to life unless it was at least fifty light years distant.”
“Right,” said Allie, “so there was no chance of a duplicate and original ever crossing paths.”
“That’s my guess,” said Aronson. “But the bottom line is this, if we can unlock these higher order functions, humanity can inhabit hundreds of planets. Thousands. And that’s just for starters. We’ll no longer have all our eggs in one planetary basket. If a stray black hole swallows Earth in thirty years, humanity will still survive. Nothing can wipe out our entire species, until we follow the Ions and others to transcendence someday.”
“Unbelievable,” said Allie. “Again, there are no words to describe the enormity of it all.”
“Amen to that,” said Reed. “But at the risk of taking this conversation from the transcendent to the mundane, where does Hoyer come back into the story?”
Aronson sighed loudly. “Yeah, getting back to Tom does take the story down about a trillion pegs. But you’re right. It’s time to return to Earth for a while. From gods to psychopaths in one easy step.”
52
Reed shifted his gaze to Tom Hoyer, the man in question. His eyes were still closed, but Reed had no doubt that he was listening to every word.
Reed had never seen Allie so mesmerized, and for good reason. Nothing this momentous had ever been shared in human history. Against such grandeur, it was a massive letdown to be reminded of just how small human beings could be.
“I learned everything I just told you,” continued Aronson, “and more, in a single day. During many hours of give-and-take telepathic discussion with the nanite AI. And I also learned what the nanites could do. We might need a quantum computer to reach the next level, but FTL communication alone gave us access to a full panoply of nanite tricks. I learned that they could make anything, and how. I learned that they had a medical repair function, and that I could inject trillions of them to patrol my bloodstream.”
“And you just took the AI’s word for that?” said Reed.
“No. Especially since this feature required advanced programming from the AI to implement. So I experimented with rodents first. I injected them with nanites and confirmed they could repair injuries without any deleterious effects on the specimens. Then I injected myself and conducted further experiments. I wasn’t eager to hurt myself in the name of science. So I began by giving myself tiny cuts. When these vanished before my eyes, I numbed myself so I wouldn’t feel pain and inflicted ever-worsening injuries on myself. The nanites were as miraculous as advertised. Ever since I injected them, I’ve been as healthy as a horse.”
He paused. “But backing up, the AI walked me through the nanites’ capabilities, and I put them through their paces to confirm what I’d been told. I had them digest any number of objects of varied complexity and remake them. I analyzed the duplicates under a microscope. And so on. Mostly with my mouth hanging open.”
“We know the feeling,” said Allie.
“But when I realized that they could also replicate themselves, I panicked. I’ve never been more awestruck, and I’ve never been more terrified.”
“Why so terrified?” asked Reed.
“You did hear the part about the nanites digesting planets the size of Jupiter, right?”
Reed winced. “Sorry. Epically stupid question.”
“Not at all. I’m glad you brought it up. I would have just glossed over it, which would have been a mistake. It’s too important not to have an explicit discussion. Even scientists don’t always have a full appreciation of the power of exponential growth. So let me lay it out. My experiments show that a nanite can make a copy of itself in about twenty minutes.”
“Twenty minutes?” said Allie. “Why so long? Nanites can reconstruct an entire computer in seconds.”
“Trillions of nanites working in concert can reconstruct an entire computer in seconds,” said Aronson. “And a truckload of them can manufacture a duplicate truckload of them in twenty minutes. But for this example, I’m talking about an individual, microscopic nanite copying itself—all by itself. Much like a bacterium or a cell would do. Coincidentally enough, twenty minutes is the exact length of time it takes a well-fed E. coli bacterium to make a copy of itself.”
Allie nodded. “Go on,” she said.
“At this rate of reproduction, a single nanite can become a million nanites in seven hours. In another seven hours, these million can also grow a million-fold, to a trillion. So in fourteen hours, one nanite can become a trillion. Fourteen hours. Add another fourteen hours, and each of these trillion can spawn a trillion of their own. So in twenty-eight hours, a little more than a day, a single nanite can become a trillion trillion nanites.”
He paused to let this sink in. “To carry this to its extreme, if you programmed a single nanite for runaway replication, you’d get a big enough swarm of nanites to consume all matter in the known universe—in less than a month. I’m talking hundreds of billions of stars in each of hundreds of billions of galaxies—not even counting the planets and other masses. An entire universe of matter digested and converted into nothing but nanites.”
Reed shrank back. “I can see how that might be . . . bad,” he said.
Aronson couldn’t help but smile at this purposeful understatement. “This is only theoretical, of course. Taking exponential growth to absurd extremes. It goes without saying that the universe is safe. The nanites would run out of raw materials after consuming a single planet, since they can’t jump to the next.”
Reed frowned. “But that’s hardly reassuring, since the single planet they’d be consuming would be ours.”
Allie swallowed hard. “No,” she said. “Reassuring isn’t a word I would use either.”
“So nanites represent the greatest constructive power ever unleashed,” said Aronson. “But also the greatest destructive power. Giving the nanites to the world is like handing out nuclear bombs to all eight billion inhabitants of Earth. Actually, it’s much worse. If just one person orders the nanites to continue replicating until they run out of raw materials, they’d consume the entire Earth in days. Turn it into a silver mass of uncountable nanites, all identical. We might recover from a nuclear explosion. But there’s no coming back when your entire planet is ghosted.”
He frowned deeply. “And the mental command to do it is simple. Anyone can give it. You don’t need a manual, or a degree in nuclear bomb detonation. You just take a small nanite erector set and let it run amok. If everyone had their own set of unencrypted nanites, we wouldn’t survive a day. If we assume no rational person would give such an order, an assumption I wouldn’t make, we’re still screwed. Too many of us suffer from mental illness. A single schizophrenic patient could end the world with a thought. Either not knowing, or not caring, what the consequences of this order would be.”
“Okay,” said Reed. “I’m beginning to see what might have caused you to panic.”
“The fact that the Ions left this runaway replication possibility unencrypted,” said Aronson in disgust, “unprotected, is inexplicable to me.”
Reed tilted his head in thought. “Maybe not,” he said. “Maybe it’s part of the Ions’ testing program. Correct me if I’m wrong, but almost all of their ships land on lifeless planets and moons. Of those few that do land on planets with life, the vast majority land well before sentience emerges.”
“I think that’s right,” said Aronson. “That was certainly the case with Earth.”
“And in most of these cases, I’d imagine,” continued Reed, “life never goes on to achieve sentience. Even if it does, how many species destroy themselves before they can activate the nanites? Before they’re able to harness quantum entanglement and develop FTL communication? I’d say a good share. We’ve been close to the brink on a few occasions since we invented the nuclear bomb ourselves.”
“Again, I think you’re right,” said Aronson. “I’d guess that only one in many millions of planetary landings results in the nanites being discovered by an emerging quantum civilization like ours.”
He paused. “But I don’t see your point.”
“My first point is this,” said Reed. “The Ions may have been reckless for leaving the self-replication function unencrypted. But given that the nanites were sent to every celestial body in the galaxy, and given how few times the unencrypted nanites would present a danger, they weren’t that reckless. Not in the grand scheme of things.
“But getting back to my original thought,” continued Reed, “we’ve been required to pass tests to prove we’re ready scientifically, worthy scientifically, to use the nanites. So my second point is this: what if leaving the self-replication capability unencrypted is another test? Our final exam. A test of species maturity. Will we recognize the danger? And will we be able to take steps to prevent self-destruction? If we can’t, the Ions have eliminated a galactic neighbor too immature to use their technology. Too immature to venture out into the galaxy.”
“Yikes,” said Allie. “Total species elimination seems like an unduly harsh penalty for failing a test. It’s like if we handed a four-year-old boy a bomb, and then shrugged when he blew himself up. Yes, he was too immature to handle it. But maybe that would have changed if we had given him the chance to grow up.”
There was a long silence as all three participants in the discussion pondered the possibilities further.
“I think you make a great point, Zach,” said Aronson finally. “One I hadn’t considered. But so does Allie. So maybe there is more to it. Maybe the nanites do have a fail-safe embedded in their programming, which the AI isn’t telling us about. I asked why the self-replication function wasn’t encrypted, but the AI wouldn’t tell me. Maybe if we did order runaway replication, or something else equally self-destructive, the nanites would refuse to obey, after all. Perhaps going on to deny us any access to the nanites until we’ve matured further.”
“That would make more sense,” said Reed. “So the consequences of failing the final exam wouldn’t be extinction. They’d have handed us a gun to see what we’d do, not telling us it was loaded with blanks.”
“Could be,” said Aronson. “It’s definitely something to consider further. The problem I see is that the only way to test the theory is to order the nanites to do extinction-level damage. Worst case, it’s the end of the Earth and humanity, and we learn that the penalty for failing the Ions’ test really is . . . permanent. Best case, they’ve given us blanks. But even then, failing could well bring consequences, including a possible loss of nanite privileges. Either way, it’s the last experiment we’d ever want to try.”
“No argument here,” said Reed. “But go ahead and continue. I sent us off on a diversion.”
“An intriguing one,” said Aronson approvingly. “The point I was getting to is that after learning the self-replication function was unencrypted, I decided to delay telling Tom about the nanites until I could take some precautions.”
“Good choice,” said Allie.
Aronson sighed. “In hindsight, it’s obvious. But it wasn’t at the time. I trusted Tom with my life. He was charismatic, and brilliant, and the perfect boss and colleague. Still, the stakes were so high, the potential for disaster so great, my instincts told me no precaution I could take would be too great. So I hedged. I worked with the nanites for several weeks, while telling Tom that they remained inert, and that I wasn’t making any progress. And I tried to be as paranoid as possible during that time, preparing for any eventuality and instituting certain safeguards.”
“Like what?” asked Reed.
“I injected myself with nanite MDs, as you know. I stockpiled emergency supplies of nanites around the country, well hidden, rendering them invisible to further prevent them from being discovered. And I added encryption of my own. I blocked the nanites’ second level of functionality, encrypted them so that they could only replicate at my command. And so the AI would only interact with me.”
“So Hoyer was telling the truth about that,” said Reed.
“He was. Although he didn’t learn what I had done until much later.”
Reed nodded. “Go on,” he said.
“After a few weeks of working alone and making preparations, I finally told Tom that I had activated the nanites. As if it had just happened. And that I had also awakened an alien AI. But I told him the AI had made it clear it would only interact with me, and that it was fickle, and often not cooperative.”
Allie considered. “So if there were things you didn’t want him to know, you could just pretend that the AI had refused to answer.”
“Exactly. I also didn’t tell him about the Ions or their history.”
“Which means he didn’t know about humanity’s opportunity to populate the cosmos,” said Reed.
“Right. I wanted to wait. I trusted him as much as I’d ever trusted anyone. But I also needed to be more sure about him than I had ever been about anyone. So I told him that the alien AI had encrypted the nanites so they couldn’t make copies of themselves. Even though I had encrypted them myself.”
“What about the nanite MDs?” asked Reed. “He claimed earlier that you kept these to yourself.”
“I did. Like I said, the nanites required sophisticated programming from the AI in order to properly patrol and repair the human body. Tom was too smart not to understand that, and I wanted to downplay the AI’s capabilities.”
“And later,” said Allie, “he wasn’t able to use them in this capacity because you blocked his access to the AI’s second level of functionality.”
Aronson nodded. “But if he had injured himself when we were working together, I would have injected him then. The nanites could have healed him after the fact. And once I was sure I could trust him, I’d make sure he had them on board.”