The First Immortal

by James L. Halperin


  The voice of Ronald Berry invaded Ben’s aural canals. This kid was barely twenty-two years old, handsome and marginally intelligent for a new-timer, but in no epoch was he anybody’s genius. He’d never had to work hard for anything, Ben understood. Like most of his generation, Ron’s critical-thinking capacity seemed severely underdeveloped.

  “There’s all this evidence, dontya see?” the boy was saying. “Life on Earth came from outer space. Water came from comets and they had these microbes in ‘em and that’s where everything came from. After the comets start hitting in 2308, then people from Nemesis’ll come down again and redo everything, dontya see? Change everything in the whole world. So what’s the point anyway? Everything’ll just change no matter what we do. They’re coming and that’ll be that. Dontya see?”

  What a weird world this had become, Ben thought. Infinitely better in so many ways, infinitely safer and more intelligent. Yet the same people who scored off the charts on twelve-dimensional intelligence tests could turn into imbeciles when faced with intellectual voodoo.

  Ron needed help, Ben decided, help he couldn’t give him. Perhaps psychiatric drug therapy, and maybe even a prolonged stay in a so-called safe environment. Ben’s fingers signaled out a callback message to Ron’s mother, who had arranged this counseling session to begin with. Ron was age-of-consent, but Ben assumed that she would care enough to intervene with a flash-petition to her local magistrate.

  “Those people down in Yucatan had it right, but they were stupid, too, dontya know? Killed themselves for no reason.”

  “Oh?” Ben took heart, until the boy continued:

  “Yeah, those aliens, the ones who live around Nemesis, only come to Egypt, dontya know? That’s the place they’ll find us; we’re the disciples, dontya know? Lottie Crayton, she’s the one who’s had the visions. Says we can all go live with them… forever. None of us, none of the disciples, ever have to die. Not really, anyway. She’s gonna wait for ‘em in Egypt. I’m gonna go, too.” Ron took a breath.

  Ben decided he’d better not waste any time with this one. Young Ron didn’t seem far from core meltdown, from going cuckoo, as they used to say when Ben was a kid. He would give the boy a quick verbal shock treatment, then try to do what he could for him.

  “Ron,” Ben said, his voice ten decibels too loud. The boy’s eyes looked distant, glazed, nonresponsive. “Ron!” he yelled.

  Ron’s eyes flew open. Ben had his attention. “That’s stupid, Ron. Stupid! You listen to crap like that, you’ll ruin your life.”

  “Huh?” Ron flinched, as though no one had ever spoken to him that way before. And perhaps no one ever had. But there wasn’t time for gentle words or subtle philosophy with this kid. Wake him up, then get him some heavy-duty help: That was the only way that ever worked with this sort of problem.

  “B-But Lottie Crayton… We all scipped her,” Ron complained. “Lottie’s telling the truth!”

  “That’s right, Ron. Lottie is not lying. She’s not delusional or clinically pathological, either. Back a hundred, hundred fifty years ago, Lottie is what we would’ve called a ditz. She is so, hmmm, intellectually challenged, she believes her own vivid dreams represent truth.

  “Look, Ron. I’m not going to pull any punches with you. Lottie Crayton is an articulate, charismatic idiot. Do not listen to her. You’ve been raised in such an innocent world, you’re susceptible to stuff like this. And being human, you resist the notion that your soul is anything less than eternal, even though there is absolutely no evidence either way. Listen to me, Ron: This life might well be all you get. Nobody knows. Nobody can know. Don’t gamble everything you have on some entity or being that no living person has ever seen outside of their dreams and visions and fantasies. Cherish what you know you have: your life. Here. Now. On Earth. I’ve got friends who can help you understand this better, Ron. But they’ll need to visit you in person instead of by VR. Is that okay?”

  Amazingly, the kid didn’t hesitate even a second. “Why sure, Mr. Smith. That’d be great. I’ve never known anyone who talks like you, dontya see? You’re like from somewhere else, real old-movie-like. Like old, old movies. Dontya see?”

  That was when my message showed up on Ben’s corneal screen, informing him that I had to see him right away. He clicked his tongue once, our signal that he would arrive here in about an hour.

  “Okay, Trip,” my great-grandfather said. “What’s so damn important you could only tell me about it in person?”

  Virginia, Ben, and I headed toward the main conference room at the new headquarters of Neural Nanoscience Laboratories, the business partnership Virginia and I had formed back in 2084.

  “They’ve discovered a way to replicate human beings,” I said as we seated ourselves at a sound-preclusion console near the door.

  “Is this another one of your jokes?” Ben asked me, probably remembering the time I’d programmed micromachines to reposition furniture in his office whenever he glanced at his deskscreen. “Y’know, some Frenchman already beat them to it,” Ben added. “About sixty-two years ago.”

  “No,” Virginia explained, “not just biological cloning; we’re talking about entire adult human beings, including memory and environmentally induced personality.”

  “You’ve got to be kidding.”

  “No, we’re not.”

  “And right now we’re the only facility on the northeast coast of America with the necessary equipment and training,” I added. “We should have a monopoly for several months.”

  Ben stared at us. “But how?”

  “Ever since the Nemesis discovery, human scientists have been feeding all sorts of crazy projects to the AI banks,” Virginia explained. “Of course, some of them were bound to bear fruit. And last week, at the suggestion of about thirty nanoscientists, including Trip, the entire World AI Network spent four days calculating a safe way to upload data from human brains using nano-disassembler/assemblers. To put it in layman’s terms, we survey the position of every molecule in the brain, with some shortcuts, then store the information digitally in datacubes. That way we can transport it. Maybe transmit it all by radio, microwaves, or even infrared. Duplicating the body itself’s simple, of course; all we have to do is replicate a single cell’s DNA and clone it. But not until the memories are attached would you have the entire person.”

  “And probably,” I said, “after a few thousand people have gone through the process, AIs can learn how to analyze what every neuron molecule means; the way each molecule’s position and makeup translates into information.”

  “A radical leap in technology,” Virginia said, “with, if you’ll pardon the pun, mind-boggling implications.”

  “Such as?”

  “We think we’ll be able to implant specific knowledge without disrupting memory. It could eventually lead to AIs being incorporated inside our brains, maybe linking our thoughts to the minds of others via transducers and radio signals, a sort of artificial telepathy. Someday, people might even choose to give up their bodies entirely in favor of nearly unlimited, machine-enhanced intelligence and physicality; to live their lives in ways we can barely imagine.”

  “Maybe even safe from cosmic disasters,” I gushed.

  Ben ignored me. “Why would anyone want a duplicate of themselves, though?” he asked Virginia. “Would such replicas be human? And more to the point, would they still be the same individuals?”

  “Those are interesting questions,” she said, “but maybe more about semantics than science. For example, are you the same person you were ten years ago? Or the same person at age sixty that you were at ten? Arthur Clarke, the science fiction writer, once said the reason he decided not to be frozen was that he became a different person every ten or fifteen years anyway. Do you agree with that?”

  “Well, you’ve taken his statement out of context, but in what you’re saying, absolutely not. It’s ridiculous. Even in context, he was wrong, but explaining it would take a four-hour dissertation.”

  Virginia smiled concurrence.

  Ben went on: “If you subscribe to that theory verbatim, why study in high school or go to college? Why learn anything you can’t use right away? Why invest your money for the long term just so ‘some other person’ can spend it? No, I believe I’m essentially the same human being I always was.”

  “Good. I do, too. Okay, then what if you had partial amnesia? Or total amnesia?”

  “Now I’m not so sure. Of course, I’d still have the same face and natural body-type, maybe the same personality…”

  “Yes,” she said, “but so do clones, or identical twins, and we agree that they’re not the same, right?”

  “I see what you mean.”

  “Now let’s say we transplant your brain into my skull and vice versa. You inhabit my body and I inhabit yours. Which one am I?”

  “You’d be the one in my body,” Ben said.

  “Yes, I agree,” Virginia said. “It might even be fun to try someday, but I digress. Now here’s an interesting riddle: Suppose we learned how to duplicate parts of the brain mechanically, and begin to replace your brain at the rate of, let’s say, one percent per year. You don’t actually notice it happening, and your function and memory remain unchanged. After a hundred years, your brain becomes entirely mechanical, yet your personality and recall are the same as they were before. Are you still the same person?”

  “I say: yes,” Ben declared.

  “Okay then, what if you did it in a hundred seconds instead of a hundred years?”

  “I’m not sure it would matter, would it?”

  “Depends on your perspective. But so far, it seems you agree that information constitutes identity.”

  “Maybe, maybe not. Okay, Virginia, I have one for you. Let’s say we construct your perfectly analogous mechanical brain, but keep the old biological one intact, too. I think most people would agree that the biological brain is the one that holds that person’s identity. Yet the only real difference between my example and yours is the continued existence of the biological brain. So why should the mere existence of the original brain affect the identity status of the mechanical one? And do you think it would matter whether or not one brain were aware of the existence of the other?”

  “Interesting supposition. Reminds me of one of Robert Ettinger’s identity ‘experiments.’”

  “What’s that?”

  “Well, back in the 1960s, Ettinger proposed that we imagine a synthetic brain that could not only replicate the exact function of a particular human brain, but also maintain a total, radio-controlled interconnection with it. Now assume we can use various parts of each brain, the synthetic and the original, while the corresponding part of the other simply lies dormant. You could decide to use, say, the artificial medulla oblongata, hypothalamus, and brain stem, along with all the other parts of your original brain. You with me so far?”

  “Sure.”

  “Okay. Now we start the experiment with a normal, fully conscious woman, and the machine brain switched off. But slowly we start disconnecting various parts of her brain and simultaneously activating the corresponding parts of the machine. She never notices any of it, yet when we finish, the machine controls her body. Does she really become the machine?”

  “I don’t know. It’s pretty confusing.”

  “Wait. It gets worse. Now assume the machine has its own sensory apparatus. And whenever we want to, we can also cut off the woman’s senses, simultaneously activating the machine’s. In other words, we can cause her senses to switch from one body to another, woman’s to machine’s.”

  “Wow. Okay, I still think she’s the same person.”

  “So do I. But there’s more! Now her original brain and body are both fully dormant, and to an observer, she appears to be an unconscious woman, alongside a functioning machine that thinks it’s a woman directing a machine.”

  “Uh, okay.”

  “So now we reactivate her brain. She notices no difference. Then we switch her back to her normal human senses so that the machine is dormant. And we keep switching her back and forth, until she becomes accustomed to it, maybe even preferring to occupy the machine. Eventually she might decide it was irrelevant to her which vessel she occupied; perhaps if her original body were destroyed, she wouldn’t care one way or the other.”

  “Fascinating idea,” Ben said, “but a little too weird for me. Obviously we can’t really do any of that stuff, right?”

  “No, of course not. Not yet, anyway,” Virginia said. “I was just trying to give you another way to look at the nature of identity. But applications of this technology even weirder than Ettinger’s experiment are going to emerge. You just wait. In fact one astounding application’s ready right now. Figured out what it is, yet?”

  Ben considered her question. “I can’t visualize the details yet, but this discussion seems to suggest that your process could make a person effectively, uh, immortal…”

  “Bingo!” I said.

  “Okay, now tell me how.”

  “Well, consider how people die,” I said.

  Ben’s eyes narrowed. “Never from disease or old age anymore. Only from freak accidents, or at their own hand.”

  “Exactly. So wouldn’t you want your DNA pattern and memories stored off-site, on a satellite or space station somewhere, maybe even outside our solar system, just in case you, or even the whole planet, happened to get destroyed in an accident? As long as the accident didn’t destroy every last human, or the medical and scientific knowledge we’ve accumulated, then any information needed to reconstruct you would be saved.”

  “You mean like a backup disc for a twentieth century computer?”

  Virginia nodded. “Except that now we could simply send the information to the storage facility. Even if it’s light-years away. Do that two or three times, and the chance of simultaneously losing all stored ‘yous’ becomes incalculably small.”

  “Amazing,” Ben said. Then he grinned. “Take your singularity and shove it.”

  Virginia’s mouth fell open, then she grinned.

  “We think there ought to be quite a market for it,” I said coyly. “Especially since the complete brain-survey process takes only ninety minutes.”

  “How much would it cost?” Ben asked.

  “Oh, we’ll probably charge two weeks’ earnings for it,” I said, “at least until competition from other companies drives down the price. Our first buyers will be those most able to pay; we’ll recover our costs quickly. In a negligible amount of time, anyone will be able to afford it.”

  “Incredible.” Ben folded his arms, nodded and smiled. “Everyone except the cult-crazies should love this. But you didn’t have me come down here just to tell me about it, did you?”

  “Not quite. Virginia and I decided that since you’re the oldest person in our family whose memories are still intact, we want you to be our inaugural Cache.”

  “Cache?”

  “That’s what we named the process,” Virginia said. “It means hidden reserves; like savings for a rainy day. But in this case, we’re preserving something much more valuable than money. Of all the great treasures in the universe, the most precious by far is information. Because information’s the essence of every human being, and of life itself.”

  She paused, so I turned a big grin on Ben and asked him, “Well, what do you say? How’d you like to become the first immortal?”

  October 20, 2098

  —Seven members of the Canadian karate team are temporarily killed during a morning match against Japan. After molecular reconstruction, team captain Montgomery Paul describes his own death at the hands of world champion Akio Narato as “intensely painful, yet indescribably exciting. And no doubt it’ll be great preparation for the big one in 2308.” A spokeswoman for the Tokyo Humanism Council describes today’s match as “unconscionable and borderline insane. This sort of nihilistic behavior has increased exponentially since the Nemesis discovery. The reasons, I think, are obvious.”—World presidential candidate Sven Langervist proposes doubling income tax rates from 9.5 to 19% of net income to finance a particle collider encircling the sun between the orbits of Jupiter and Saturn. “The potential benefit to humankind of such a research contrivance is unknowable, but it may lead to discoveries that will help thwart disasters from outer space,” declares the candidate, to whom AIs assign a “statistically insignificant” chance of winning next month’s election.—The World Addiction Bureau releases September’s statistics, which show a 7% drop in VR addiction over the previous month. Addiction Czar Bennett Williams declares, “This represents barely a twofold increase over the pre-Nemesis Panic rates, and nearly a 15% decline overall since July’s peak. Suicides are down by 11% as well. Obviously our programs have succeeded brilliantly.” Separately, Williams announces his intention to retire from his post early next year to lecture and write about personal motivation.

  “I’m real glad I did it, Gary. Gives me a comforting feeling, almost like signing up for cryonics did in the 1980s.” Ben tried to express the ideas in terms of his own experience; a thoughtful assessment of the advantages of our Cache service, rather than a plea to his only son. He figured that if he asked his son to do it because he’d hate to ever lose him, Gary might feel pressured. Lord knew, Ben made him feel uncomfortable enough just talking about day-to-day stuff. “In some ways, it was even more soothing.”

  “How so?” Gary asked, grinding his teeth. It seemed the only way to keep his father’s voice from disintegrating into white noise. Hard as it was, Gary found he wanted to listen.

  “Cryonics was like insurance,” Ben explained nervously. “You knew you’d eventually need it. Did you know that after the invention of life insurance, almost a hundred years passed before it caught on?”

  “Yes, I’ve read about that.” With Mnemex now added to all commercial drinking water, both men understood that reading had become the same as knowing.

 
