Nanotech


by Gardner Dozois


  "Too bad you wouldn't let me be one."

  "Karen . . ."

  "Don't you want to know what the reverser is, Paula? It's engineered from human chorionic gonadotropin. The pregnancy hormone. Too bad you never wanted a baby."

  She went on staring at me. Lollie shrieked and splashed with her frog. Her lips were turning blue. I stood up, laid Timmy next to Lori in the portacrib, and buttoned my blouse.

  "You made an experimental error twenty-five years ago," I said to Paula. "Too small a sample population. Sometimes a frog jumps out."

  I went to lift my daughter from the wading pool.

  AXIOMATIC

  Greg Egan

  Today it's everyone's prerogative to change their minds, but enjoy that freedom while you can—in the nanotech-dominated world of the future, you might not only not be able to change your mind, you might not even be able to want to change it . . .

  Only a bit over halfway through the decade, it's already a fairly safe bet to predict that Australian writer Greg Egan is going to come to be recognized (if indeed he hasn't already been so recognized) as being one of the Big New Names to emerge in SF in the nineties. In the last few years, he has become a frequent contributor to Interzone and Asimov's Science Fiction, and has made sales as well to Pulphouse, Analog, Aurealis, Eidolon, and elsewhere; many of his stories have also appeared in various "Best of the Year" series, and he was on the Hugo Final Ballot in 1995 for his story "Cocoon," which won the Ditmar Award and the Asimov's Readers Award. His first novel, Quarantine, appeared in 1992, to wide critical acclaim, and was followed by a second novel in 1994, Permutation City, which won the John W. Campbell Memorial Award. His most recent books are a collection of his short fiction, Axiomatic, and two new novels, Distress and Diaspora.

  ". . . like your brain has been frozen in liquid nitrogen, and then smashed into a thousand shards!"

  I squeezed my way past the teenagers who lounged outside the entrance to The Implant Store, no doubt fervently hoping for a holovision news team to roll up and ask them why they weren't in school. They mimed throwing up as I passed, as if the state of not being pubescent and dressed like a member of Binary Search was so disgusting to contemplate that it made them physically ill.

  Well, maybe it did.

  Inside, the place was almost deserted. The interior reminded me of a video ROM shop; the display racks were virtually identical, and many of the distributors' logos were the same. Each rack was labeled: Psychedelia. Meditation and Healing. Motivation and Success. Languages and Technical Skills. Each implant, although itself less than half a millimeter across, came in a package the size of an old-style book, bearing gaudy illustrations and a few lines of stale hyperbole from a marketing thesaurus or some rent-an-endorsement celebrity. "Become God! Become the Universe!" "The Ultimate Insight! The Ultimate Knowledge! The Ultimate Trip!" Even the perennial "This implant changed my life!"

  I picked up the carton of You Are Great!—its transparent protective wrapper glistening with sweaty fingerprints—and thought numbly: If I bought this thing and used it, I would actually believe that. No amount of evidence to the contrary would be physically able to change my mind. I put it back on the shelf, next to Love Yourself A Billion and Instant Willpower, Instant Wealth.

  I knew exactly what I'd come for, and I knew that it wouldn't be on display, but I browsed a while longer, partly out of genuine curiosity, partly just to give myself time. Time to think through the implications once again. Time to come to my senses and flee.

  The cover of Synaesthesia showed a blissed-out man with a rainbow striking his tongue and musical staves piercing his eyeballs. Beside it, Alien Mind-Fuck boasted "a mental state so bizarre that even as you experience it, you won't know what it's like!" Implant technology was originally developed to provide instant language skills for business people and tourists, but after disappointing sales and a takeover by an entertainment conglomerate, the first mass-market implants appeared: a cross between video games and hallucinogenic drugs. Over the years, the range of confusion and dysfunction on offer grew wider, but there's only so far you can take that trend; beyond a certain point, scrambling the neural connections doesn't leave anyone there to be entertained by the strangeness, and the user, once restored to normalcy, remembers almost nothing.

  The first of the next generation of implants—the so-called axiomatics—were all sexual in nature; apparently that was the technically simplest place to start. I walked over to the Erotica section, to see what was available—or at least, what could legally be displayed. Homosexuality, heterosexuality, autoerotism. An assortment of harmless fetishes. Eroticisation of various unlikely parts of the body. Why, I wondered, would anyone choose to have their brain rewired to make them crave a sexual practice they otherwise would have found abhorrent, or ludicrous, or just plain boring? To comply with a partner's demands? Maybe, although such extreme submissiveness was hard to imagine, and could scarcely be sufficiently widespread to explain the size of the market. To enable a part of their own sexual identity, which, unaided, would have merely nagged and festered, to triumph over their inhibitions, their ambivalence, their revulsion? Everyone has conflicting desires, and people can grow tired of both wanting and not wanting the very same thing. I understood that, perfectly.

  The next rack contained a selection of religions, everything from Amish to Zen. (Gaining the Amish disapproval of technology this way apparently posed no problem; virtually every religious implant enabled the user to embrace far stranger contradictions.) There was even an implant called Secular Humanist ("You WILL hold these truths to be self-evident!"). No Vacillating Agnostic, though; apparently there was no market for doubt.

  For a minute or two, I lingered. For a mere fifty dollars, I could have bought back my childhood Catholicism, even if the Church would not have approved. (At least, not officially; it would have been interesting to know exactly who was subsidizing the product.) In the end, though, I had to admit that I wasn't really tempted. Perhaps it would have solved my problem, but not in the way that I wanted it solved—and after all, getting my own way was the whole point of coming here. Using an implant wouldn't rob me of my free will; on the contrary, it was going to help me to assert it.

  Finally, I steeled myself and approached the sales counter.

  "How can I help you, sir?" The young man smiled at me brightly, radiating sincerity, as if he really enjoyed his work. I mean, really, really.

  "I've come to pick up a special order."

  "Your name, please, sir?"

  "Carver. Mark."

  He reached under the counter and emerged with a parcel, mercifully already wrapped in anonymous brown. I paid in cash; I'd brought the exact change: $399.95. It was all over in twenty seconds.

  I left the store, sick with relief, triumphant, exhausted. At least I'd finally bought the fucking thing; it was in my hands now, no one else was involved, and all I had to do was decide whether or not to use it.

  After walking a few blocks towards the train station, I tossed the parcel into a bin, but I turned back almost at once and retrieved it. I passed a pair of armored cops, and I pictured their eyes boring into me from behind their mirrored faceplates, but what I was carrying was perfectly legal. How could the Government ban a device which did no more than engender, in those who freely chose to use it, a particular set of beliefs—without also arresting everyone who shared those beliefs naturally? Very easily, actually, since the law didn't have to be consistent, but the implant manufacturers had succeeded in convincing the public that restricting their products would be paving the way for the Thought Police.

  By the time I got home, I was shaking uncontrollably. I put the parcel on the kitchen table, and started pacing.

  This wasn't for Amy. I had to admit that. Just because I still loved her, and still mourned her, didn't mean I was doing this for her. I wouldn't soil her memory with that lie.

  In fact, I was doing it to free myself from her. After five years, I wanted my pointless love, my useless grief, to finally stop ruling my life. Nobody could blame me for that.

  She had died in an armed hold-up, in a bank. The security cameras had been disabled, and everyone apart from the robbers had spent most of the time face-down on the floor, so I never found out the whole story. She must have moved, fidgeted, looked up, she must have done something; even at the peaks of my hatred, I couldn't believe that she'd been killed on a whim, for no comprehensible reason at all.

  I knew who had squeezed the trigger, though. It hadn't come out at the trial; a clerk in the Police Department had sold me the information. The killer's name was Patrick Anderson, and by turning prosecution witness, he'd put his accomplices away for life, and reduced his own sentence to seven years.

  I went to the media. A loathsome crime-show personality had taken the story and ranted about it on the airwaves for a week, diluting the facts with self-serving rhetoric, then grown bored and moved on to something else.

  Five years later, Anderson had been out on parole for nine months.

  OK. So what? It happens all the time. If someone had come to me with such a story, I would have been sympathetic, but firm. "Forget her, she's dead. Forget him, he's garbage. Get on with your life."

  I didn't forget her, and I didn't forget her killer. I had loved her, whatever that meant, and while the rational part of me had swallowed the fact of her death, the rest kept twitching like a decapitated snake. Someone else in the same state might have turned the house into a shrine, covered every wall and mantelpiece with photographs and memorabilia, put fresh flowers on her grave every day, and spent every night getting drunk watching old home movies. I didn't do that, I couldn't. It would have been grotesque and utterly false; sentimentality had always made both of us violently ill. I kept a single photo. We hadn't made home movies. I visited her grave once a year.

  Yet for all of this outward restraint, inside my head my obsession with Amy's death simply kept on growing. I didn't want it, I didn't choose it, I didn't feed it or encourage it in any way. I kept no electronic scrapbook of the trial. If people raised the subject, I walked away. I buried myself in my work; in my spare time I read, or went to the movies, alone. I thought about searching for someone new, but I never did anything about it, always putting it off until that time in the indefinite future when I would be human again.

  Every night, the details of the incident circled in my brain. I thought of a thousand things I "might have done" to have prevented her death, from not marrying her in the first place (we'd moved to Sydney because of my job), to magically arriving at the bank as her killer took aim, tackling him to the ground and beating him senseless, or worse. I knew these fantasies were futile and self-indulgent, but that knowledge was no cure. If I took sleeping pills, the whole thing simply shifted to the daylight hours, and I was literally unable to work. (The computers that help us are slightly less appalling every year, but air-traffic controllers can't daydream.)

  I had to do something.

  Revenge? Revenge was for the morally retarded. Me, I'd signed petitions to the UN, calling for the worldwide, unconditional abolition of capital punishment. I'd meant it then, and I still meant it. Taking human life was wrong; I'd believed that, passionately, since childhood. Maybe it started out as religious dogma, but when I grew up and shed all the ludicrous claptrap, the sanctity of life was one of the few beliefs I judged to be worth keeping. Aside from any pragmatic reasons, human consciousness had always seemed to me the most astonishing, miraculous, sacred thing in the universe. Blame my upbringing, blame my genes; I could no more devalue it than believe that one plus one equaled zero.

  Tell some people you're a pacifist, and in ten seconds flat they'll invent a situation in which millions of people will die in unspeakable agony, and all your loved ones will be raped and tortured, if you don't blow someone's brains out. (There's always a contrived reason why you can't merely wound the omnipotent, genocidal madman.) The amusing thing is, they seem to hold you in even greater contempt when you admit that, yes, you'd do it, you'd kill under those conditions.

  Anderson, however, clearly was not an omnipotent, genocidal madman. I had no idea whether or not he was likely to kill again. As for his capacity for reform, his abused childhood, or the caring and compassionate alter ego that may have been hiding behind the facade of his brutal exterior, I really didn't give a shit, but nonetheless I was convinced that it would be wrong for me to kill him.

  I bought the gun first. That was easy, and perfectly legal; perhaps the computers simply failed to correlate my permit application with the release of my wife's killer, or perhaps the link was detected, but judged irrelevant.

  I joined a "sports" club full of people who spent three hours a week doing nothing but shooting at moving, human-shaped targets. A recreational activity, harmless as fencing; I practiced saying that with a straight face.

  Buying the anonymous ammunition from a fellow club member was illegal; bullets that vaporized on impact, leaving no ballistics evidence linking them to a specific weapon. I scanned the court records; the average sentence for possessing such things was a five-hundred-dollar fine. The silencer was illegal, too; the penalties for ownership were similar.

  Every night, I thought it through. Every night, I came to the same conclusion: despite my elaborate preparations, I wasn't going to kill anyone. Part of me wanted to, part of me didn't, but I knew perfectly well which was strongest. I'd spend the rest of my life dreaming about it, safe in the knowledge that no amount of hatred or grief or desperation would ever be enough to make me act against my nature.

  I unwrapped the parcel. I was expecting a garish cover—sneering body builder toting sub-machine-gun—but the packaging was unadorned, plain grey with no markings except for the product code, and the name of the distributor, Clockwork Orchard.

  I'd ordered the thing through an on-line catalogue, accessed via a coin-driven public terminal, and I'd specified collection by "Mark Carver" at a branch of The Implant Store in Chatswood, far from my home. All of which was paranoid nonsense, since the implant was legal—and all of which was perfectly reasonable, because I felt far more nervous and guilty about buying it than I did about buying the gun and ammunition.

  The description in the catalogue had begun with the statement Life is cheap! then had waffled on for several lines in the same vein: People are meat. They're nothing, they're worthless. The exact words weren't important, though; they weren't a part of the implant itself. It wouldn't be a matter of a voice in my head, reciting some badly written spiel which I could choose to ridicule or ignore; nor would it be a kind of mental legislative decree, which I could evade by means of semantic quibbling. Axiomatic implants were derived from analysis of actual neural structures in real people's brains, they weren't based on the expression of the axioms in language. The spirit, not the letter, of the law would prevail.

  I opened up the carton. There was an instruction leaflet, in seventeen languages. A programmer. An applicator. A pair of tweezers. Sealed in a plastic bubble labeled sterile if unbroken, the implant itself. It looked like a tiny piece of gravel.

  I had never used one before, but I'd seen it done a thousand times on holovision. You placed the thing in the programmer, "woke it up," and told it how long you wanted to be active. The applicator was strictly for tyros; the jaded cognoscenti balanced the implant on the tip of their little finger, and daintily poked it up the nostril of their choice.

  The implant burrowed into the brain, sent out a swarm of nanomachines to explore, and forge links with, the relevant neural systems, and then went into active mode for the predetermined time—anything from an hour to infinity—doing whatever it was designed to do. Enabling multiple orgasms of the left kneecap. Making the color blue taste like the long-lost memory of mother's milk. Or, hard-wiring a premise: I will succeed. I am happy in my job. There is life after death. Nobody died in Belsen. Four legs good, two legs bad . . .

  I packed everything back into the carton, put it in a drawer, took three sleeping pills, and went to bed.

  Perhaps it was a matter of laziness. I've always been biased towards those options which spare me from facing the very same set of choices again in the future; it seems so inefficient to go through the same agonies of conscience more than once. To not use the implant would have meant having to reaffirm that decision, day after day, for the rest of my life.

  Or perhaps I never really believed that the preposterous toy would work. Perhaps I hoped to prove that my convictions—unlike other people's—were engraved on some metaphysical tablet that hovered in a spiritual dimension unreachable by any mere machine.

  Or perhaps I just wanted a moral alibi—a way to kill Anderson while still believing it was something that the real me could never have done.

  At least I'm sure of one thing. I didn't do it for Amy.

  I woke around dawn the next day, although I didn't need to get up at all; I was on annual leave for a month. I dressed, ate breakfast, then unpacked the implant again and carefully read the instructions.

  With no great sense of occasion, I broke open the sterile bubble and, with the tweezers, dropped the speck into its cavity in the programmer.

  The programmer said, "Do you speak English?" The voice reminded me of one of the control towers at work; deep but somehow genderless, businesslike without being crudely robotic—and yet, unmistakably inhuman.

  "Yes."

  "Do you want to program this implant?"

  "Yes."

  "Please specify the active period."

  "Three days." Three days would be enough, surely; if not, I'd call the whole thing off.

  "This implant is to remain active for three days after insertion. Is that correct?"

  "Yes."

  "This implant is ready for use. The time is seven forty-three a.m. Please insert the implant before eight forty-three a.m., or it will deactivate itself and reprogramming will be required. Please enjoy this product and dispose of the packaging thoughtfully."

 
