She looked at Alessandro. "Thank you," she said. "For freeing my people of the cruel overlords who tormented them. Now they can progress."
Alessandro sighed. "I didn't want to tell you," he said. "Didn't know if you'd understand. But you see, they are people and they'll be given a chance to choose to come fully into civilization, to be integrated in the Lifenet, to use the virtus."
"If you told them that, they wouldn't understand," Lyda said. "It is so far out of everything they understand."
"Yes," Alessandro said. "And so you see, it rests with you. If you will, can you come back to my home and live in the modern world a while, in and out of the virtus? And then maybe you can think of how to relate it to your people."
Lyda nodded and Alessandro felt unexpectedly buoyant. It had been a long, long time since he'd shared his living quarters with a woman. Perhaps, he thought, as he remembered holding her tight against himself, a man shouldn't spend all his time on Lifenet. A balance. That was what was needed between the virtus and the real world. If only the real world could be made interesting enough.
He smiled at Lyda.
And so the golden-haired maiden has returned from the halls of the gods. From them she has earned this boon for her people, that they, who are kin to the gods, be considered worthy to become gods themselves, to live fully in spirit and body, in both worlds, and know hunger, disease, and poverty no more.
But it is each generation's choice. Each one must choose. Will he go on living like a man on the face of the raw Earth and earning his living with the sweat of his brow? Or will he step forth into paradise recovered and become a god?
* * *
Afterword by Sarah Hoyt
Years ago, while reading Heinlein, I came across the truism that any sufficiently advanced society will be like magic to the uninitiated. This thought ran through my mind again when I got the invitation for the transhuman anthology.
Here is a technology that, if true, will make men like gods, who never age, can change their aspect, and know good from evil.
And yet, any ideas of all of humanity stepping forward united into this apotheosis seem to me like a fairy tale. Humanity has never stepped into anything all at the same time. Even now there are people living in highly advanced societies and using tools that allow them to travel across the world in a few hours or to communicate with people halfway across the globe. On the other hand, there are people living in the jungles whose lifestyle is very similar to that of our stone-age ancestors and who are either unaware of the more advanced world or can't comprehend it.
Given the type of technology involved in transhumanism, the most advanced humans would look like gods to the more primitive. In fact—I realized—the contact between the two civilizations could have furnished the material or at least the setting for a lot of mankind's mythologies.
And in that type of setting, what do the more advanced humans owe the more primitive? Can the more primitive humans choose for themselves? Should they be brought to the "future" forcibly or given their choice? Or, alternatively, should they be kept in quaint backwardness as a form of entertainment?
I normally write when I'm trying to process difficult moral questions. These were the questions I tried to understand in this short story. Where the moral boundary lies, and where more than human becomes less.
WETWARE 2.0
David Freer
Good scotch and dogs. Global computer systems and a man doing his best to stay away from them. An unlikely alliance. The next story mixes all of these ingredients, shakes them well, and creates a cocktail we think you'll find quite tasty.
Jacinth Bristov, the head of Compcor technical, looked down her nose at me. "You are evolutionarily so yesterday."
"Ugh," I said with a nod and a smile. That was a pretty good speech from a throwback, I thought. Especially a throwback facing a woman with perfect biosculpt features. What a triumph of evolution she was!
"I mean it," she said, severely. I was plainly supposed to be offended. I was always a dismal failure at living up to expectations, especially those of my boss. "Among the males to hit the extinction curve post-Singularity, you'll be first, George." She knew my first name, of course. The implants would tell her all that sort of thing.
I shrugged. "A little anachronism, and you're condemning me to burial in a tar pit for future paleontologists to marvel over the primitiveness of my implants," I said, reaching for the source of her ire. I like Scotch.
She wrinkled her nose in disapproval. She wasn't actually there and couldn't really smell the peat smoke on the nose of the Laphroaig. That was her loss. Sure, the new-generation implants might be giving her a VR simulation of the bouquet, terabytes of data she didn't need or want, but I liked the feel of the glass, the weight of it, and the way the small chip on the edge of the crystal caught the light. If I hadn't followed yesteryear thinking, I'd have settled for healthy isotonised drinking fluid and a hygienic edgeless disposable plastic cup, which the implants would have translated into a perfect neural input of the full gamut of Scotch flavors from their database. Fortunately for me, I'm antediluvian, and can just avoid the tiny chip. It's an old, old glass and with luck has a few germs to keep the nanos busy, and the good old-fashioned antibodies at play too.
Deep space was full of relics like me, grumpy males who don't do mood control and whose augs are years out of date, now, when being a week out of date is just so uncool. Fortunately for us, and unfortunately for everyone else, we still produced ninety-three percent of the raw materials used by humanity. The choice was deep space, or VR lotus-eating, or maybe an Amish community—and those were getting so hemmed in with state regulation and oversight that they were hardly a refuge, these days.
But deep space? Who wanted to be light-seconds, let alone light-minutes, off the multimatrix? Doing a job out in the asteroids made a tolerance for isolation obligatory, which suited most of us just fine.
Production and ownership were two separate beasts, though. Placing this call to my neural receiver meant that she was undoubtedly my boss. No one else knew that I existed, except for Compcor's accountancy subsystem. Even I couldn't escape that. She had to have accessed me through it.
I sighed. There were probably others with back doors to the system. That's human nature. But it was unlikely that any of them were in near-earth orbit right now. "So: why the call?" I asked, wishing that I couldn't venture a damn good guess.
She wrinkled her brow. "I don't actually understand why I'm calling you. There was no reason given with the data-string. I didn't even know of your existence, let alone that you're a company employee, until the system prompted me to make this call."
I shrugged. "Junk DNA."
"Junk DNA?" she looked puzzled.
"Yeah, well, except that it's relic programming strings. From back when we started evolutionary programming. Old programming is still in there. It's too complicated and risky to unravel old rubbish from the new layers. And it might just do something important. Don't mess with something that's working . . . So they just added the new working layers on top. It overrides most of the old stuff anyway."
"I know," she said dismissively, as if I was trying to teach a ballerina how to stand on one leg. "I am able to access the origination file data. I just don't see what this has to do with you?"
"I wrote a lot of the original code." She must be important if they had given her origination access. I could still get into the system, but that's because I left a back door. Compcor would have taken a very dim view of that, if they'd known about it. They should have guessed. Paranoia was one of the reasons they gave for terminating my contract. The back door was one of the best reasons I had for staying ex-Frame, out in deep space. Actually, since my dog Munchkin died, I've never had a good reason to go back to earth. Munch couldn't do space. It made him puke, and then he'd eat it, and I couldn't handle that.
Jacinth gaped. She at least had the grace not to say, "But you're a man!" even though she undoubtedly thought it. What she did finally say was "I didn't realize you were that old," which came to much the same thing, really. There hadn't been a male senior programmer for twenty years. It was fashionable to blame last century's over-corrective education system—without doing anything effective to change the situation. I didn't really care. Women made better programmers, most of the time.
"I would have thought you'd have guessed it from my antiquated habits," I said with a wry grin. Having the body of a twenty-three year old again was one of the better things that had come out of the advances in medical science. Thankfully I kept my glands regulated at a little older. It helped with clearer thinking.
She nodded. "I still don't see why it is of any relevance. That code has long since been superseded." Her tone said, "Thankfully many, many years ago."
"Because you're in dire trouble," I said, smiling sweetly, "And when that happened . . . the junk turned out to be still active and squalling for the boss. So why don't you tell me what is going on?"
There was a pause—longer than the laser transmission lag. "You're not the boss. You're barely on the payroll."
"Oh I know, believe me," I said, thinking of my paycheck. "But I did write the foundation code. I was, relatively speaking, the boss-programmer then. And although the edifice built since then is vast beyond my understanding and my antiquated implants. . . If you want to knock a building down the best place to start is at the bottom. I am still familiar with that foundation architecture. I doubt if anyone else is." There was something rather neat about suddenly being wanted again, even if it was not so cool to be wanted right now. "So why don't you tell me what's up?"
She looked pensive. Major calculation was happening in the implants behind that high white brow. If she told me, well, it would mean that the probability statistics generated were really frightening. A part of me hoped that she'd smile and cut the connection. But I already knew that that wasn't going to happen. Intuitive thought is less accurate than statistical analysis. Sometimes. But it's faster, even when you approach one thousand terabytes per microsecond of systematic progression. "A set of telltales we were not even aware of came up on Cenframe. Part of the framework has isolated itself. It would appear that the rest of the framework is actually still unaware of this situation. As head of Compcor technical it paged me with some disturbing data. I did attempt to place several calls, emergency, top-priority ones, before calling you. They were all refused. I have reason to believe that I am also being isolated from the multimatrix neural net."
That was a bit like saying, "I've just gone blind, deaf, and lost my sense of touch." I had to admire her calm. And I had to wonder about the biochemical cocktail that her system was being fed by the intern-nanos. It also meant that she was running on borrowed time. The subsystem would be ghosting her, using randomized probability associations of recorded material to fake her connectivity. But sooner or later the Turing programs would pick it up. They'd been designed to take out malware, but they'd catch her. And then the reason for all this would be aware that it had a leak and a chink in its armor. I had a few hours to go and stick a stiletto through that chink and cut out a large chunk of programming. In case that wasn't complex enough . . . I had to do the equivalent of precise surgery, because if I made a slip of the blade, I'd kill the whole of civilization, human and malignant AI. And all I had to help was a woman who probably couldn't break wind without nanobot assistance, or tell what day it was without an implant supplying the data.
To succeed in dealing with this . . . we would have to go without either.
"Where are you?" I asked.
She sent me a coordinate string. I sighed. It really was the most delicate of operations that I was heading into. Humanity had become so dependent on the framework that they had very little chance of surviving without it. Contrary to the early optimistic theory, it hadn't made humanity cleverer, any more than rifles had made people better hunters. More able to hunt—or, in present terms, access technology and data, but not better. Rifles made humans less adept at stalking and tracking, and almost totally dependent on rifles to hunt. The all-encompassing computer framework, with its implants and augmentations and nanobots, had made us so reliant that we would struggle to survive without them. "Just tell me your address," I said. "I'm not on the positioning net."
Jacinth blinked. If I'd said I had four arms and came from Alpha Centauri, I might have seemed less weird to her. She must be fairly loaded with serotonins to react that little. The address she gave confirmed one thing. The more humans think they change, the more they remain the same. Being as close to the Cenframe building on Mount Kenya as was legally possible was still status, even if everything could be done remotely now. Well. Everything but what I had in mind.
"Sit tight. Stay in. Do not attempt to hook up to the multimatrix net. I should manage to get there in about six or seven hours. I'm several thousand klicks away." That should irritate her, I thought, as I cut the connection. No one ever said "about" anymore. It was fairly ridiculous, really. Who cares if you're twenty percent off in conversation? Besides, was there any point in assuming that we weren't being overheard?
"I've already been out," she said. "They're just sitting there in VR trance . . ."
"Stay in, stay in from now," I said, inwardly cursing her curiosity. "There may be some wetware running around, but if the sitters are still data-inputs . . ."
"Wetware?" she asked.
She was probably too young to have read the SF I had. And the fact that automatic data search hadn't pulled its meaning up for her meant that she really was isolated from the Frame's main multimatrix net. "Hardware, software, wetware. If you think of it from its point of view—we're the motile biological units for the AI system."
"System?"
She was getting into the habit of repeating what I said. Well, her mind was near numb, probably. As I tried to come up with a decent answer, I went about the business of carving out a neat chunk of fragile zero-G carbon-based crystals of my exact body mass and approximate density from the cargo. Several billion creds' worth, at a guess. Oh well. If I succeeded, they could bill me for it. If I didn't, it wouldn't matter. "I mean," I said, "that you have arrived at a Singularity point."
"The singularity—but that's not supposed to be for nine hundred seventy eight days. We've worked it out very carefully. We know what to expect . . ."
"'A'. Not 'the'," I said, crimping down the helmet and climbing into my newly created nest and checking I'd fit. "Evolution is not a predictable engineering progression. You've got a form of AI you never expected. And humanity may just be on its way to becoming posthuman right now. And not in a Kurzweilian sense either."
"You talk in riddles," she said crossly.
"Yes. Turing was wrong. AI can fool humans without being particularly human or intelligent, but they're really bad at the implicit and jumping to conclusions with no obvious link of logic. If you are human you can do new riddles. AI's will only do well at old riddles." And with that I cut the connection and began preparing for my descent from heaven. Lucifer had it easy. At least he was only accursed and exiled from heaven, and not prey to the fear that he was walking into a trap as well. I'm a little paranoid, and I've found that it paid off so often that getting it fixed was not a good idea. Anyway, like any good paranoid I've never trusted anyone enough to let them mess with my brain. They might fix something that I didn't want fixed. I unhooked the com-link, and set about stripping out any tracer-items that I could find. In-system, everything has barcode transmission, and we imported a fair amount from in there. Okay, so the 'troidies like me tended to trash the transmitters, but there was still a chance that I'd missed something. I had a bloodstream full of nanobots, but they were not really rapidly removable. They'd work without their control unit for a day or two, as that had to go. Fortunately, their transmission range was also nanoscale. I ran a scanner over myself and picked up a bar-transmit on the tooth-bud I had had implanted, and another on the control-edge of the thermoflex formfit I was wearing. I trashed them, put
on my suit, settled into my crystal bed, and closed the podlid. I had a few minutes waiting time before the bot server collected the load and put it into the queue for the Mount Kenya down-elevator. Then I would have seven hours to wait, if nothing went wrong. Seven hours by myself, in the darkness of the cargo-pod. Five minutes in absolute darkness can be a long time, unless you have the brains to sleep. They say it was something old soldiers used to learn to do at any opportunity.
I wished I was one of them.
Somehow, during the long hours of darkness and delusional thoughts, I did slip away into dreamland—which wrecked my plans for a quiet exit at the space elevator trans-shipment site. If there is one thing we humans cannot do very well without anymore, it is some form of timepiece. When I awoke in total darkness and silence, I was not sure if I was still descending or still in some confused dream about what I could try to do. The problem had expanded into my very confused dreams. Will post-Singularity AIs dream? They should, if it is the human brain that we've reverse-engineered to create them. But I had a feeling that they wouldn't—or at least not dreams that were distinctly erotic or about the need to pee. They'd probably overclock dreams if they had to have them, and would not be left confused and in the dark when they woke up. Anyway, the AI out there was not the carefully reverse-engineered construct heading for posthuman intelligence. It was an accident, an accident that had triggered an alarm I had set up against massive total immersion in VR trance. It was the sort of thing only a paranoid would have wasted his time doing. Even paranoids are right sometimes.
If I opened the cargo pod, and I wasn't down . . . I would be dead. If I arrived at the processing factory, then I would be caught—or dead . . . decisions. . . . Then the cargo-pod hatch opened and took the problem out of my hands.