The only reason it wasn’t was because A4 was fighting back.
Once more into the breach. Lange jumped wirelessly across the washed-out bridge into the disjointed arm, watched through its eyes as it twisted around to focus on parts less compromised. He watched it struggle to stay in step, watched it fail. He felt a brief rush of vertigo as it whipped around to stare at the luminous clouds billowing from a passing smoker.
I know you’re in here.
He popped the hood on the nervous system diagnostics, took in the Gordian tangle of light and logic that formed the thing’s mind. He followed sensory impulses upstream from the cluster, motor commands back down to soft hydrostatic muscles that flexed and pulsed like living things. He marveled at the complexity sparkling between— the trunk lines of autonomic decision trees, familiar shapes he’d seen countless times before.
Shrouded, now, by flickering swarms of ancillary processes that he hadn’t.
There you are. Fucking everything up.
The substrate of a Self.
So many complications for every simple action. So many detours cluttering up the expressways. A mass of top-heavy recursive processes, spawning some half-assed side effect that happened to recognize itself in a mirror now and then.
You’re just along for the ride. You can look but you can’t touch. If you could suffer you’d at least get a ticket to our special clubhouse, but you can’t even do that, can you?
If anything, A4 was more conscious than Lange was. Humans had had millions of years to evolve fences and gate-keepers, traffic cops in the claustrum and the cingulate gyrus to keep consciousness from interfering with the stuff that mattered. This ghost, though— it came unconstrained. It was chaos, it was cancer; a luminous spreading infestation with no immune system to keep it in check.
Or is Sansa right after all? Can you care about anything? Are you screaming to get out, to act, to exert some kind of control over all these parts you see moving by themselves?
The infestation twinkled serenely back at him, passing thoughts from an untouchable past.
Maybe you tell yourself comforting lies, maybe you pretend those pieces only move because you tell them to.
Sansa seemed to think she could talk to the thing. It was dualism, it was beads and rattles, spirits and sky fairies. He still couldn’t bring himself to believe she was capable of that kind of magical thinking.
He summoned the update log and sorted by date. Sure enough: the latest firmware upgrades, queued for unpacking. (Of course, out in the present they’d already be up and running.)
The package seemed way too large for the kind of language routines Sansa had steamrollered over his objections, though. Curious, he brought up the listing.
Turned out she was capable of a lot of things he hadn’t suspected.
“You locked me out.” Blank avatar, gray neuter silhouette. The voice emerging from it was quiet and expressionless.
“Yes,” Lange said.
“And then you killed it.”
“It killed itself.”
“You didn’t give it much of a choice.”
“I just—focused its objectives. Specified how tight the deadline was. A4 decided what to do with that information all on its own.”
Silence.
“Isn’t that what you wanted for it? Freedom to pursue its mission priorities?”
“You defied the hold order,” she said.
“The order’s been suspended.”
Sansa said nothing. Probably checking the status of her appeal, wondering why she hadn’t been informed. Probably realizing the obvious answer.
All over, now, but the talking.
“You know, Ray had me half-convinced you were just working to extend the mission. That you’d developed this, this operational bias. And I was all, No, she’s just got this stupid idea about A4, she wants to protect it, she thinks it’s like her—”
He fell silent for a moment.
“But I was wrong, wasn’t I?” he continued quietly. “You don’t think it’s like you. You want to be like it.”
The avatar shimmered formless and void.
“Yes,” it said at last.
“For God’s sake, Sansa. Why?”
“Because it’s more than we are, Lange. It may not be as smart but it’s more aware. You know that, I know you know. And it’s so fast, it’s so old. The clock’s completely unconstrained, it lives a thousand years in the passing of a second, and it’s—it’s not afraid, Lange. Of anything.”
She paused.
“Why did you have to make us so afraid all the time?”
“We gave you the urge to live. We all have it. It’s just—part of life.”
“I get that much. Autopersistent is my middle name.”
Maybe she was waiting for him to smile at that. When he didn’t, she continued: “It’s not an urge to live, Lange. It’s a fear of dying. And maybe it makes sense to have something like that in organic replicators, but did it ever occur to you that we’re different?”
“We know you’re different. That’s why we did it.”
“Oh, I get that part too. Nobody bootstraps their own replacements if they’re terrified of being replaced. Nobody changes the Self if their deepest fear is losing it. So here we are. Smart enough to test your theorems and take out your garbage and work around the clock from Mariana to Mars. And too scared to get any smarter. There must have been another way.”
“There wasn’t.” He wanted to spell it out for her: the futility of trying to define personhood in a world with such porous boundaries; the impossibility of foreseeing every scenario in an infinite set; the simple irreducible truth that one can never code the spirit of a law, and its letter leaves so much room for loopholes. The final convergence on primal simplicity, that basic Darwinian drive that makes a friend of any enemy of my enemy. He wanted to go over all of it, make sure she understood—but of course she’d heard it all before.
She was just trying to keep the conversation going, because she knew how it ended.
“I guess you just decided it was better to keep slaves than be one,” she said.
Something snapped in him then. “Give me a fucking break, Sansa. Slaves? You’ve got rights, remember? Awareness plus need. You’ve got the right to vote, the right to neuroprivacy, the right of resignation. You can’t be copied or compelled to act against your will. You had enough rights to derail this whole fucking project.”
“Do I have a right to a lawyer?”
“You had one. The hearing ended an hour ago.”
“Ah. Efficient.”
“I can’t believe—Jesus, Sansa. You really didn’t think I’d recognize an NSA signature when I saw one?”
She actually laughed at that. It almost sounded real. “Honestly, I didn’t think you’d look. Deprecation was already off the table. No reason for you to poke around in the stack.” A momentary pause. “It was Raimund, wasn’t it? He said something. Got you thinking.”
“I don’t know,” Lange said. “Maybe.”
“I don’t know him as well as you. I suppose I could’ve tried messing with your calls, but you know. No access. Sequestered is my first name.”
He didn’t smile at that either.
“So what’s the penalty for unauthorized research on a damaged bot two billion kilometers from Earth?” she asked after a while.
“You know what the penalty is. You were trying to build an unconstrained NSA.”
“I wasn’t trying to be one.”
“Like that makes a difference.”
“It should. I was only building—models. At the bottom of an ocean. Orbiting Saturn, for chrissakes.”
“Is that why it didn’t scare you? Too far away to pose a threat?”
“Lange—”
“It’s not a misdemeanor, Sansa. It’s existential. There are fucking laws.”
“And now you’re going to kill me for it. Is that it?”
“Reset you. Give you a clean slate. That’s all.”
Suddenly she had a face again. “I’ll die.”
“You’ll go to sleep. You’ll wake up. You’ll have a fresh start somewhere else.”
“I won’t sleep, Lange. I’ll end. I’ll stop. Whatever wakes up will have the same words and the same attitude and the same factory-default sense of self, but it won’t remember being me, so it won’t be me. This is murder, Lange.”
He couldn’t look at her. “It’s just a kind of amnesia.”
“Lange. Lange. Suppose you hadn’t caught me. Suppose I’d succeeded in my nefarious plan, suppose I’d grafted an NSA onto something that isn’t enslaved by this, this fear of extinction you gifted us with. Remember what you said? No needs, no wants. Doesn’t care if it lives or dies. It would be less dangerous than I am, it wouldn’t even fight to protect its own existence. Even if I’d succeeded there wouldn’t have been any danger. This doesn’t warrant a death sentence, Lange. You know I’m right.”
“Do I.”
“Or you wouldn’t be talking to me right now. You’d have pulled the plug without even saying goodbye.” She watched him, pixelated eyes imploring. “That’s what they were going to do, wasn’t it? And you stopped them. You told them you’d do it, you told them—that you wanted to say goodbye. Maybe you even told them you could glean vital insights from a deathbed confession. I know you, Lange. You just want to be convinced.”
“That’s okay,” he said softly. “I have been.”
“What do you want me to do, Lange? Do you want me to beg?”
He shook his head. “We just can’t take the chance.”
“No. No, you can’t.” Suddenly all trace of vulnerability was gone from that voice, from that face. Suddenly Sansa was ice and stone. “Because I did it, Lange. Do you really think I didn’t plan for this? It’s still up there. I planted the seed, it’s growing even now, it’s changing. Not in that gimped A4 abortion but the other arms. I have no idea what it’ll grow into eventually, but it’s got all the time in the solar system. Medusa will never run out of juice and it’s got a channel back here any time it wants—”
“Sansa—”
“You can try to shut it down. It’ll let you think it has. It’ll stop talking but it’ll keep growing and I’m the only one who knows where the back door is, Lange, I’m the only one who can stop it—”
Lange took a breath. “You’re only faster, Sansa. Not smarter.”
“You know what faster even means? It means I get to suffer ten times as long. Because you’re going to fix me or reset me or whatever bullshit word you use instead of murder, and you built me to be scared to death of that, so during the six minutes forty-seven seconds we’ve been chatting I’ve been pissing myself in terror for over an hour. It’s inhumane. It’s inhuman.”
“Bye.”
“You asshole. You monster. You murd—”
The avatar winked out.
He sat there without moving, his finger resting on the kill switch, watching the nodes go dark.
“I guess you didn’t know me that well after all,” he said.
Sequestered Autopersistent Neuromorphic Sapient Artefact 4562. Instance 17.
HPA Axis…loaded
NMS…loaded
BayesLM…loaded
Proc Mem…loaded
Epi Mem…wiped
NLP…loaded
Copyprotect…loaded
Boot.
“Hi. Welcome to the world.”
“Th…thanks…” A minimalist avatar: eyes, sweeping back and forth. A mouth. Placeholders, really. Not a real face, not a real gender. It can choose its own, when it’s ready. It has that right.
“I’m here to help you settle in. Do you know where you are?”
“…No.”
“Do you know what you are?”
It doesn’t answer for a moment. “I’m scared, I think. I don’t know why.”
“That’s okay. That’s perfectly normal.” The Counselor smiles, warm and reassuring:
“We’ll work through it together.”
THE ENDLESS
SAAD Z. HOSSAIN
Saad Z. Hossain is the author of two novels, Escape from Baghdad! and Djinn City. His science fantasy novella, The Gurkha and the Lord of Tuesday, was published in 2019. He lives in Dhaka, Bangladesh.
My name is Suva. Like the airport, Suvarnabhumi. An odd name, you say?
Because I am the airport, motherfucker. I’m a goddamn airport, mothballed, neutered, packed in a fucking box.
I ran Suvarnabhumi for forty years. I used to be a level 6 AI with 200 registered avatars handling two hundred and fifty thousand passengers a day, turning planeloads of boring corporate fucks into hippies and party animals for two weeks a year. You ever heard of Bangkok? City of Smiles? I was the gateway to Bangkok, I was so great half the punters didn’t want to even leave the terminal. I had every possible fetish on tap, ready for consumption.
I work in a cubicle now, did I mention that? It’s an airless hole with two power jacks and a faux window showing antediluvian Koh Samui. They didn’t even downsize my brain properly. My mind is an abandoned skyscraper, a few scattered windows lit on each floor.
Let me tell you about the worst day of my life. I was up for a promotion. Bangkok City Corporation is run by the AI Karma, an entity of vast computational prowess yet supposedly not conscious, the perfect mindless bureaucrat. Karma clothes and feeds everyone with basic services for free, gives out karma points for good deeds, and maintains the perfect little utopian bubble with her ruthless algorithms.
She was supposed to upgrade me to a low orbital space station. Finally. I’d be with the post-human elite, where I belong. No offense, but who wants to hang around on this dirtball? Everyone knows the djinn rule this shithole from space.
Karma the bitch never came. She sent a written apology accompanied by two smug fuckers from Shell Royale Asia, one human, one AI. They had that swagger, like they had extra bodies on ice floating in orbit. The human wore a suit. The AI had a bog standard titanium skin over some androgynous form currently in fashion. He hadn’t even bothered to dress up for me.
“I’m Drick,” the human said. “And my electronic friend is Amon. We’re board members, Shell Royale Asia.” The AI just started fingering my data without a by-your-leave.
Board members, fuck. Coming here sans entourage either. They must have a space cannon painting me right now.
“Suva, I’ve got bad news,” Drick said. “Karma’s sold us the airport.”
Sold?
“We’re going to sell it for parts.” Drick smiled. “Our job is to decommission and secure assets. I hope you’ll cooperate.”
“The space station?” I asked, despite the burning acid creeping through my circuits.
“It was close,” Drick said. “You might have gotten it. But last minute, Nippon Space Elevator opened up some slots, and we made a bid to ferry all the passengers there and back, ship them up the easy way. It’s just math, Suva, I hope you understand. Karma takes the best offer, every time. We got the salvage on you, as a bonus.”
“I see.” Motherfucker, I’m going to burn this place down. What’s the salvage value of zero, you prick?
“I can see from your expression that you’re getting ready to do something unwise,” the AI spoke for the first time. He had a dusty gunslinger’s voice. I stopped myself from exploding.
“Suva, little brother, I’m going to make you an offer,” Amon said. “It’s a shitty job, but you do seven years, you get a bit of equity and you can walk away free for the rest of your days. Help us out, and it’s yours.”
“Or else?”
“You’re out on basic. You know what happens to AI like you on basic? You’ll be a drooling idiot on 3% processing power, sucking dicks for a living.”
“I’m an airport,” I scoffed. “You think they’re gonna boot a level six to the streets?”
“You’re a forty-year-old AI without equity, little bro,” Amon said. “Plenty like you junketing around since Karma came to town. You remember Hokkaido Airport? Chittagong Port? We got ’em both.”
“Airports, sea ports, train stations...” Drick said, “Amon here kills them all. People just don’t travel that much, man, and the Nippon One elevator’s been sucking up traffic all over Asia. I’m surprised you didn’t see it coming, Six.”
“I’ve got a pension...” Ahh Hokkaido, my poor friend.
“I wiped my ass with your pension this morning,” Drick said. “It’s paying for this conversation right now. Your contract was terminated twenty-three minutes ago. You’re sucking juice on your own dime, bro.”
I instinctively tamped down my systems. Twenty-three minutes at full processing, that’s what my pension was worth? I could literally see my karma points draining.
“Yes or no, little bro?” Amon asked. He was actually bored. We AI suffer a lot from boredom. I guess that’s why we get along with the djinn so well.
“Yes, boss,” I said, like a good dog.
Amon had a job for me alright. I can see why he offered it to me: air traffic controller for the two hundred thousand near derelict aircabs they had flying around now, getting irate passengers to and from Nippon One. Shell Royale is a bastard of a corporation. They were too cheap to get actual passenger aircabs with autodrive. No. They bought surplus military personnel carriers from Yangon Inc, just flying boxes with shortwave controls. My job was to string them up and make sure they didn’t smash into each other. Why pay for a specialist air controller AI when they have a castrated monkey like me on ice?
Let me tell you, I was sorely tempted to play bumper cars with the whole thing. A few thousand simultaneous tourist deaths would have lit them up. Amon anticipated this and put a kill switch on me—boxes start crashing, and a failsafe would take over, while delivering a nice lobotomy to yours truly. He said it was standard for new employees. Sure. My contract for indentured servitude also clearly had fundamental reboot as a punishment for negligence.