Asimov's SF, September 2009


by Dell Magazine Authors


  Then he looked at the monitor, a gift from his dad, which told him exactly how good he was. He felt the sickness in his hands, the trembling of comparison....

  Victor reached over and shut the monitor off.

  Copyright © 2009 Ferrett Steinmetz


  * * *

  Novelette: SOULMATES

  by Mike Resnick & Lezli Robyn

  According to Locus Magazine, Mike Resnick is the all-time leading short fiction award winner; most of those stories appeared here. His current books are Stalking the Dragon, Hazards, and Dreamwish Beasts and Snarks. Lezli Robyn is an Australian who broke into print during the past year, and has sold six stories thus far. She is working on her first novel. This talented collaborative duo presents us with a touching look at an unlikely couple of...

  Have you ever killed someone you love—I mean, really love?

  I did.

  I did it as surely as if I'd fired a bullet into her brain, and the fact that it was perfectly legal, that everyone at the hospital told me I'd done a humane thing by giving them permission to pull the plug, didn't make me feel any better. I'd lived with Kathy for twenty-six years, been married to her for all but the first ten months. We'd been through a lot together: two miscarriages, a bankruptcy, a trial separation twelve years ago—and then the car crash. They said she'd be a vegetable, that she'd never think or walk or even move again. I let her hang on for almost two months, until the insurance started running out, and then I killed her.

  Other people have made that decision and they learn to live with it. I thought I could, too. I'd never been much of a drinker, but I started about four months after she died. Not much at first, then more every day until I'd reach the point, later and later each time, where I couldn't see her face staring up at me anymore.

  I figured it was just a matter of time before I got fired—and you have to be pretty messed up to be fired as a night watchman at Global Enterprises. Hell, I didn't even know what they made, or at least not everything they made. There were five large connected buildings, and a watchman for each. We'd show up at ten o'clock at night, and leave when the first shift came on at seven in the morning—one man and maybe sixty robots per building.

  Yeah, being sacked was imminent. Problem was, once you've been fired from a job like this, there's nothing left but slow starvation. If you can't watch sixty pre-programmed robots and make sure the building doesn't blow up, what the hell can you do?

  I still remember the night I met Mose.

  I let the Spy Eye scan my retina and bone structure, and after it let me in I went directly to the bottle I'd hidden in the back of the washroom. By midnight I'd almost forgotten what Kathy looked like on that last day—I suppose she looked pretty, like she always did, but innocent was the word that came to mind—and I was making my rounds. I knew that Bill Nettles—he was head man on the night shift—had his suspicions about my drinking and would be checking up on me, so I made up my mind to ease off the booze a little. But I had to get rid of Kathy's face, so I took one more drink, and then next thing I knew I was trying to get up off the floor, but my legs weren't working.

  I reached out for something to steady myself, to lean against as I tried to stand, and what I found was a metal pillar, and a foot away was another one. Finally my eyes started focusing, and I saw that what I had latched onto were the titanium legs of a robot that had walked over when it heard me cursing or singing or whatever the hell I was doing.

  “Get me on my feet!” I grated, and two strong metal hands lifted me to my feet.

  “Are you all right, sir?” asked the robot in a voice that wasn't quite a mechanical monotone. “Shall I summon help?”

  "No!" I half-snapped, half-shouted. “No help!”

  “But you seem to be in physical distress.”

  “I'll be fine,” I said. “Just help me to my desk, and stay with me for a few minutes until I sober up.”

  “I do not understand the term, sir,” it said.

  “Don't worry about it,” I told him. “Just help me.”

  “Yes, sir.”

  “Have you got an ID?” I asked as he began walking me to my desk.

  “MOZ-512, sir.”

  I tried to pronounce it, but I was still too drunk. “I will call you Mose,” I announced at last. “For Old Man Mose.”

  “Who was Old Man Mose, sir?” he asked.

  “Damned if I know,” I admitted.

  We reached the desk, and he helped me into the chair.

  “May I return to work, sir?”

  “In a minute,” I said. “Just stick around long enough to make sure I don't have to run to the bathroom to be sick. Then maybe you can go.”

  “Thank you, sir.”

  “I don't remember seeing you here before, Mose,” I said, though why I felt the need to make conversation with a machine still eludes me.

  “I have been in operation for three years and eighty-seven days, sir.”

  “Really? What do you do?”

  “I am a troubleshooter, sir.”

  I tried to concentrate, but things were still blurry. “What does a troubleshooter do, besides shoot trouble?” I asked.

  “If anything breaks on the assembly line, I fix it.”

  “So if nothing's broken, you have nothing to do?”

  “That is correct, sir.”

  “And is anything broken right now?” I asked.

  “No, sir.”

  “Then stay and talk to me until my head clears,” I said. “Be Kathy for me, just for a little while.”

  “I do not know what Kathy is, sir,” said Mose.

  “She's not anything,” I said. “Not anymore.”

  “She?” he repeated. “Was Kathy a person?”

  “Once upon a time,” I answered.

  “Clearly she needed a better repairman,” said Mose.

  “Not all things are capable of repair, Mose,” I said.

  “Yes, that is true.”

  “And,” I continued, remembering what the doctors had told me, “not all things should be repaired.”

  “That is contradictory to my programming, sir,” said Mose.

  “I think it's contradictory to mine, too,” I admitted. “But sometimes the decisions we have to make contradict how we are programmed to react.”

  “That does not sound logical, sir. If I act against my programming it would mean that I am malfunctioning. And if it is determined that my programming parameters have been compromised, I will automatically be deactivated,” Mose stated matter-of-factly.

  “If only it could be that easy,” I said, looking at the bottle again as a distorted image of Kathy swam before my eyes.

  “I do not understand, sir.”

  Blinking away dark thoughts, I looked up at the expressionless face of my inquisitor, and wondered: Why do I feel like I have to justify myself to a machine? Aloud I said, “You don't need to understand, Mose. What you do have to do is walk with me while I start my rounds.” I tried unsuccessfully to stand when a titanium arm suddenly lifted me clear out of the seat, settling me down gently beside the desk.

  “Don't ever do that again!” I snapped, still reeling from the effects of alcohol and the shock at being manhandled, if that term can be applied to a robot, so completely. “When I need help, I'll ask for it. You will not act until you are given permission.”

  “Yes, sir,” Mose replied so promptly that I was taken aback.

  Well, there's no problem with your programming, I thought wryly, my embarrassment and alcohol-fueled anger dissipating as I gingerly started out the door and down the corridor.

  I approached the first Spy Eye checkpoint on my rounds, allowing it to scan me so I could proceed into the next section of the building. Mose obediently walked with me, always a step behind as protocol decreed. He had been ordered not to enter Section H, because he wasn't programmed to repair the heavy machinery there, so he waited patiently until I'd gone through it and returned. The central computer logged the time and location of each scan, which let my supervisor know if I was completing my rounds in a timely fashion—or, as was becoming the case more and more often, if I was doing them at all. So far I'd received two verbal warnings and a written citation regarding my work, and I knew I couldn't afford another one.

  As we made our way through the Assembly Room I begrudgingly had to lean on Mose several times. I even had to pause to wait for the room to stop spinning. During that second occasion I watched the robots assigned to this section going about their tasks, and truly looked at them for the first time.

  I was trying to put a finger on why their actions seemed ... well, peculiar—but I couldn't tell. All they were doing was assembling parts—nothing strange about that. And then it hit me: It was their silence. None of them interacted with each other except to pass objects—mostly tools—back and forth. There was no conversation, no sound to be heard other than that of machines working. I wondered why I had never noticed it before.

  I turned to Mose, whose diligent focus remained on me rather than the other robots. “Don't you guys ever speak?” I asked, with only a slight slur detectable in my speech. The effect of the alcohol was wearing off.

  “I have been speaking to you, sir,” came his measured reply.

  Before I could even let out an exasperated sigh or expletive, Mose cocked his head to one side as if considering. I had never seen a robot affect such a human-like mannerism before.

  “Or are you inquiring whether we speak among ourselves?” Mose asked, and waited for me to nod before proceeding. “There is no need, sir. We receive our orders directly from the main computer. We only need to speak when asked a direct question from a Superior.”

  “But you have been asking questions of me all night, and even offering opinions,” I pointed out, suddenly realizing that it was Mose's behavior I found peculiar, not the others who were working on the assembly line. I wasn't used to robots interacting with me the way Mose had been doing for the past half hour.

  I could almost see the cogs working in his head as he considered his reply. “As a troubleshooter I have been programmed with specific subroutines to evaluate, test, and repair a product that is returned to the factory as faulty. These subroutines are always active.”

  “So in other words, you've been programmed with enough curiosity to spot and fix a variety of problems,” I said. “That explains the questions, but not your ability to form opinions.”

  “They are not opinions, sir,” he said.

  “Oh?” I said, annoyed at being contradicted by a machine. “What are they, then?”

  “Conclusions,” replied Mose.

  My anger evaporated, to be replaced by a wry smile. I would have given him a one-word answer—“Semantics”—but then I'd have had to spend the next half hour explaining it.

  We talked about this and that, mostly the factory and its workings, as I made my rounds, and oddly enough I found his company strangely comforting, even though he was just a machine. I didn't dismiss him when I had successfully completed my first circuit of the building, and he wasn't called away for any repairs that night.

  It was when the first rays of sunlight filtered in through the dust-filmed windows that I realized my time with Mose had been the only companionship I'd shared with anyone (or, in this case, anything) since Kathy had died. I hadn't let anyone get close to me since I had killed her, and yet I'd spoken to Mose all night. Okay, he wasn't the best conversationalist in the world, but I had previously pushed everyone away for fear that they would come to harm in my company, as Kathy had. That was when it hit me: A robot can't come to harm in my company, because I can't cause the death of something that isn't alive in the first place.

  On the train home from work, I considered the ramifications of that observation as I reflected on the last thing we'd talked about before I'd dismissed Mose to his workstation. I'd been reaching for my bottle in order to stash it away in its hiding place when he had startled me with another of his disarming opinions.

  “That substance impairs your programming, sir. You should refrain from consuming it while you work.”

  I had glared at him, an angry denial on the tip of my tongue, when I realized that I was more alert than I had been in months. In fact, it was the first time I'd completed my rounds on schedule in at least a week. And all because I hadn't had a drop of alcohol since the start of my shift.

  The damned robot was right.

  I looked at him for a long minute before replying, “My programming was impaired before I started drinking, Mose. I'm damaged goods.”

  “Is there anything I can repair to help you function more efficiently, sir?” he inquired.

  Startled speechless, I considered my answer—and this time it wasn't the effects of alcohol that had me tongue-tied. What on earth had prompted such unsolicited consideration from a robot?

  I looked closely at the robot's ever-impassive face. It had to be its troubleshooting programming. “Humans aren't built like machines, Mose,” I explained. “We can't always fix the parts of us that are faulty.”

  “I understand, sir,” Mose replied. “Not all machines can be repaired either. However, parts of a machine that are faulty can be replaced by new parts to fix them. Is that not the same with humans?”

  “In some cases,” I replied. “But while we can replace faulty limbs and most organs with artificial ones, we can't replace a brain when its function is impaired.”

  Mose cocked his head to the side again. “Can it not be reprogrammed?”

  I paused, considering my answer carefully. “Not in the way you mean it. Sometimes there is nothing left to be programmed again.” A heart-achingly clear image of Kathy laughing at one of my long-forgotten jokes flashed painfully through my mind, followed by a second image of her lying brain-dead on her bed in the hospital.

  My fingers automatically twitched for the bottle in front of me, as I forced myself to continue, if only to banish the image of Kathy from my mind. “Besides, human minds are governed to a great extent by our emotions, and no amount of reprogramming can control how we will react to what we feel.”

  “So emotions are aberrations in your programming then?”

  I almost did a double take. I'd never looked at it that way before. “Not exactly, Mose. Our emotions might lead us to make mistakes at times, but they're the key element that allows us to be more than just our programming.” I paused, wondering how in hell I was supposed to adequately describe humanity to a machine. “The problem with emotions is that they affect each of our programs differently, so two humans won't necessarily make the same decision based on the same set of data.”

  The sound of a heart monitor flatlining echoed through the bypasses of my mind. Did I make the right decision, and if so, why did it still torture me day and night? I didn't want to think about Kathy, yet every one of my answers led to more thoughts of her.

  Suddenly, I realized that Mose was speaking again, and despite the strong urge to reach forward, grab the bottle and take a mind-numbing swig, I found I was curious to hear what he had to say.

  “As a machine I am told what is right and wrong for me to do by humans,” he began. “Yet, as a human your emotions can malfunction even when you do something that is meant to be right. It seems apparent that humans have a fundamental flaw in their construction—but you say that this flaw is what makes you superior to a machine. I do not understand how that can be, sir.”

  I'll tell you, he was one goddamned surprising machine. He could spot a flaw—in a machine or in a statement—quicker than anyone or anything I'd ever encountered. All I could think of was: how the hell am I going to show you you're wrong, when I don't know if I believe it myself?

  I picked up the bottle, looking at the amber liquid swish hypnotically for a minute before reluctantly stashing it in the back of my desk drawer so I could focus all of my attention on Mose.

  “There is something unique about humans that you need to know if you are to understand us,” I said.

  “And what is that, sir?” he asked dutifully.

  “That our flaws, by which I mean our errors in judgment, are frequently the very things that enable us to improve ourselves. We have the capacity to learn, individually and collectively, from those very errors.” I don't know why he looked unconvinced, since he was incapable of expression, but it made me seek out an example that he could comprehend. “Look at it this way, Mose. If a robot in the shop makes a mistake, it will continue making the very same mistake until you or a programmer fixes it. But if a man makes the same mistake, he will analyze what he did wrong and correct it”—if he's motivated and not a total asshole, anyway—“and won't make the same mistake again, whereas the robot will make it endlessly until an outside agent or agency corrects it.”

  If a robot could exhibit emotions, I would have sworn Mose had appeared surprised by my answer. I had expected him to tell me that he didn't understand a word of what I was saying—I mean, really, what could a machine possibly understand about the intricacies of the human mind?—but once again he managed to surprise me.

  “You have given me a lot of data to consider, sir,” said Mose, with his head cocked to the side again. “If my analysis of it is correct, this substance you consume prohibits you from properly evaluating the cause of your problem, or even that you have a problem. So your programming is not impaired as you stated earlier; rather, it is your programming's immediate environment.”

  As I hopped off the train an hour later and trundled in the direction of my local shopping mall, I could still hear his conclusion reverberating through my mind. I had been so embarrassed by the truth of his statement that I couldn't even formulate an adequate reply, so I had simply ordered Mose to return to his workstation.

  And as I turned and walked down yet another nameless street—they all looked the same to me—I tried to find flaws in what the robot had said, but couldn't. Still, he was only a machine. How could he possibly understand the way the death of a loved one plays havoc with your mind, especially knowing that you were the one responsible for her death?

 
