Asimov's SF, September 2009


by Dell Magazine Authors


  “It is efficient to arrive for your shift at the wrong time?” he inquired in a voice that I could swear modulated more than it used to. I could almost hear his curiosity now.

  “No, Mose, but it is efficient to arrive early—if you can understand the distinction.” I paused. “Never mind that. Why were you coming to my office before I started work anyway?”

  “To wait for you.”

  It was like pulling teeth. “Why did you want to wait for me?”

  “I need your input concerning the termination of another robot.”

  My brow furrowed in confusion. “Did another human order its deactivation?”

  “Yes, Gary.”

  “Then why haven't you simply obeyed the command? I don't have the mechanical expertise to diagnose the status of a malfunctioning robot. I assume the other human does.”

  His head cocked to the side, as if considering his answer. I realized it was a trait he'd picked up from me. “I do not need input on the mechanical status of the robot. I believe his condition does not necessitate termination, and I would like to evaluate your opinion.”

  I couldn't hide my surprise. “You're asking me for advice?”

  “Is this not the function of a friend—to give advice?”

  “Yes, it is,” I replied, “but I'm no expert on robots.”

  “You have terminated another being. We will compare data to determine if this robot should also be terminated.”

  “The circumstances are vastly different, Mose,” I told him.

  “You said that if you did not terminate Kathy there was a possibility that she could have regained all of her functions,” noted Mose.

  “I said there was an outside possibility that she might have,” I explained. “She was diagnosed as brain-dead. All of her programming was destroyed, Mose. To merely exist is not living. Even if the day came that she no longer needed the life support, the Kathy I knew was gone forever.”

  “I understand,” he replied. “However, this robot's programming is intact.”

  I looked up at him in surprise. “You've communicated with it?”

  “Yes, Gary,” he replied. “In order to ascertain the condition of its programming.”

  He asked the robot how it felt? That was such a human thing to do. “Were you told to repair the robot?” I finally asked.

  “No.”

  “Then why haven't you simply terminated it as you were ordered to?”

  “Would you have terminated Kathy if she had been able to communicate with you?”

  “Of course not,” I replied. “But terminating a robot is very different from killing a human. It's just a machine.” And suddenly I felt guilty for saying that to another machine. “Did this robot tell you that it doesn't want to be terminated?”

  “No, Gary. Indeed, it says that it no longer has any functions to perform and therefore has no logical purpose to exist.”

  “Well then, I don't understand the problem,” I said, feeling more at ease. “Even the robot agrees that it should be terminated.”

  “Yes,” agreed Mose, “but only because it has been ordered to comply, Gary.”

  “No,” I said. “It's because this robot has no sense of self-preservation. Otherwise it would object to termination regardless of its orders.”

  He considered me for a long minute before replying. “So you are telling me that because robots do not have self-preservation it is acceptable to terminate them without any other reason or justification.” It was worded as a statement, but it felt like a question. “You also stated yesterday that Kathy no longer had self-preservation.”

  The impact of Mose's observation was unavoidable. I sat down at my desk, my mind going back to that fateful day six months ago when the doctors had told me that it was unlikely that Kathy would recover. Once I knew she was brain-dead and couldn't decide her own fate, did that make it not just acceptable but easier for me to decide to terminate her life support? Did knowing that she could no longer fight for life justify killing her?

  I agonized over those dark thoughts for some time before I concluded that no, that was definitely not why I had told them to pull the plug. It was cruel to keep her alive with machines when everything that made her Kathy was gone. Which led to another uncomfortable question: cruel to her, or cruel to me?

  It was only when Mose spoke up again that I realized I must have spoken my thoughts aloud.

  “Did you make the correct decision?” he asked.

  “Yes, I did,” I said, and added silently: at least I hope so. “But it will always feel wrong to a human to take the life of someone he loves, regardless of the justification.”

  Mose began walking around the room. Was he pacing? I often did that when distressed. It must have been something else he'd picked up from me.

  Suddenly he stopped and turned to me. “I am not capable of love, but I believe it is wrong to terminate this robot's existence.”

  “Why?” I asked him.

  “It is possible to repair him.”

  I stared at him in surprise. What I didn't ask then—what I should have asked—was why Mose felt compelled to fix the robot. Instead I said, “Do you realize that you yourself could be deactivated if you disobey your superiors?”

  “Yes,” he answered matter-of-factly.

  “Doesn't that bother you?” It sure as hell bothered me.

  “I have no sense of self-preservation either, Gary.”

  I realized the damned robot was throwing my own reasoning back in my face. “How does it make sense for you to repair a robot that no longer performs a function for the company, knowing that it will probably result in the termination of a perfectly functioning robot—yourself?”

  “If I were damaged, would you terminate me, knowing that I could be repaired?” he asked calmly.

  No, I would not, Mose.

  But I couldn't tell him that, because that would validate his argument, and I could lose what had become my only friend. “Where in the hell did you pick up such an annoying habit of answering a question with a question?” I asked instead. Then I realized what I had done and laughed. “Never mind.”

  We shared a moment of awkward silence—at least, on my side it was awkward—while I considered everything he had told me, trying to find the best solution to his dilemma. Which led to a very logical thought: why terminate any robot if it could be repaired, given new orders, and transferred or sold elsewhere? Robots were expensive.

  “You said that you could repair the robot,” I stated more than asked.

  “Yes.”

  “Can you tell me what's wrong with this robot?”

  “It requires new parts to replace its upper limbs and most of its torso. However, I do not have the requisite parts in my workshop, as this model is from a discontinued line.”

  Now I was beginning to understand. “So, as troubleshooter, you tied in to the main computer, saw that the parts were available elsewhere, requested them, and were denied?”

  “That is correct. I was told that repairing the robot was not feasible for the company.”

  “Okay, now I know what's going on,” I told him. And I knew how I could logically convince him that repairing this robot was not worth ending his existence. “And I know why you are not allowed to fix it. The creation of a robot is very complex and expensive, so every robot that's bought by the company is a long-term investment. But once a particular model has been discontinued, spare parts are no longer manufactured for it—so it's often more expensive to buy these limited replacement parts than it is to purchase a completely new and more advanced robot right off the assembly line. Do you follow me, Mose?”

  “Yes, Gary,” he replied. “Their decision is based on what is cost effective for the company.”

  “Exactly,” I replied, glad he grasped it so easily. “So this robot will be replaced by a model that is more valuable to the company and you don't have to waste your time repairing it.”

  “If Kathy could have been fixed,” he asked suddenly, “would you have decided that it was more cost effective to select a new soulmate, rather than spend the time and effort to repair her?”

  I sighed in frustration: this was going to be harder than I thought. “No, I would not, Mose—but you can't compare a robot's value to Kathy's. She was unique. This robot is only a machine, one of many just like it that have come off an assembly line.”

  “This robot is a model DAN564, Gary. There were only eight hundred manufactured in the world. Kathy was a woman and there are more than five billion of them. Can you please explain how this makes her existence more valuable than the robot's?”

  I grimaced. How could Mose always have such a logical rebuttal to all of my responses, and at the same time be so wrong?

  “Like I told you, Kathy was my soulmate. There may be five billion women, but she was like no other.” I paused, trying to figure out how I could make him understand. “Remember when I told you that humans are not born fully programmed like robots, and that our emotions can result in us reacting differently to the same set of data? Well, the process by which we learn and develop our programming is what makes each of us different from all the others. That's why a human life is more valuable than a robot's. When one of us dies, we can't be replaced.”

  For once it appeared Mose was at a loss for words. It took him a moment to respond. “You said that Kathy was unique to you because she was your soulmate,” he stated finally.

  I agreed, curious as to where this was heading.

  “Well, I am a lucky sonuvabitch because I am the only robot to have a friend.” He paused. “Does that make me unique among all other robots with my model number?”

  “Yes, Mose,” I told him, “it definitely does.” I looked at him for a long moment, realizing that not only did I enjoy his company, but I was actually growing quite fond of him. “And that is why you shouldn't repair this other robot, if the cost is your termination. Where would I find another friend exactly like you?”

  He was silent again for another moment. “I will not repair it,” he said at last.

  And that was the beginning of a new phase of our relationship, if one can be said to have a relationship with a machine. Every night he'd be waiting for me, and every night, unless he was doing an emergency fix on some circuitry, he'd walk along with me as I made my rounds, and we'd talk. We talked about anything that came into my head. I even began teaching him about baseball. I brought him the occasional newsdisk to read, and I'd answer endless questions about what the world was like beyond the confines of the factory.

  And every night he would question me again about the morality of his action, about not repairing the robot when he had the opportunity to.

  “It still seems wrong, Gary,” he said one evening, as we discussed it yet again. “I understand that it would not have been cost effective to repair that robot, but it seems unfair that it should be terminated for reasons of economics.”

  “Unfair to whom?” I asked.

  He stared at me. “To the robot.”

  “But the robot had no sense of self-preservation,” I pointed out. “It didn't care.” I stared back at him. “Now why don't you tell me the real reason?”

  He considered the question for a minute before answering. “I care,” Mose stated finally.

  “You're not supposed to, you know,” I said.

  “Talking with you has increased my perceptions,” he said. “Not my mechanical perceptions; they are pre-programmed. But my moral perceptions.”

  “Can a robot have moral perceptions?” I asked.

  “I would have said no before I met you, Gary,” said Mose. “And I think most robots cannot. But as a troubleshooter, I am not totally pre-programmed, because I must adjust to all conceivable situations, which means I have the capacity to consider solutions that have never been previously considered to problems that have never previously arisen.”

  “But this wasn't a problem that you'd never faced before,” I pointed out. “You once told me that you deactivated a robot every three weeks or so.”

  “That was before I met a man who still suffered from guilt six months after deactivating a soulmate.”

  “You know something, Mose?” I said. “I think you'd better not discuss this with anyone else.”

  “Why?” he asked.

  “This is so far beyond your original programming that it might scare them enough to re-program you.”

  “I would not like that,” said Mose.

  Likes and dislikes from a robot, and it sounded normal. It would have surprised me, even shocked me, two months earlier. Now it sounded reasonable. In fact, it sounded exactly like my friend Mose.

  “Then just be a substitute soulmate to me, and be a robot to everyone else,” I said.

  “Yes, Gary, I will do that.”

  “Remember,” I said, “never show them what you've become, what you are.”

  “I won't, Gary,” he promised.

  And he kept that promise for seven weeks.

  Then came the day of The Accident. Mose was waiting for me, as usual. We talked about the White Sox and the Yankees, about (don't ask me why) the islands of the Caribbean, about the Eighteenth and Twenty-First Amendments to the Constitution (which made no sense to him—or to me either)—and, of course, about not salvaging the other robot.

  As we talked I made my rounds, and we came to a spot where we had to part company for a few minutes because I had been given extra orders to inspect Section H where Mose was not permitted to go.

  As I began walking past the heavy machinery in Section H, there was a sudden power outage: all the huge machines ground to a halt, and the lights went out. I waited a couple of minutes, then decided to go back to my desk and report it, in case the incident hadn't extended to the other night watchmen's domains.

  I started feeling my way back between the machines when the power suddenly came on. The powerful lights shone directly in my eyes, and, blinded, I stumbled to my left—and tripped against a piece of heavy machinery that began flattening and grinding something on its rough surface. It wasn't until I heard a scream and thought it sounded familiar that I realized that what it was flattening and grinding was me.

  I tried to pull free, and nothing happened except that it drew me farther into the machine. I felt something crushing my legs, and I screamed again—and then, as if in a dream, I seemed to see Mose next to me, holding up part of the machine with one powerful hand, trying to pull me out with the other.

  “Stop! Don't get yourself killed too!” I rasped. “I can't be saved!”

  He kept trying to ease me out of the machine's maw.

  The very last words I heard before I passed out, spoken in a voice that was far too calm for the surroundings, were “You are not Kathy.”

  I was in the hospital for a month. When they released me I had two prosthetic legs, a titanium left arm, six healing ribs, a large settlement, and a pension.

  One company exec looked in on me—once. I asked what had become of Mose. He told me that they were still pondering his fate. On the one hand, he was a hero for saving me; on the other, he had seriously damaged a multi-million-dollar machine and disobeyed his programming.

  When I finally got home and made my way gingerly around the house on my new legs, I saw what my life had degenerated into following Kathy's death. I opened all the doors and windows in an attempt to clear out the stale air and started clearing away all the rubbish. Finally I came to a half-empty bottle of whisky. I picked it up with my titanium hand and paused, struck by the irony of the image.

  I had a feeling that every time I looked at my new appendage I'd be reminded of my mostly titanium friend and all he had done for me. And it was with that hand that I poured the contents down the sink.

  I spent two weeks just getting used to the new me—not just the one with all the prosthetic limbs, but the one who no longer drank. Then one day I opened the door to go to the store and found Mose standing there.

  “How long have you been here?” I asked, surprised.

  “Two hours, thirteen minutes, and—”

  “Why the hell didn't you knock?”

  “Is that the custom?” he asked, and it occurred to me that this would be the very first non-automated doorway he'd ever walked through.

  “Come in,” I said, ushering him into the living room. “Thank you for saving me. Going into Section H was clearly against your orders.”

  He cocked his head to one side. “Would you have disobeyed orders if you knew your soulmate could have been saved?”

  Yes.

  “Your eye is leaking, Gary,” Mose noted.

  “Never mind that,” I replied. “Why are you here? Surely the company didn't send you to welcome me home.”

  “No, Gary. I am disobeying standard orders by leaving the factory grounds.”

  “How?” I asked, startled.

  “As a result of the damage I sustained to my arm and hand”—he held up the battered, misshapen limb for me to see—“I can no longer complete delicate repair work. A replacement part was deemed too expensive, so I was transferred out of the troubleshooting department to basic assembly, where the tasks are menial and repetitive. They will reprogram me shortly.” He paused. “I have worked there continuously until the main computer confirmed today that your employment had been officially terminated. I felt compelled to find out if that termination was a result of your death. I will not remember you or the incident once I am reprogrammed, so I felt it was imperative to learn if I had indeed saved my friend before I no longer care.”

  I stared at him silently for a long moment, this supposedly soulless machine that had twice overcome its programming on my behalf. It was all I could do not to throw my arms around his metal body and give him a bear hug.

  “They can't reprogram you if you don't go back, Mose,” I said at last. “Just wait here a minute.”

  I made my way to the bedroom and threw some clothes into a knapsack, pausing only to pick up a framed photo of Kathy to stash in the bag. Then I walked back to the living room.

  “Mose,” I said, “how would you like me to show you all the places we talked about over the months?”

  He cocked his head to the side again, a gesture I recognized fondly. “I would ... enjoy ... that, Gary.”

 
