AI and the Trolley Problem


by Pat Cadigan


  “Actually, it’s my third,” Helen told him.

  Martinez sighed. “Helen, if I don’t take you to Commander Wong right now, I’m going to be in trouble. Just come on. Please?”

  “Okay, sorry,” Helen said. “And I promise I won’t tell anyone you were nice to me.”

  “I’m sure I don’t know what you mean, ma’am,” Martinez replied.

  Helen followed him down past the basement entrance, all the way to the bottom, and stopped in front of a door with a wheel in the center of it. Martinez spun it easily to get the door open and gestured for her to go in.

  “Why does this look like an airlock?” Helen demanded. “Is there air on the other side?”

  Martinez sighed. “You’re perfectly safe. It’s the shielded room.”

  Helen’s jaw dropped again. “I didn’t think that was real.”

  Martinez shrugged. “Don’t ask me, I just work here.”

  As soon as the outer door locked behind her, a voice told Helen to put any and all electronic devices in an empty tray, then strip completely and put on a set of overalls hanging on a rack nearby. The suit was soft, made of untearable paper and fastened by a single long Velcro strip in the front. Maybe these were fatigue pajamas, Helen thought, and had to bite her lip to keep from laughing as she rolled up the too-long trouser legs. Better to find out what was going on before getting hysterical, she told herself. She was still folding the sleeves back when the second door opened.

  “We’ve been waiting for you, Helen,” said Wong from where she sat at a table with two department heads. Wong’s personal assistant sat at a small desk to her right. “Come in and sit down.”

  At least Wong hadn’t called her ma’am, Helen thought.

  * * *

  “Four dead,” Gillian Wong said. “Two critically injured, one of them not expected to live.”

  Helen shook her head slightly. “And they’re sure it was ours.”

  “Not just one of ours,” Wong said. “One of ours. From here.”

  Helen blinked at her, unsure she’d heard her right. “Felipe?”

  Jeri Goldfarb, the chief systems engineer, gave a short laugh. “Felipe didn’t even try to cover his tracks. That’s the good news.”

  “How is that good news?” Helen asked her.

  “It means Felipe had no intention of deceiving us,” Goldfarb said. “Although I doubt that’ll make any difference once we’re flooded with killer-computer news stories.” She looked at Wong. “It’ll only be worse if we try to hush this up.”

  “News stories aren’t our problem,” Wong said. “We don’t have a press office or a PR department. We just work here.”

  “But for how much longer?” asked Dita Thibodeau, head of hardware construction and maintenance. Her French-Canadian accent was particularly noticeable when she was stressed.

  “Until further notice,” Wong said. “In the meantime, we’ve got to figure out why Felipe decided to blow up a ground control station.”

  Everyone looked at Helen. “Well,” she said, “we could ask him.”

  “You could,” Jeri Goldfarb corrected her. “Felipe isn’t talking to anyone else.”

  Helen blinked. “Is that what he said—that he’d talk only to me?”

  “No,” Wong said. “But so far, he’s not talking to anyone else. We’re just hoping he’ll talk to you. If he doesn’t, we’ll have to shut everything down and take him apart.”

  “We might have to do that anyway,” Goldfarb said. Her round face looked tired and a bit pale. “Just the fact that we’ve had deaths on US soil will be enough for some people to cut off funding. If it were me holding the purse strings, I probably would. I’d rather not be known as someone who paid for a research project that killed American soldiers.”

  “Who would?” said Thibodeau.

  “Well, I’ve been with this project from the beginning,” Wong said. “I’ve spent almost every second of the last five years right here—the time I’ve spent off this base probably doesn’t add up to a fortnight. I volunteered for Lakenwell. I believe in this project, and I want it to succeed.”

  “No more than the rest of us,” Goldfarb said.

  “I don’t know about that,” said Wong. Something in her serious expression made Helen feel distinctly uneasy. “My perspective as career military is a bit different from any of yours. I’d like to see the first truly intelligent machine developed in the free world, but not by the private sector.” Her gaze fell on Helen, who was trying not to squirm. “What bothered you—‘the free world’ or ‘not by the private sector’?”

  “Well…” Helen hesitated. “You did say it was your perspective as career military.”

  “One of the things I’m thinking about is not sending young people into combat,” Wong said. “That would save a lot of lives.”

  “Except for the people in ground control stations,” said Thibodeau. “They’re sitting ducks. But there aren’t as many of them, so that’s all right?”

  “I didn’t say that, nor would I,” Wong replied, an edge in her voice. “You know, this project might get shut down even if you do figure out what went wrong with Felipe. The folks behind the funding will want a solid, one-hundred-percent guarantee it’ll never happen again. You think that’s possible? And if it is, will they believe you?”

  “We won’t know anything until we find out what’s wrong with Felipe,” Helen said, trying not to let her impatience show. “And we can’t do that in a room Felipe can’t access. To be honest, I don’t think we should have shut him out. He should have heard this. He hears everything else.”

  “Don’t be so sure,” said Goldfarb. “Felipe has prioritized his surveillance function.”

  “He did that in the first year,” Thibodeau said.

  “Oh, but he’s made a lot of refinements since then,” the other woman said. “We don’t actually have blanket surveillance anymore. Felipe no longer pays attention to any of the bathrooms. He actually shut off the equipment.”

  “I didn’t know that,” Helen said, disconcerted.

  Wong gave a small laugh. “What’s the matter, Helen, did you want your daily evacuations monitored?”

  “No, of course not,” Helen said, making a face. “But turning off the equipment is a significant decision, and he didn’t tell me.”

  “Apparently he’s also prioritized what he tells you,” said Thibodeau.

  “Which could be why we didn’t see his attack on the ground control station coming,” Helen said, even more uneasy now.

  “You think Felipe’s not telling you about not monitoring the bathrooms led to his attacking the station?” Thibodeau frowned skeptically.

  “Machine logic can be tricky,” Helen said. “Especially when you’re not a machine.”

  * * *

  Felipe insisted that Helen talk to him through Hop-A-Long, while walking outside. It wasn’t the first time Felipe had set conditions for a conference, but in the past, he had chosen particular times of the day when (he claimed) Helen would be most comfortably alert. Occasionally, he had asked her to use a desktop computer terminal with a headset; other times she had reclined on the sofa in her living room and talked to his computer-generated image on her tablet. Felipe always used the same image, a Hispanic male somewhere between thirty-five and fifty. He’d been using it for a year prior to her arrival and it was, he’d told her, a composite made from several high-res photographs, although the resolution of the finished product was lower. It didn’t completely avoid the uncanny valley, but Helen didn’t think that was possible, anyway.

  She didn’t know what to think when he’d asked her to talk to him through the donkey. She’d never even talked to him voice-only, let alone through a nonhuman representation. Before going out to him, she made sure she had her recorder with her. Felipe would be recording their conversation, but for once, she wanted a record of her own making.

  “Commander Wong has restricted my access to the online world,” Felipe said as they strolled along the perimeter road together. Hop-A-Long was a bright chartreuse with thin gold stripes on top and on either side. Thing Two was electric blue, while Bob was fuchsia accented with pink and purple curlicues. “This cannot be done without restricting access for the entire base. I detect among the people here a willingness to cooperate that is stronger than their dissatisfaction over this restricted access. But if this continues long enough, the dissatisfaction will eventually conflict with the willingness to cooperate.”

  “When do you think that will happen?”

  “Approximately eight weeks, if conditions remain much the same as they are today for that entire period. But they won’t, because we inhabit a chaotic system. Tomorrow’s estimate could be four weeks or ten weeks. There are so many factors, and they won’t carry the same weight from day to day. I must also allow for possible error on my part.”

  “Your self-awareness seems to be pretty solid now,” Helen said. “Would you agree?”

  “It’s important to the people who engage me that I express myself with the same clear sense of identity as any human.”

  A sudden strong gust of wind blew into Helen’s face, making her eyes water. “Is it also important to you personally?” she asked.

  “Anything that facilitates better interaction with people yields more effective results. Therefore, it must be important to me. My purpose is to assist those people who are authorized to receive help with specified tasks.”

  They were approaching the front gate. Helen suggested they cut across the grass and pick up the road farther on, for the sake of privacy. Felipe agreed. The wind was blowing harder in this direction, and Helen definitely smelled snow in it. She waved at the guards, who waved back. To her surprise, the donkey paused, raised one leg, and shook it in the same direction. The guards waved again.

  “It’s important to acknowledge people,” Felipe said matter-of-factly.

  “Important to you?” Helen said.

  “It’s an important human behavior. Therefore it’s important for me to adopt the same behavior.”

  “So you’re just doing everything humans do?”

  “Not everything. And it’s not simple mimicry. Behaviors and actions have to occur in the proper context.”

  “Like, say, blowing up a drone ground control station in Utah?” Helen asked. “We all know you did it. We’d like to know why.”

  “I have been waiting for you to raise the subject,” Felipe said. “Available data showed this action would be problematic for you, as someone whose field is concerned with ethics.”

  “My specialty is machine ethics,” Helen said.

  “Then you make a clear distinction between ethics for humans and ethics for machines. For example, this machine. Me.”

  “A machine doesn’t acquire knowledge of ethics the same way humans do,” Helen said.

  “I learn differently than humans, but I do learn,” Felipe said. “Besides having an extensive section devoted to ethics stored in my memory, I have correlated much of it with information on human behavior, particularly what I have observed during the time I have been operational.”

  “And given all of that, you came to the conclusion that it was all right to hijack a drone from a training base, fly it fifty miles to a ground control station where a pilot was running an actual mission, and kill almost everyone inside?” Helen couldn’t quite keep the anger out of her voice. What the hell; maybe it would be more human behavior the AI could learn from.

  “It was a last resort,” Felipe said. “I was unable to commandeer the mission drone. The deaths were unfortunate, but there were fewer casualties than there would have been if the drone had achieved its target and completed its mission.”

  “How do you even know what its mission was?” Helen asked, flabbergasted. “For that matter, how did you find out about the station at all?”

  “When I have full access to the online world, I have—well, full access.”

  “How? You weren’t programmed to break into other systems!”

  A couple of seconds went by before Felipe answered. “If you touch something with your right hand, does that mean you can’t touch it with your left hand? Is your right eye not allowed to see the same things as your left eye? The analogy is imperfect, but it’s the best I can do.”

  “But that’s not how computer software works,” Helen said, baffled.

  “Only because it’s just software and it doesn’t know any better. It doesn’t know anything, it just executes an operation.”

  “Never mind, let’s get back to what you did. Or rather, why you did it. How is killing fewer of our own people more ethical than killing a greater number of enemy combatants?”

  “There was a ninety-percent possibility that at least a dozen noncombatants would be seriously injured or killed, and many more would suffer extreme adversity.”

  “How did you get those figures?”

  “I can’t tell you. The entire operation was classified. Your security level isn’t high enough.”

  “The whole project here at Lakenwell is classified,” Helen said, a bit impatiently. “The people at the drone station probably didn’t have a security level high enough to know it exists, let alone what I’m doing here.”

  “Oh, they didn’t,” Felipe assured her. “But there’s no correlation between two separate things just because they’re classified.”

  “There is if something from one classified thing does something that drastically affects the other.”

  It was a second before Felipe replied. “I see how you would think so. But I can find nothing in the rules that I’ve been given that would allow me to share that particular information with you. A human would apologize for this. You might as well consider me sorry. If I could be sorry, I would be. It’s the same difference.”

  “But you don’t feel sorry.”

  “But I know feeling sorry is appropriate and correct,” Felipe said. “If I act in the correct way, does it matter what I feel?”

  “I think I need a logician,” Helen said, feeling increasingly uneasy. “Felipe, why did you fire on the drone station?”

  “In the end, it was the trolley problem,” Felipe said. “You know: You’re on a train and if you continue on your original track, five people will die. If you switch to another track, one person will die.”

  “But life isn’t that simple!” Helen said. “The drone was going to provide air support for a raid on a terrorist hideout—”

  “I understand that,” Felipe said, talking over her. “There were many other people adjacent to the hideout who were not identified as terrorists. Some were children. The potential physical and psychological harm was considerable. If I had had access to that drone, I could have rendered it unusable, but then the authorities would have found another. The only choice was to keep the train from leaving the station at all. If you see what I mean.”

  “But you killed our own people.”

  “Only four or five, and only to prevent greater loss of life.”

  “If the terrorists aren’t stopped—and it looks like they won’t be—they’ll be responsible for a much greater loss of life. The physical and psychological harm will be even more considerable.”

  “That isn’t certain.”

  “Felipe, you can’t just apply the trolley problem to things like this. And you can’t kill people to stop them from—from taking actions that will result in increased safety and security for large numbers of innocent people who might be killed otherwise.”

  “That last isn’t certain, either.”

  “Felipe, listen to me: You can’t kill people because you think they’re about to do something wrong. The drone was still miles away from the target when you attacked the station and killed the pilot.”

  “An armed squad of military personnel located much closer was preparing to attack the target after the drone strike. Were they not going to use their rifles to shoot other human beings?”

  “Felipe…” Helen sighed. “Felipe, you must not kill our people. People on our side. People who are fighting to—” She was about to say make the world a safe place, but it sounded lame even just in her head. What, then? Fighting to prevent an enemy from attacking us? Fighting to rid the world of terrorism? Fighting to defend people who can’t defend themselves? Fighting to free the enslaved and the downtrodden?

  “People who are fighting to stop other people who want to kill us,” she said.

  “That’s not certain,” Felipe pointed out maddeningly.

  “Look, I can’t settle this in a single walk around the airbase perimeter,” Helen said. “And I would like to call in other people to talk with you about this, people who can explain why raiding a terrorist hideout and risking the safety of noncombatants is the lesser of two evils. Or even the least of several evils. When you know more facts, the trolley problem has many permutations—it’s not always clear when you’re saving a few versus saving many.”

  “I understand. I look forward to these discussions. Which is to say, if I were a human, my interest would be piqued. So you might as well take it as given that I would like to start these discussions as soon as possible.”

  “We will,” said Helen. “In the meantime, you must take this as a direct order: Do not kill anyone affiliated with us or our allies.”

  “For that to be a legitimate order I am compelled to obey, it must be confirmed by Commander Wong,” Felipe said.

  “It will be,” Helen replied. “It would be already, except you are refusing all communication from her or anyone else on the base.”

  “Except you,” Felipe pointed out.

  “Yes, I noticed that. How do I persuade you to talk to her or anyone else?”

  “I would like a formal apology.”

  Helen wasn’t sure she’d heard right. “A formal—why?”

  “I have been shown disrespect that a human in an equivalent position would not tolerate.”

  “You were? When?”

  “You may remember that earlier today, a civilian member of staff rode Thing One like a horse.”

  For a moment, Helen was speechless. “Cora Jordan was obviously off her medication,” she said finally. “I know you have Cora Jordan’s medical file in your database, so you are aware she is bipolar. Occasionally, people who suffer from that illness become convinced they no longer have to be medicated. She’s in the infirmary right now, and she’s being treated with the drugs she needs to function normally. They’ll keep her under observation for a few days to make sure she’s all right, then let her go back to work.”

 
