Three Laws Lethal

by David Walton


  They crossed Thirty-Sixth Street and headed off across the green. The trees overhanging the crisscrossing brick paths were budding, and the fresh-washed scent of spring was in the air. Tyler loved this time of year, even though it often brought rain. With the bitter cold gone, students tossed Frisbees across the lawn, and others sat on benches to do their homework.

  “What do you think of the Gomez bill?” Naomi asked.

  Tyler turned, surprised to hear her speak. Gomez was a bill before Congress that would require autocar manufacturers to include loopholes for law enforcement, such as an override signal a policeman could send to force a car to stop. It addressed the public fear of an autocar gone berserk, with no way for a human to gain control. “I think it’s a terrible idea,” he said. “How long do you think it will be before people other than law enforcement get a hold of the key or find some way to hack the signal? Do you want someone to be able to force your car to a stop at the side of the road at night? If that bill passes, it’ll be a disaster.”

  “It’s unlikely to pass,” Naomi said. “It’s getting some traction in the House, but the Senate is 60–40 against.”

  “People are afraid of all the wrong things,” Tyler said. “They imagine a robot apocalypse run by malevolent AIs bent on murder, and they want protection against that. But they don’t fear the much more likely dangers that AIs protect them from every day.”

  “A recent poll showed that thirty-seven percent of people think artificial intelligence will be a threat to humanity,” Naomi said.

  “It’s not just about AIs, either. If a plane crashes, it makes big news and sends people into a furor calling for measures to make sure it never happens again, no matter the cost. But every day, more people in the country are killed in car accidents than died when that plane came down. The plane crash is rarer—and somehow scarier—and so it gets more attention than the thing that’s actually likely to kill them.”

  “You’re right about that. People are eighty-six times more likely to die in a car crash than in a plane crash,” Naomi said.

  Tyler gave her a suspicious look. Her last few responses had been oddly stilted, and a bit heavy on the random statistics. “Are you using a conversation bot?” he asked. He regretted the question as soon as it left his lips. If he was right, he would have embarrassed her, and worse, if he was wrong, he would have insulted her.

  She blushed. “I’m sorry.” She looked as though she wanted to dissolve and soak away into the grass. “I’m not very good at conversation, and Jane—I mean the bot—helps. Otherwise I just don’t say anything.”

  “No, I don’t mind. It’s really good,” he said, backpedaling and cursing himself. “I know they exist, but I never heard of one being that good before. Did you write it yourself?”

  She nodded, but looked away.

  “Honestly, I’d like to check it out. Have you open-sourced it? Is it out on GitHub?”

  She shook her head and mumbled something too soft for him to hear. They reached the library doors, and Tyler pulled one open, letting her walk in ahead of him. He considered just saying goodbye right there, hoping he hadn’t screwed things up so bad with her that she wouldn’t program for them, when something she’d said fell into place in his brain, and he followed her inside.

  “Jane,” he said. “You named your conversation bot Jane? Like in Speaker for the Dead?”

  She whirled to face him, this time with a genuine smile on her face. “You know it?”

  “Of course I know it. It’s Card’s best work.” It had been written well before he was born, but it was an important part of the SF canon.

  “No,” she said. “Nothing beats Ender’s Game.”

  They argued about that briefly, just standing there in the atrium, until Tyler realized he was blocking the entrance. “Sorry,” he said. “Where were you headed?”

  “Um.” She twisted her hair around one finger. “I have some books I need to check out for one of my cognitive science classes. I spend a lot of time here, actually.”

  Tyler grinned. “She sounds like someone who spends a lot of time in libraries, which are the best sorts of people.”

  Naomi clapped her hands. “Catherynne M. Valente!” she said. “From The Girl Who Circumnavigated Fairyland in a Ship of Her Own Making! I love that quote. I had it taped inside my locker in middle school.”

  “Well, if you’re not in a hurry”—Tyler spotted a cluster of unoccupied reading chairs—“I could show you around our code. If you’re really going to help us out, that is.”

  “Okay.”

  Tyler led the way. They sat on two comfortably stuffed chairs arranged around a half-moon coffee table, decorated with a metal vase of faux dogwood branches and some kind of generic white blooms. The library was new, designed in a sparse, modern style that preferred brushed steel and abstract art over wood paneling and portraiture. High on elegance, but short on mystery.

  “Ever been to the J.P. Morgan Library in New York?” Tyler asked. “When I’m a billionaire, that’s the kind of library I’m going to build. It’s like the one in Beauty and the Beast—a huge room, three stories high, with balconies, murals, a domed ceiling. Only mine will have secret walls that open up and teleportation circles to get around. And carnivorous shadows.” He eyed her for a reaction—he’d been referring to a library in a Doctor Who episode, but if she didn’t recognize it, then that last part would make him sound like an idiot.

  He needn’t have worried. “So big it doesn’t need a name, just a great big ‘The,’” she said, smiling and brushing a lock of hair back behind one ear.

  They synced glasses, and he started walking her through the code, showing her how it was organized, the training data they were using, and their build process. She picked it up quickly, often understanding the intent behind a function before he explained it. Instead of worrying if she’d be good enough to help, Tyler found himself worrying about her opinion of his code. Did she find it amateurish? Was she laughing at him behind that shy reserve? He got the feeling that there was a lot more going on in her mind than showed on her face or came out of her mouth.

  “The real problem is the edge cases,” she said. “These days, it’s easy enough to train an AI to do simple recognition tasks—identifying faces, voices, cyber threats, suspicious behavior. But it’s only ninety-nine percent. When people’s lives are on the line, it’s not good enough. You need an algorithm that can use good judgment with incomplete or conflicting data.”

  It was the longest group of sentences he’d ever heard her string together, but Tyler just went with it. “What does good judgment even mean in this situation? We call what we use ‘AI,’ but it’s not really intelligent. It doesn’t think, not really. We train a sophisticated mathematical configuration to filter out bad choices and select good ones, but that’s not the same as having creativity, or making leaps of intuition, or showing common sense. And there’s no way to test every possible situation.”

  “We need an AI whose highest motivation is to keep human beings safe, with the judgment to evaluate its own decisions on that merit,” Naomi said.

  Tyler grinned. “Three Laws Safe.”

  She took it seriously. “Exactly. What’s the modern equivalent of Asimov’s Three Laws? How can we make autocars inherently safe?”

  “The problem is, cars aren’t safe,” Tyler said. “You’re flying along in a two-ton steel box with lots of other two-ton steel boxes. Asimov’s robots just wouldn’t drive at all. They might even prevent a human from driving, if they could. ‘A robot may not harm a human being, or allow a human being to come to harm.’”

  “Except in ‘Little Lost Robot,’” Naomi said. “In that story, they intentionally modified the First Law, so robots could work with humans doing a somewhat dangerous job without preventing them from doing it.”

  “I remember that story. It was radiation, right? The humans would get a small dose of radiation, and the robots had to be able to allow that to happen.”

  Naomi nodded. Tyler noticed that she still focused her eyes inside her glasses, not at him. He wondered if she was still reviewing the code while she talked, or if it just made her more comfortable to pretend he was an online contact instead of a person in real life. “So that’s what we need,” she said. “A root-level, built-in inability to harm humans directly.”

  “Directly? So, in that case in Seattle, the woman’s car wouldn’t swerve, because hitting the motorcycle would be directly causing harm to humans? Whereas plowing straight into the tree would be inaction—it might kill more people, but not actively on the part of the AI?”

  “It sounds kind of stupid when you put it like that.”

  “Well, not necessarily. This is the kind of question moral philosophers argue about into the wee hours of the night. Is there a difference between doing and allowing? Between allowing harm to happen and doing the harm myself?” Tyler realized he was grinning. This was the kind of conversation he wanted to have with Brandon, but Brandon always resisted it. He cared about practicalities, not morals.

  She thought about it. “I don’t think there is a difference. If I truly have the power to stop it, and I don’t, that’s just as bad as doing it. Neglecting a child is just as wrong as actively hurting her—in both cases, you’re causing harm, even though in the first case, you’re technically doing nothing.”

  Tyler was enjoying this. She had relaxed in the chair opposite him, and although she still wasn’t meeting his eyes, she at least wasn’t browsing her glasses anymore. “Do you know the trolley problem?” he asked.

  She shook her head.

  “Really? It’s something they talked about a lot, back when the first autocars came out. It’s an ethical thought experiment. Here . . .” Tyler pulled a straight dogwood branch out of the vase on the coffee table. He laid it flat on the table. “This is a train track.” He gestured at her travel coffee mug. “May I?” She nodded, and he placed the mug at one end of the branch. “This is a runaway train, brakes not working, and you’re the driver. Down the track a ways, five people are working and don’t see you coming. You’re about to plow through and kill them all. But!” He pulled another branch out of the vase and laid it across the first, creating an alternate, forking path for the train to take. “On this track, there’s only one worker. You have a choice. You can switch tracks, intentionally and actively killing the one person, or you can do nothing, and let the five die.”

  “That’s easy,” Naomi said. “Of course you choose the one. It’s not your fault either way—you don’t intend for anyone to die. You’re just minimizing the loss of life.”

  “Fair enough,” Tyler said. “Most people say the same. Not all, but most. Try this variation, though. Instead of driving the train, you’re on a bridge above the tracks, watching the drama unfold.” He removed the second dogwood branch. “There’s no fork in the track, just five people about to be killed. You realize the train can’t stop, but you’re a railway engineer, and you know that if you could drop a weight of at least three hundred pounds on the track, you could stop the train before it reaches the workers. You don’t have a weight, but there happens to be a fat man on the bridge in front of you, right over the tracks. If you push him over the edge, the train will hit him instead and the workers will be saved. Should you do it?”

  Naomi didn’t hesitate. “Of course. Five for one, the same as before.”

  Tyler opened his mouth and closed it again. He had been expecting her to say no, of course not, you couldn’t push someone off a bridge—that was murder, even for a good cause. Then he could point out how this indicated there must be a difference between actively causing harm and just allowing harm to happen, because of the difference between these two cases. What did her answer say about her—that she was callous? Or just more consistent than most?

  “But what if you were the fat man?” he blurted. “Would you still make the same choice?”

  This time she had to consider. “That’s a very different question,” she said finally. “But the answer is the same. I should throw myself off to save the others, assuming I could know for sure that the others would be saved. It’s the right thing to do. But in real life, would I? What if my sister was the fat man . . . would I then? Probably not. But that’s because my sister is worth more to me than any five strangers.”

  “It’s not a philosophical question, then; it’s a personal one,” Tyler said. “Which is exactly the problem we have with autocars. What people want to happen in general, to strangers, is different from what they want to happen when their own loved ones are involved. We somehow need people to agree on what choices are fair and correct before personal considerations get in the way.”

  “It’ll never happen,” Naomi said, and for the first time she met his gaze directly. “Everything in life is personal.”

  Her eyes were a deep brown, and while her face often seemed to hide her feelings, the eyes expressed them. She had none of Abby’s vivacious charm, but Tyler thought she might just be the prettier of the two.

  “Hey, are you free tonight?” Tyler said. “Brandon and I are going car shopping. We need to add a few more vehicles to the fleet, now that we can afford it. We’re spending somebody else’s money. It’ll be fun.”

  Her gaze dropped to the floor again. “I don’t think so. I’m busy.”

  “Okay,” he said. “Maybe we could catch dinner together sometime. You free tomorrow?”

  She stood hastily and picked up her travel mug. “I should go.”

  Tyler studied her face, but she showed nothing. He had thought they were hitting it off together, but maybe not. “Okay,” he said. “I’ll send you the link to our code repository, so you can get started.”

  “Great,” she said, so softly he could barely hear her. She jostled the coffee table on her way out, so that one of the dogwood branches slid onto the floor. She pushed through the library doors and out into the sun. Tyler watched her go, a little stunned. She had never even checked out the books she said she needed for her class.

  CHAPTER 5

  Naomi felt uncomfortable, so she did what she always did in those circumstances. She shut herself away with her software. She couldn’t go directly to her secret library nook, because Tyler might see her. Instead, she stood behind a statue on the green, waited until she saw him leave, and then slipped back into the library and up to the second floor.

  She didn’t have anything against Tyler Daniels. She had actually enjoyed their conversation, at least a little, but the effort of talking with a stranger exhausted her. The idea of going out again, in a situation she couldn’t escape by just walking away, was more than she could handle.

  Besides, she needed to check on her Mikes again. That morning, before leaving the library, she had reviewed the history of the competitive world. She found that most frequently, the Mikes who died did so by attrition, one at a time. The Mikes who contributed least to the survival of the group were denied food when there was a shortage. The world wasn’t big enough for any of them to strike out on their own and hope to survive; all available resources were co-opted by the group. That made it unlikely for rival groups to grow and war against each other. Each Mike, however, was rewarded for individual survival, not for the survival of the group. The scheme was evolutionary, but unlike in the biological world, the ability to produce offspring came at the end of life, not in the middle, so every Mike had incentive to live as long as possible. Every millisecond counted. As long as an individual Mike’s survival was linked to the survival of the group, he would work toward that end. However, once a Mike was marked for death, or predicted that future for himself, his actions would change. Some stole food and fled, and were then hunted down by the rest. Others attempted to destroy the entire village to achieve some small amount of extra time for themselves. Those most individually successful, however, were those who could find ways to preserve a large number of Mikes in their world.

  In the most recent versions, the strategies for claiming sunlight had become more sophisticated, using what amounted to a series of giant rectangular mirrors to reflect the sunlight away from the rocky terrain to fertile ground, making the grazing grass or crops grow at prodigious rates. In this world, the Mikes had dug, too, sending sunlight down onto underground yam fields to produce more food. They always lost some percentage of their population toward the beginning, before they could establish their infrastructure, but the later generations were increasingly able to survive harsh winters or even the occasional devastating storm. It reminded Naomi of a Dyson sphere—the hypothetical sphere a planetary civilization might build entirely surrounding its sun, exploiting the entire energy output for its own purposes. She wondered if the Mikes might eventually find a way to accomplish the equivalent feat in their own world.

  However, there was still no sign of emergent creative behavior. No art, no sports, no activities that didn’t directly support survival. The Mikes didn’t seem to communicate in any way beyond simple reactions to each other’s actions. To each of them, the other Mikes were nothing more than a part of their environment, to be manipulated however possible to achieve the desired outcome. One could argue that they weren’t so much coordinating as independently discovering strategies that jointly enabled them to survive.

  It was enough. Enough for publication, enough to attract the attention of graduate schools, enough to land her a good job in the industry. But it wasn’t enough for her.

  Naomi selected the best one thousand Mikes from the most successful versions of their worlds, and started building a new Realplanet simulation for them to inhabit. She made this new world a harsher place, scattered with hidden traps, like nests of giant wasps that would attack and injure, and pits with lava that would cause burns. Nothing that would kill by surprise, at least not directly—she wanted to see if the Mikes would communicate to warn each other about the traps. She felt a little bit evil, like a game master in The Hunger Games, setting traps to catch unwary innocents, but it seemed as though competition and danger were critical to the development of intelligence.

 
