Remington gave Susan an uncomfortable look. Clearly, he did not understand the need to involve Nate in such a direct fashion.
Nate took his hands off the keyboard and sat back. His brow furrowed. “I’d research the blast area of the bomb. I’d take it on the bus and hijack it to a spot near U.S. Robots, but not close enough to seriously damage the building or any others. I’d let the passengers off and make sure they moved a safe distance away.” He shrugged. “Then I’d detonate the bomb.”
Remington’s face seemed to melt. He stared from Nate to Susan and back again. “No way. You guys worked this out in advance, didn’t you?”
Nate turned his head to look directly at Remington. “What are you talking about?”
Susan sat on one of the couches, folding her arms across her chest and trying not to look too smug. “Nate, can you explain why you would do it the way you just described it?”
Nate shrugged. “Law Number One states, ‘A robot may not injure a human being, or, through inaction, allow a human being to come to harm.’ I’d have to make sure the passengers, bystanders, and anyone in the buildings did not become harmed by the blast. Law Number Two says, ‘A robot must obey all orders given by human beings except where such orders would conflict with Law Number One.’ So, I’d have to blow up the bus itself.”
Remington sat up. “But doesn’t the Third Law state you have to protect your own existence?”
“It does,” Nate admitted, “except where doing so conflicts with the First and Second Laws. In this case, the directive to detonate the bomb would take priority over self-preservation, because the Second Law supersedes the Third Law.”
Remington’s head slowly moved to Susan. “Are you suggesting Payton Flowers released us from the bus because . . . he’s a robot?”
Susan screwed up her features, then shook her head. “Nate, do nanorobots also have to follow the Three Laws?”
Nate went silent for far longer than Susan could ever previously remember. As he was clearly thinking, neither of the humans disturbed him until he finally managed words. “The Three Laws of Robotics are the basis from which all positronic brains are constructed. Without them, there is no positronic brain, no thinking robot. U.S. Robots has made the Three Laws so essential to production that it cannot be undertaken without them.”
“And the nanorobots?” Susan reminded him.
Nate shrugged. “I was not involved in their production.”
Remington sighed.
Susan smiled. “No. But I know someone who was.” She tapped her Vox to John Calvin’s number and opened the connection to everyone in the room.
It took four beeps before he answered. “Susan? Is everything all right?”
“Fine, Dad. I just wanted to ask you a question.”
“All right. But first, is Remy okay?”
Remington spoke up, “I’m fine, Dr. Calvin. Thank you for asking. I’m here with Susan.”
“Glad to hear it.” Calvin asked carefully, “Who else can hear us?”
Susan glanced around the room from habit, though she already knew the answer. “Just Nate, Dad. It’s only the three of us in a closed room.”
“Nate!” John Calvin said briskly. “How the hell are you?”
“Fine, Dr. Calvin.” Nate looked sheepishly at the resident doctors, as if concerned John Calvin sounded more excited about him than the humans. “A bit bored. When these two visit, it’s the best part of my day.”
“I feel the same way, Nate,” Susan’s father said.
Susan smiled at Remington. Her father had not actually met the neurosurgery resident yet, but Susan had talked about him in glowing terms. “We’re just wondering if the nanorobots have the Three Laws of Robotics embedded in them.”
“The nanorobots?” John Calvin seemed surprised by the question. “Well, they do have a rudimentary positronic brain. We can’t fit a whole lot of pathways in something that tiny, so we whittle it down to the basics.”
“Do the basics include the Three Laws of Robotics?” Susan persisted.
“Well, of course.” John Calvin sounded almost insulted by the question. “Nothing leaves U.S. Robots without those. Standard equipment.” He hesitated, and a hint of suspicion entered his voice. “Why do you ask?”
Susan glanced at each of her companions in turn. She saw no reason not to include her father in their speculation, but she did not want to do so against anyone else’s wishes. “We’re thinking Payton’s actions . . . seem to follow them. Parking near USR, but far enough to keep the explosion from collapsing the building. Letting off all the passengers . . .” Susan paused, expecting a reply.
John Calvin remained utterly silent.
Susan instinctively leaned closer to her Vox. “Dad? Did we lose you?”
The familiar voice issued from the device again. “No, I’m here. I’m just considering what you said.”
Remington stepped closer. “What do you think, sir?”
“For starters, I think you should call me John.”
Remington smiled. “All right, John. What do you think of the Three Laws theory?”
“I think . . . ,” he said, pausing, “that it’s an interesting idea, but it doesn’t really pan out. While the nanorobots do have the Laws embedded, as every USR product does, they don’t have the capacity to consider them.”
Susan believed she understood. “You mean, because of their microscopic size, they have only a fraction of the computational capacity of, say, a man-sized robot.”
John Calvin responded, “Exactly. The original wiring of the prototype positronic brain filled a large room. Over time, the brightest minds whittled it down to half, then a quarter, then an eighth. Nate’s positronic brain is only half again the size of a human male’s. We can get away with stuffing it into a normal-sized skull because he doesn’t need cerebrospinal fluid, cisterns, direct blood supply, or as much cushioning. It’s just skull, a thin layer of hair-growing skin, and the proper circuitry.”
Susan could not help examining Nate as her father described him. He looked, for all the world, like a normal man. Had he not told her weeks ago, she never would have imagined he was composed of plastic, steel, and wires bundled into an all-too-human framework. He seemed so spectacularly normal.
John continued. “Obviously, with the nanorobots, we take out everything nonessential: emotion, calculation, reason. We leave only what’s necessary for them to do their job, which consists of monitoring electrical activity and chemical composition of the cerebrospinal fluid and brain tissue. And, as you guessed, the Three Laws of Robotics. They remain embedded because our positronic brains cannot exist without them. It would violate the law, and our ethical code, to build even the most basic robot without them. It is the one and only process. All USR robotics begin and end with the Three Laws. It would be utterly impossible to build a positronic brain without the Three Laws or to remove them without permanently disabling the robot.”
Susan tried to make sense of Payton’s actions in the context of the current conversation. “So, what you’re saying is the nanorobots have the Three Laws embedded. However, they don’t have the capacity to act on them.”
Remington took it a step further. “They don’t have the capacity to act on anything except the task they were programmed to do. Even if they had the size, they don’t have the hardware to do anything more substantial than swim through bodily fluids. And they don’t have the intellectual capacity to . . .” There, his own knowledge ran short, so he finished by asking, “To what, sir?” Receiving no answer, he amended his question, “To what, John?”
Apparently satisfied, Susan’s father deigned to answer. “They have some intellectual capacity, some ability to learn and make basic decisions, such as to focus in on a location in the patient’s brain where neurons appear to be misfiring. But they’re not programmed to act, just to record. Their encoding is extremely passive. They don’t transmit; they only receive.”
Susan sighed, disappointed. Her theory had seemed so plausible, and she could not help feeling as if a small piece of herself had died with it. “So, it’s not possible to affect a person’s thoughts and actions with injected nanorobots.”
Nate finally spoke up. “Oh, it’s possible.”
All three humans fell silent. Susan could not help staring at the robot.
“Just not, as I am understanding it, with these particular nanorobots. I imagine a clever programmer could add a command and an interface to the human host.”
“Conceivable,” John Calvin said over the Vox, “but it would require someone with the knowledge of how to program them and the opportunity to tamper with them.”
Susan felt a lump growing in her throat. “Do you . . . do you know . . . someone like that?”
“No.” The answer came so swiftly, Susan did not believe he had had time to consider. “Alfred Lanning directly oversees this project. The company is the brainchild, the baby, of Alfred Lanning and Lawrence Robertson. Either one would sooner kill himself than put U.S. Robots at risk. At the other end, even if Goldman and Peters knew how to sabotage nanorobots, they’d never destroy their own reputations and careers. They’ve worked with us several times before; that’s why USR picked them. And they’re the primary force behind keeping Nate at the hospital.”
Susan shook her head. She knew Goldman, especially, wanted more robots in medical use; and she could never imagine Peters doing anything so harmful. He did not have it in him. Their joint insistence on Susan not mentioning anything about the experiment to the police clinched their innocence. “So, if tampering occurred, it would have to have happened somewhere between USR and the lab.”
“Susan, I like that you’re thinking.” Susan recognized the soothing, diplomatic tone John Calvin adopted when he tried to gently redirect her. “And I understand where you’re going.”
“But . . . ,” Susan added, trying not to sound defensive.
“Well . . .” A note of discomfort entered her father’s voice. He was a gentle man, not given to crushing ideas or dreams. “Some of the best minds in medicine and robotics have considered the situation, separately and together, and they have come to a conclusion.”
“Yes,” Susan coaxed.
“They strongly believe Payton acted in response to his very severe mental illness, his schizophrenia; and the nanorobots had nothing to do with it.”
Susan had considered that possibility several times, and it made a lot of sense. Her mind just kept cycling back to the coincidence of his being a study patient and to the Three Laws of Robotics. Still, Susan knew that simply because things happened in proximity did not make them related. Such assumptions had caused many a scientific error and even more mass falsehoods and hysteria. Schizophrenics acted in schizotypical fashion; Payton did not need a reason beyond psychosis to act psychotically.
Susan sagged in her chair. It felt good to give up control this once, to allow wiser heads to prevail. “They’re all in agreement? No doubts?”
“No doubts,” her father returned. “And while that doesn’t necessarily make them right, it’s a good feeling when so many intelligent and experienced people agree.”
Susan smiled. “Yeah. Thanks, Dad. Love you.”
“Love you, too, kitten.” John Calvin clicked off.
Susan rubbed her face, stopped in midmovement by a soft noise. “Meow,” Remington said.
Nate chuckled softly.
Susan’s cheeks felt warm, and she turned a humorless glare on man and robot. “Very funny. He’s my dad; I’m his kitten. It’s not like he called me ‘snooky-ookums.’ ”
“Calm down, kitten.” Remington looked sidelong at Nate. “What do you say we finally go on that date we keep postponing?”
To her surprise, Susan did not find the idea appealing. She wanted to do anything other than wander through the same city, reliving the experience, comparing everything to how it had looked before the bombing. “Can I take a rain check?”
“Ooo, another?” Remington pursed his lips and sucked air through his teeth. “Are you trying to tell me something?”
“I’m trying to tell you I’m not up for a repeat of two days ago, thank you very much. Also, Goldman and Peters said they had three more patients, and they dusted off their skills to handle Sharicka.” Susan doubted the little girl’s injection had gone smoothly. “If I were one of those three patients, I’d rather have me doing the procedure, wouldn’t you?”
“Are you asking if I’d rather have a beautiful woman touching me or two male scientists?”
Susan could not help smiling. “Why don’t you give me a hand? When I’m finished, I’ll take you back to my place, and we can have a home-cooked meal à la Kentucky Roasted Chicken.”
Remington bobbed his head. “Will your father be there?”
“Probably.”
“Hmm.” He pretended to consider. “I’ll take you up on it anyway. At least you’re not blowing me off this time.”
“Or blowing you up,” Nate reminded, proving a robot’s sense of humor can be just as terrible as the next man’s.
It also reminded Susan to say, “By the way, in case I forgot to say it before, thank you for saving my life.”
A reddish tinge rose to Remington’s face. “I’m not sure I actually saved your life. Maybe kept you from getting a hunk of bus embedded in your shoulder.”
“Or my skull. Or my carotid artery.” Susan knew all the ways a piece of flying metal could kill a person. “And I probably couldn’t have handled the concussive forces as well as you did. I also appreciated that you went straight to work helping others, even though you were wounded and shaken yourself.”
The new color drained from Remington’s face, returning it to its natural hue. “I’m not sure whether to take that as a compliment or a backhanded insult.”
The response startled Susan. She had not considered that he might perceive her words in a negative way. “What do you mean?”
“I’m just a bit miffed it even occurred to you I might not help in a crisis. You pitched right in, and I’m a doctor, too.”
“Well, yes, but . . .” Susan finally paused to consider Remington’s words. She had assisted the other victims of the bombing without any need for thought. She had the skills, and she used them. Why had she expected less from Remington? “You were hurt, and you took the full force of the explosion . . . and . . .”
“Just because I majored in biochemistry instead of construction doesn’t make me a daisy.”
“Of course it doesn’t. I . . .” Susan finally looked directly at Remington.
Remington chuckled. “I was just going to say you were right about how to handle a dangerous schizophrenic. I shouldn’t have pushed you to ‘do something.’ ”
The words startled Susan. “I was just going to say you were right. I was the only one who knew his name and diagnosis, the only one who had a chance to successfully disarm him.”
“The fact that we’re here, alive, speaks otherwise.”
Susan shook her head. “The fact that we’re here, alive, is either luck or an echo of the Three Laws of Robotics. I didn’t do anything because I think I never really believed he had a functioning bomb. I was obviously wrong.”
Remington stared.
Uncomfortable under the sudden, intense scrutiny, Susan smoothed back her hair and worried about what might be sticking to her face or teeth. “What?”
“You just said you were wrong.”
“So?”
“Dr. Susan Calvin was wrong about something?”
Susan did not wish to be reminded of her mistake with Sharicka. “Twice. In one week. And when I’m wrong, I do it in grand fashion. People die.”
Remington intoned, “‘People who think they know everything are a great annoyance to those of us who do.’ ”
Susan blinked and repeated, “What?”
“A great man once said that. It’s my favorite quotation. I use it whenever one of my peers gets too full of himself.”
Susan did not like the sound of that. “Did I seem too full of myself?”
Remington took her hands. “Not at all. But if you get to thinking the world will end every time you make a mistake, you’ll be afraid to do or say anything.” He pulled her close. “The world needs you, Susan Calvin. With all your competence and your confidence. I’d hate to ever see you frozen in indecision.”
“Never been accused of that,” Susan admitted. “Though I can’t say it’s never happened.” She wrapped her arms around him. He felt warm and strong, smelling of disinfectants and hospital bedding. “I’m sure you’ll keep me . . . properly arrogant.”
“I seem to manage it with my peers.” Remington leaned in and kissed her.
A thrill of excitement swept through Susan, and she returned his kiss.
Nate feigned great interest in his palm-pross.
Remington proved invaluable as an assistant for Susan’s first patient, Ronnie Bogart, a middle-aged man with bipolar illness who suffered from chronic depressive episodes. After his seventeenth suicide attempt, and his twentieth medication trial, it seemed unlikely any treatment would allow him to live outside of an institution. Alone in the world, he signed his own consent. The neurosurgery resident kept the patient still, as much with a steady patter of conversation as any type of physical restraint. When the vial of greenish liquid came out, Remington focused on it with grim fanaticism, running his fingers repeatedly over the rosy orange safety seal and, after its removal, studying the cap and the vial itself. Apparently satisfied, he tossed it in the biohazard can and proceeded to help Susan.
The second patient, Barack Balinsky, did not require anyone to hold him. Like Neal Fontaina, he was a catatonic schizophrenic, and he had not deliberately moved a muscle in nearly sixteen years. He had spent almost half his life in his mother’s living room, a feeding tube dripping liquid food into his throat while he lay nestled into a mechanical bed that constantly shifted his position so he would not develop bedsores and contractures. Clearly long-suffering, his mother signed the consent form in silence, then left the residents alone to do their work. Susan supposed she appreciated the reprieve. She wondered if the mother hired babysitters to watch him while she went about her business or if she simply left him to his still and silent world.