
Mirage (isaac asimov's robot mystery)


by Mark W. Tiedemann


  "Convention space," she said. "There's going to be a trade show of Spacer manufactures. That's what I do, I'm the commerce liaison with the Auroran Embassy."

  "Convention-? There aren't any convention facilities in this area, Ambassador, and that was a private warehouse-"

  "But it's called Convention Center District on all the maps."

  "Yes, it is," the cop said agreeably. "But there really aren't any convention facilities there."

  "So why is it called that if that's not what it contains?"

  "It's always been called that," the other cop said. "You must be new on Earth. "

  "So it must be a practical joke, is that it? Leave outdated names on all the maps so the new tourists end up over their head in a bad neighborhood. Of course, that's a joke, too, since the Terran Tourism and Visitors Department swears there are no bad areas in D. C. or in any other major urban center on the planet."

  "Tourism doesn't ask the police, Ms. Sorry."

  "Someone should ask someone," Ariel went on, her voice acquiring an edge now.

  Derec watched appreciatively as she gradually amplified her rant, over the course of the several kilometers back to the Auroran embassy, irritating and then enraging the two police officers who had, in fairness, just done their jobs. She picked on every explanation they offered until they clammed up and gave her only monotone answers and clearly could not wait to get her out of their cruiser. They had not arrested them because Ariel had her embassy ID on her and had convinced them that she had had business at that warehouse. Derec still had the crate, secured by the same explanation, though the policemen were clearly not happy about it. They were not willing to risk the trouble, though, in arresting an ambassador. Now she had gotten them to the point where they cared about nothing other than returning them to the Auroran embassy, which they had been more than willing to do, no doubt under orders to make sure no more Spacers got harmed or killed in the aftermath of Union Station. Ariel took advantage of that to so thoroughly outrage them with her petty slurs against Earth that they did not bother asking for Derec's ID, nor did they ask the questions that would have opened the door to answers Ariel did not wish to give.

  On the landing pad, fifth floor of the embassy, they climbed out of the transport. Ariel strode off in a huff. Derec looked in at the two cops.

  "Thanks, I really appreciate it," he said.

  The nearer one glared at him. "Your boss needs to learn a little circumspection."

  Derec shrugged. "Well…"

  "Have a good day, sir."

  They lifted off and Derec staggered back from the wash of hot, compressed air. He wondered how much of a report they would file.

  Derec caught up with Ariel at the elevator.

  "That was-"

  She shot him a look. "Unnecessary?"

  "-masterful."

  She frowned briefly, then laughed. "Those poor…"

  The elevator door opened, letting three people out. Ariel did not finish her thought.

  "All the log recorded was date, destination, and distance," Derec announced. "It was used once in the last eleven months."

  "Union Station?" Mia asked.

  "Correct. And directly back to that garage. There's no record of the driver, the medical technicians, or the patient."

  "No surprises, then," Ariel said.

  "But it's confirmation," Mia said, nodding. She pointed at the crate on the end of the table. "What's in that?"

  Derec felt inexplicably reluctant to open it. He looked at Ariel, his heart pounding. "Do you still have that key?"

  Ariel handed him the seal key from the warehouse. Derec switched it on and ran it along the seam. The crate lid popped open.

  Nestled within padding lay a plastic-wrapped mound of silver-and-gold webbing, wrapped tightly and mingled with darker nodes.

  "Damn," Ariel hissed.

  "What is it?" Mia asked.

  "A positronic brain," Derec said. "Absolute contraband."

  "Close it up," Ariel said.

  Derec complied, then looked at Mia. "You called the police, didn't you?"

  Mia nodded. "Something came up. I had Bogard issue a dispatch through their channels. Not much, just a cruiser to go look-see."

  "Bogard is just full of tricks," Ariel said icily. "Pity it couldn't do its primary job as well."

  Derec looked at her. She was staring at Bogard, arms folded, an expression of unconcealed resentment on her face.

  "You sent me a message," Derec said, "after the Incident. You said 'I see you got your wish. ' What did you mean by that?"

  "You knew it was from me. Can't you figure out what it means?"

  "Eliton's death was-"

  "Beside the point. You got what you wanted by being able to create a dangerous robot. I don't think you wanted to kill Eliton. I think you wanted to build robots, any way you could, any way you wanted."

  "How does that follow?" Derec asked. "With Eliton's death, no one will be able to build robots on Earth."

  "I don't think it matters. Someone will hire you to build bodyguards now, no matter what."

  "Excuse me, but I failed to do that."

  Ariel waved her hand dismissively. "Glitches. No one on Earth would buy it, but Spacers understand that prototypes always have bugs to be worked out. It brought back three of the assassins. That's the point that won't be missed. You've got your opportunity to build your special positronics now. More leeway, more freedom of action, more humanlike. Hell, they'll even make mistakes."

  "Ariel-"

  "Pardon me," Mia said. "I feel like I've come in at the tail end of a very complicated argument."

  "Derec and I disagree fundamentally over Bogard. What it represents."

  "I gathered that much. Why?"

  "Derec's robot here is the product of an attempt to circumvent Three Law programming-"

  "That's a complete mischaracterization!" Derec shouted. "You never understood what I was after!"

  "Really? Tell me something-why is that robot still functioning?"

  "What? I don't-"

  "It failed," Ariel snapped. "It let a human in its care die. It stood witness to dozens of fatalities and injuries. It should be a mass of collapsed positronic gelatin. Instead, it is fully functional."

  "What good would it be if it had collapsed?"

  "It would be inert. It would pose no further threat."

  "What threat?"

  "The threat of negligence!"

  "It wasn't negligent! Look at how it performed for Mia."

  "Then why is Eliton dead?"

  "We don't know he's dead!"

  "As far as Bogard is concerned, he is! Why?"

  Derec did not know. Of everything he had intended in designing and building Bogard, that was precisely the thing which ought never to have happened. He looked at the unmoving, unmoved machine and wondered what had gone so profoundly wrong that it had allowed the human it was programmed expressly to protect to die.

  "Would someone please explain this to me?" Mia asked.

  "Derec built it, let him try," Ariel said in disgust.

  "Under normal circumstances…" Derec started. His throat caught, and he coughed. "Normal circumstances… whatever that means… the Three Laws are built into every positronic brain, part of the core template. They represent the First Principles for a robot, the foundation on which all its subsequent learning and experience rests. The initial designers set it up so that almost all the secondary programming requires the presence of those laws in order to function. It would require a complete redesign of all the manufacturing methods as well as the basic pathways themselves to build a brain without the Three Laws. They aren't just hardwired into the brain, they are basic to the processes of constructing one."

  "Is that what you did? Redesign everything?"

  "No. I'm not that good. Nor am I that irresponsible. All I did was set in place a new set of parameters for the application of the Laws."

  "You circumvented them," Ariel snapped.

  "I did not. I took an accepted st
andard practice and stretched it."

  "What practice?" Mia asked.

  "Setting conditions of when the robot perceives that it is responsible for a violation," Derec explained. "Think about it. According to the First Law, if a robot followed it absolutely, all robots would collapse. 'A robot may not injure a human being, or, through inaction, allow a human being to come to harm. ' Consider that as an absolute. Human beings are always coming to harm. Time and distance alone guarantee that a robot can't act to prevent that harm in all cases. If some kind of buffer zone, a hierarchical response, weren't in place, the instant robots realized how many humans came to harm because they weren't doing something to prevent it, we would have no functional robot population. So we establish a reasonable limitation for its application. If a robot does nothing to stop the human it is standing right next to from being killed-say, by a speeding vehicle, out of control-then it fails and collapse follows. If that same robot can do nothing to prevent the same fate happening to a human a kilometer away from it, the only way it fails is if another human forcefully asserts that it was at fault. It does not automatically perceive itself as responsible."

  Mia nodded. "That makes sense. "

  "Practical engineering," Derec said. "But then we run into a functional brick wall when it comes to certain tasks. Law enforcement, for one. Most positronic brains cannot cope with violent crime. A few have been programmed to dissociate under very specific circumstances so that, say, witnessing dead humans at a crime scene won't create a Three Law crisis. But they still can't make arrests because that offers the potential of harm to a human being."

  "But a criminal-" Mia began.

  "Is still a human being," Derec insisted. "What a human has done by violating a law does not mitigate the robot's adherence to the Three Laws."

  "Unless you redefine harm," Ariel said. "Which is how you got around the Three Law imperative."

  "That's an oversimplification," Derec replied. "What I redefined was the sphere of responsibility. Bogard is just as committed to the Three Laws as any other robot, but it has a broader definition of that commitment. It can make the determination that limiting a human's freedom action, even if it results in a degree of harm-bruises or strained muscles, for instance-may prevent harm from coming to other humans."

  "You're telling me you gave it a moral barometer?" Ariel demanded.

  "Sort of. It relies on the human to which it is assigned to make that determination. It also recognizes a human prerogative to go in harm's way should circumstances require risk to prevent further harm."

  "And when confronted with a clear case of Three Law violation?"

  "It has memory buffers and a failsafe that shunts the data out of the primary positronic matrix. It prevents positronic collapse and allows for the opportunity for further evaluation. In a proper lab debriefing, the cause-and-effect of a situation can be properly explained and set in context. The robot can be reset and returned to duty."

  "You gave it selective amnesia," Ariel said. "It can allow a human to come to harm and still function because after the fact it doesn't remember doing it."

  "That's why Bogard left data out of its report," Mia said.

  "That's why I have to take Bogard back to Phylaxis to debrief it."

  Mia nodded thoughtfully. "So why don't you approve, Ariel?"

  "A robot is a machine," she said. "A very powerful machine. It is intelligent, it can make decisions. I want them inextricably joined to the Three Laws so that they can never-never-circumvent their concern for my safety. If they fail to protect me, I want them shut down. I don't want them thinking it over. I don't want to ever be considered a secondary or tertiary concern by a robot who may decide that I ought to be sacrificed for the good of the many. Or of a specific individual. I think loosening the bonds like this can only lead to operational conflicts that will result in unnecessary harm."

  "That's the only way to construct a robot bodyguard, though," Derec said.

  "There should be no such thing, then!" Ariel shouted. "It didn't work! Somewhere in its sloppy brain it made a decision and sacrificed Senator Eliton! Explain it to me how that was for anyone's greater good!"

  Derec stared at her, ashamed. He could think of no answer to give her. In fact, he had no answer for himself.

  Twenty-One

  Mia watched the argument escalate, amazed at Ariel. She had always seen her friend as impatient but controlled, usually even-tempered, never enraged and irrational. But this was a side of Ariel with which Mia had no experience. The unreasoned hatred she directed at Bogard reminded Mia more of an anti-robot fanatic than of a Spacer who ought to be at ease with robots.

  "Ariel-" Derec said tightly, obviously reining in his own anger.

  Ariel left the room.

  Derec closed his eyes, leaning back in his chair.

  "You two have known each other a long time?" Mia asked.

  Derec gave a wan smile. "Too long, I sometimes think. In a way, I've known her all my life."

  "You're not-"

  "Related? No. It's just I-we-both had amnemonic plague. Burundi's Fever. We've been complete amnesiacs. When I had my bout, Ariel was the first human I came into contact with."

  "And you were with her when she had hers?"

  Derec nodded.

  "So… why don't you explain this dispute to me. I didn't understand half of what you were talking about."

  Derec drew a deep breath, clearly uncomfortable. "Well. I started investigating the way positronic memory works, especially in the aftermath of collapse. Sometimes you can recover a collapsed positronic brain-not often, but it can happen. There's something… unpredictable… in the way they collapse. I was curious about that."

  "Having been an amnesiac have anything to do with this?"

  "More than a little. What differs between human and robot is in the way we're locked into our perceptual realities. The way we interface with the world. Humans have a plasticity robots lack-we can indulge fiction, for instance, and know the difference, even when it's a full-sensory entertainment that is designed to mimic reality in the finest detail. A robot can't do that. Tell its senses that what it is perceiving is 'real,' and it acts upon that stimulus. It can't make the intuitive distinction. If what it perceives causes a conflict with its Three Law imperatives, collapse is likely unless quickly resolved."

  "Even a fictional crisis?" Mia asked.

  Derec nodded. "Exactly. Convince a robot a lie is real, and it has no way to treat the lie as a conditional reality pending further data, like a human does. Now in either case, unacceptable realities can cause breakdowns. Humans still suffer nervous collapses, psychotic amnesia, reactive psychoses-a variety of disorders in which the brain tries to deal with an emotional or physical shock that the mind cannot accept. It happens faster and under more concrete conditions to a robot. But in the case of humans, the attempted resolution is also an attempt to circumvent the trauma to allow the organism to continue functioning."

  "Amnesia victims can still carry on living even if they can't remember who they are or where they came from."

  "Simply put, yes. I wanted to see if some sort of the same mechanism could be duplicated in a positronic brain."

  Mia looked over at Bogard. "It seems you succeeded."

  "Not completely. What I established was a bypass, true. The memory is still there, but inaccessible to the primary matrix. I shunted it over to a buffer. Eventually, it has to be dealt with or Bogard will start suffering from diagnostic neurosis."

  "What's that?"

  "It's what I tried to explain to you before. A positronic brain runs a self-diagnostic every few hours. At some point, Bogard's diagnostic will register the absence of a specific memory node as a chronic problem. It won't be able to fix it, so Bogard will start feeling the need to be serviced. It can impair function."

  Mia felt a ripple of anxiety. She still did not want to release Bogard. "So tell me why Ariel doesn't like this."

  "Anything that tampers with the full function of the T
hree Laws she sees as a step away from heresy," Derec said. "In her view, by giving Bogard the ability to continue functioning in the wake of a Three Law conflict that should shut it down, I've created a monster." He grunted. "It was her work that gave me the direction to go, though."

  "How's that?"

  "Her doctoral thesis from Calvin. 'Three Law Conflict Under Alternative Concretizations. ' Basically, she proposed the possibility of an informational loop that is created when a robot has incomplete data which strongly suggests the necessity of action." Derec frowned. "Unfortunately, it can't make the determination of which kind of action because the information is incomplete. It starts running probability scenarios, to fill in-basically by Occam's Razor-the blanks in its information so it can make a decision. But it still can't. It can theoretically create its own delusional scenario wherein collapse is imminent based on an unreal situation. One of Ariel's inferences was that a positronic brain could be lied to on a fundamental level and thus create a false standard of reality for it. The hierarchical response to perception would be distorted. And it would be stuck in it, the loop causing a cascade of alternative perceptions."

  "How would you do that? Just walk up to it and say 'By the way, black is white, and people can fly'?"

  "No, the hardwiring prevents the brain from accepting that kind of input. It would have to be a more direct interference, like a virus that could change pathways in the brain structure. Something that would directly affect the positronic pathways themselves."

  "Doesn't that describe what happened to the RI at Union Station?"

  Derec looked worriedly at her. "Yes. That's what I wanted to see by doing the physical inspection. There's evidence of a direct intervention at certain sensory nodes, but we can't tell which ones they were."

  "Could anyone have accessed your research?"

  Derec shook his head, but Mia saw uncertainty in his face. The notion had occurred to him, but he did not want to give it too much consideration.

  "If I understand everything you've told me so far," Mia continued, "that means that Bogard could only have malfunctioned if it had been ordered to do so. If it had been given a set of operational parameters that allowed it to perceive reality a little bit differently."

 
