
The Robots of Dawn


by Isaac Asimov


  “Well, then, you can’t burn out a robot’s brain. Is that what you’re saying? Because if you are, what happened to Jander?”

  “It’s not what I’m saying. The increasingly successful systems I speak of are never completely successful. They cannot be. No matter how subtle and intricate a brain might be, there is always some way of setting up a contradiction. That is a fundamental truth of mathematics. It will remain forever impossible to produce a brain so subtle and intricate as to reduce the chance of contradiction to zero. Never quite to zero. However, the systems have been made so close to zero that to bring about a mental freeze-out by setting up a suitable contradiction would require a deep understanding of the particular positronic brain being dealt with—and that would take a clever theoretician.”

  “Such as yourself, Dr. Fastolfe?”

  “Such as myself. In the case of humaniform robots, only myself.”

  “Or no one at all,” said Baley, heavily ironic.

  “Or no one at all. Precisely,” said Fastolfe, ignoring the irony. “The humaniform robots have brains—and, I might add, bodies—constructed in conscious imitation of the human being. The positronic brains are extraordinarily delicate and they take on some of the fragility of the human brain, naturally. Just as a human being may have a stroke, through some chance event within the brain—and without the intervention of any external effect, so a humaniform brain might, through chance alone, the occasional aimless drifting of positrons, go into mental freeze-out.”

  “Can you prove that, Dr. Fastolfe?”

  “I can demonstrate it mathematically, but of those who could follow the mathematics, not all would agree that the reasoning was valid. It involves certain suppositions of my own that do not fit into the accepted modes of thinking in robotics.”

  “And how likely is spontaneous mental freeze-out?”

  “Given a large number of humaniform robots, say a hundred thousand, there is an even chance that one of them might undergo spontaneous mental freeze-out in an average Auroran lifetime. And yet it could happen much sooner, as it did to Jander, although then the odds would be very greatly against it.”

  “But look here, Dr. Fastolfe, even if you were to prove conclusively that a spontaneous mental freeze-out could take place in robots generally, that would not be the same as proving that such a thing happened to Jander in particular at this particular time.”

  “No,” admitted Fastolfe, “you are quite right.”

  “You, the greatest expert in robotics, cannot prove it in the specific case of Jander.”

  “Again, you are quite right.”

  “Then what do you expect me to be able to do, when I know nothing of robotics?”

  “There is no need to prove anything. It would surely be sufficient to present an ingenious suggestion that would make spontaneous mental freeze-out plausible to the general public.”

  “Such as—”

  “I don’t know.”

  Baley said harshly, “Are you sure you don’t know, Dr. Fastolfe?”

  “What do you mean? I have just said I don’t know.”

  “Let me point out something. I assume that Aurorans, generally, know that I have come to the planet for the purpose of tackling this problem. It would be difficult to manage to get me here secretly, considering that I am an Earthman and this is Aurora.”

  “Yes, certainly, and I made no attempt to do that. I consulted the Chairman of the Legislature and persuaded him to grant me permission to bring you here. It is how I’ve managed to win a stay of judgment. You are to be given a chance to solve the mystery before I go on trial. I doubt that they’ll give me a very long stay.”

  “I repeat, then—Aurorans, in general, know I’m here and I imagine they know precisely why I am here—that I am supposed to solve the puzzle of the death of Jander.”

  “Of course. What other reason could there be?”

  “And from the time I boarded the ship that brought me here, you have kept me under close and constant guard because of the danger that your enemies might try to eliminate me, judging me to be some sort of wonderman who just might solve the puzzle in such a way as to place you on the winning side, even though all the odds are against me.”

  “I fear that as a possibility, yes.”

  “And suppose someone who does not want to see the puzzle solved and you, Dr. Fastolfe, exonerated should actually succeed in killing me. Might that not swing sentiment in your favor? Might not people reason that your enemies felt you were, in actual fact, innocent or they would not fear the investigation so much that they would want to kill me?”

  “Rather complicated reasoning, Mr. Baley. I suppose that, properly exploited, your death might be used to such a purpose, but it’s not going to happen. You are being protected and you will not be killed.”

  “But why protect me, Dr. Fastolfe? Why not let them kill me and use my death as a way of winning?”

  “Because I would rather you remained alive and succeeded in actually demonstrating my innocence.”

  Baley said, “But surely you know that I can’t demonstrate your innocence.”

  “Perhaps you can. You have every incentive. The welfare of Earth hangs on your doing so and, as you have told me, your own career.”

  “What good is incentive? If you ordered me to fly by flapping my arms and told me further that if I failed, I would be promptly killed by slow torture and that Earth would be blown up and all its population destroyed, I would have enormous incentive to flap my arms and fly—and yet still be unable to do so.”

  Fastolfe said uneasily, “I know, the chances are small.”

  “You know they are nonexistent,” said Baley violently, “and that only my death can save you.”

  “Then I will not be saved, for I am seeing to it that my enemies cannot reach you.”

  “But you can reach me.”

  “What?”

  “I have the thought in my head, Dr. Fastolfe, that you yourself might kill me in such a way as to make it appear that your enemies have done the deed. You would then use my death against them—and that that is why you have brought me to Aurora.”

  For a moment, Fastolfe looked at Baley with a kind of mild surprise and then, in an excess of passion both sudden and extreme, his face reddened and twisted into a snarl. Sweeping up the spicer from the table, he raised it high and brought his arm down to hurl it at Baley.

  And Baley, caught utterly by surprise, barely managed to cringe back against his chair.

  PART 5.

  DANEEL AND GISKARD

  18

  If Fastolfe had acted quickly, Daneel had reacted far more quickly still.

  To Baley, who had all but forgotten Daneel’s existence, there seemed a vague rush, a confused sound, and then Daneel was standing to one side of Fastolfe holding the spicer, and saying, “I trust, Dr. Fastolfe, that I did not in any way hurt you.”

  Baley noted, in a dazed sort of way, that Giskard was not far from Fastolfe on the other side and that every one of the four robots at the far wall had advanced almost to the dining room table.

  Panting slightly, Fastolfe, his hair quite disheveled, said, “No, Daneel. You did very well, indeed.” He raised his voice. “You all did well, but remember, you must allow nothing to slow you down, even my own involvement.”

  He laughed softly and took his seat once more, straightening his hair with his hand.

  “I’m sorry,” he said, “to have startled you so, Mr. Baley, but I felt the demonstration might be more convincing than any words of mine would have been.”

  Baley, whose moment of cringing had been purely a matter of reflex, loosened his collar and said, with a touch of hoarseness, “I’m afraid I expected words, but I agree the demonstration was convincing. I’m glad that Daneel was close enough to disarm you.”

  “Any one of them was close enough to disarm me, but Daneel was the closest and got to me first. He got to me quickly enough to be gentle about it. Had he been farther away, he might have had to wrench my arm or even knock me out.”

  “Would he have gone that far?”

  “Mr. Baley,” said Fastolfe. “I have given instructions for your protection and I know how to give instructions. They would not have hesitated to save you, even if the alternative was harm to me. They would, of course, have labored to inflict minimum harm, as Daneel did. All he harmed was my dignity and the neatness of my hair. And my fingers tingle a bit.” Fastolfe flexed them ruefully.

  Baley drew a deep breath, trying to recover from that short period of confusion. He said, “Would not Daneel have protected me even without your specific instruction?”

  “Undoubtedly. He would have had to. You must not think, however, that robotic response is a simple yes or no, up or down, in or out. It is a mistake the layman often makes. There is the matter of speed of response. My instructions with regard to you were so phrased that the potential built up within the robots of my establishment, including Daneel, is abnormally high, as high as I can reasonably make it, in fact. The response, therefore, to a clear and present danger to you is extraordinarily rapid. I knew it would be and it was for that reason that I could strike out at you as rapidly as I did—knowing I could give you a most convincing demonstration of my inability to harm you.”

  “Yes, but I don’t entirely thank you for it.”

  “Oh, I was entirely confident in my robots, especially Daneel. It did occur to me, though, a little too late, that if I had not instantly released the spicer, he might, quite against his will, or the robotic equivalent of will, have broken my wrist.”

  Baley said, “It occurs to me that it was a foolish risk for you to have undertaken.”

  “It occurs to me, as well—after the fact. Now if you had prepared yourself to hurl the spicer at me, Daneel would have at once countered your move, but not with quite the same speed, for he has received no special instructions as to my safety. I can hope he would have been fast enough to save me, but I’m not sure—and I would prefer not to test that matter.” Fastolfe smiled genially.

  Baley said, “What if some explosive device were dropped on the house from some airborne vehicle?”

  “Or if a gamma beam were trained upon us from a neighboring hilltop.—My robots do not represent infinite protection, but such radical terrorist attempts are unlikely in the extreme here on Aurora. I suggest we do not worry about them.”

  “I am willing not to worry about them. Indeed, I did not seriously suspect that you were a danger to me, Dr. Fastolfe, but I needed to eliminate the possibility altogether if I were to continue. We can now proceed.”

  Fastolfe said, “Yes, we can. Despite this additional and very dramatic distraction, we still face the problem of proving that Jander’s mental freeze-out was spontaneous chance.”

  But Baley had been made aware of Daneel’s presence and he now turned to him and said uneasily, “Daneel, does it pain you that we discuss this matter?”

  Daneel, who had deposited the spicer on one of the farther of the empty tables, said, “Partner Elijah, I would prefer that past-friend Jander were still operational, but since he is not and since he cannot be restored to proper functioning, the best of what is left is that action be taken to prevent similar incidents in the future. Since the discussion now has that end in view, it pleases rather than pains me.”

  “Well, then, just to settle another matter, Daneel, do you believe that Dr. Fastolfe is responsible for the end of your fellow-robot Jander?—You’ll pardon my inquiring, Dr. Fastolfe?”

  Fastolfe gestured his approval and Daneel said, “Dr. Fastolfe has stated that he was not responsible, so he, of course, was not.”

  “You have no doubts on the matter, Daneel?”

  “None, Partner Elijah.”

  Fastolfe seemed a little amused. “You are cross-examining a robot, Mr. Baley.”

  “I know that, but I cannot quite think of Daneel as a robot and so I have asked.”

  “His answers would have no standing before any Board of Inquiry. He is compelled to believe me by his positronic potentials.”

  “I am not a Board of Inquiry, Dr. Fastolfe, and I am clearing out the underbrush. Let me go back to where I was. Either you burned out Jander’s brain or it happened by random circumstance. You assure me that I cannot prove random circumstance and that leaves me only with the task of disproving any action by you. In other words, if I can show that it is impossible for you to have killed Jander, we are left with random circumstance as the only alternative.”

  “And how can you do that?”

  “It is a matter of means, opportunity, and motive. You had the means of killing Jander—the theoretical ability to so manipulate him that he would end in a mental freeze-out. But did you have the opportunity? He was your robot, in that you designed his brain paths and supervised his construction, but was he in your actual possession at the time of the mental freeze-out?”

  “No, as a matter of fact. He was in the possession of another.”

  “For how long?”

  “About eight months—or a little over half of one of your years.”

  “Ah. It’s an interesting point. Were you with him—or near him—at the time of his destruction? Could you have reached him? In short, can we demonstrate the fact that you were so far from him—or so out of touch with him—that it is not reasonable to suppose that you could have done the deed at the time it is supposed to have been done?”

  Fastolfe said, “That, I’m afraid, is impossible. There is a rather broad interval of time during which the deed might have been done. There are no robotic changes after destruction equivalent to rigor mortis or decay in a human being. We can only say that, at a certain time, Jander was known to be in operation and, at a certain other time, he was known not to be in operation. Between the two was a stretch of about eight hours. For that period, I have no alibi.”

  “None?—During that time, Dr. Fastolfe, what were you doing?”

  “I was here, in my establishment.”

  “Your robots were surely aware that you were here and could bear witness.”

  “They were certainly aware, but they cannot bear witness in any legal sense and on that day Fanya was off on business of her own.”

  “Does Fanya share your knowledge of robotics, by the way?”

  Fastolfe indulged in a wry smile. “She knows less than you do.—Besides, none of this matters.”

  “Why not?”

  Fastolfe’s patience was clearly beginning to stretch to the cracking point. “My dear Mr. Baley, this was not a matter of close-range physical assault, such as my recent pretended attack on you. What happened to Jander did not require my physical presence. As it happens, although not actually in my establishment, Jander was not far away geographically, but it wouldn’t have mattered if he were on the other side of Aurora. I could always reach him electronically and could, by the orders I gave him and the responses I could educe, send him into mental freeze-out. The crucial step would not even necessarily require much in the way of time—”

  Baley said at once, “It’s a short process, then, one that someone else might move through by chance, while intending something perfectly routine?”

  “No!” said Fastolfe. “For Aurora’s sake, Earthman, let me talk. I’ve already told you that’s not the case. Inducing mental freeze-out in Jander would be a long and complicated and tortuous process, requiring the greatest understanding and wit, and could be done by no one accidentally, without incredible and long-continued coincidence. There would be far less chance of accidental progress over that enormously complex route than of spontaneous mental freeze-out, if my mathematical reasoning were only accepted.

  “However, if I wished to induce mental freeze-out, I could carefully produce changes and reactions, little by little, over a period of weeks, months, even years, until I had brought Jander to the very point of destruction. And at no time in that process would he show any signs of being at the edge of catastrophe, just as you could approach closer and closer to a precipice in the dark and yet feel no loss in firmness of footing whatever, even at the very edge. Once I had brought him to the very brink, however—the lip of the precipice—a single remark from me would send him over. It is that final step that would take but a moment of time. You see?”

  Baley tightened his lips. There was no use trying to mask his disappointment. “In short, then, you had the opportunity.”

  “Anyone would have had the opportunity. Anyone on Aurora, provided he or she had the necessary ability.”

  “And only you have the necessary ability.”

  “I’m afraid so.”

  “Which brings us to motive, Dr. Fastolfe.”

  “And it’s there that we might be able to make a good case. These humaniform robots are yours. They are based on your theory and you were involved in their construction at every step of the way, even if Dr. Sarton supervised that construction. They exist because of you and only because of you. You have spoken of Daneel as your ‘first-born.’ They are your creations, your children, your gift to humanity, your hold on immortality.” (Baley felt himself growing eloquent and, for a moment, imagined himself to be addressing a Board of Inquiry.) “Why on Earth—or Aurora, rather—why on Aurora should you undo this work? Why should you destroy a life you have produced by a miracle of mental labor?”

  Fastolfe looked wanly amused. “Why, Mr. Baley, you know nothing about it. How can you possibly know that my theory was the result of a miracle of mental labor? It might have been the very dull extension of an equation that anyone might have accomplished but which none had bothered to do before me.”

  “I think not,” said Baley, endeavoring to cool down. “If no one but you can understand the humaniform brain well enough to destroy it, then I think it likely that no one but you can understand it well enough to create it. Can you deny that?”

 
