
Asimov’s Future History Volume 4


by Isaac Asimov


  “Are you certain of all that? Really certain?”

  “Completely.”

  “And you stated so publicly?”

  “Of course. There was a public inquiry, my dear Earthman. I was asked the questions you are now asking and I answered truthfully. It is an Auroran custom to do so.”

  Baley said, “I do not, at the moment, question that you were convinced you were answering truthfully. But might you not have been swayed by a natural pride in yourself? That might also be typically Auroran, might it not?”

  “You mean that my anxiety to be considered the best would make me willingly put myself in a position where everyone would be forced to conclude I had mentally frozen Jander?”

  “I picture you, somehow, as content to have your political and social status destroyed, provided your scientific reputation remained intact.”

  “I see. You have an interesting way of thinking, Mr. Baley. This would not have occurred to me. Given a choice between admitting I was second-best and admitting I was guilty of, to use your phrase, a roboticide, you are of the opinion I would knowingly accept the latter.”

  “No, Dr. Fastolfe, I do not wish to present the matter quite so simplistically. Might it not be that you deceive yourself into thinking you are the greatest of all roboticists and that you are completely unrivaled, clinging to that at all costs, because you unconsciously–unconsciously, Dr. Fastolfe–realize that, in fact, you are being overtaken–or have even already been overtaken–by others?”

  Fastolfe laughed, but there was an edge of annoyance in it. “Not so, Mr. Baley. Quite wrong.”

  “Think, Dr. Fastolfe! Are you certain that none of your roboticist colleagues can approach you in brilliance?”

  “There are only a few who are capable of dealing at all with humaniform robots. Daneel’s construction created virtually a new profession for which there is not even a name–humaniformicists, perhaps. Of the theoretical roboticists on Aurora, not one, except for myself, understands the workings of Daneel’s positronic brain. Dr. Sarton did, but he is dead–and he did not understand it as well as I do. The basic theory is mine.”

  “It may have been yours to begin with, but surely you can’t expect to maintain exclusive ownership. Has no one learned the theory?”

  Fastolfe shook his head firmly. “Not one. I have taught no one and I defy any other living roboticist to have developed the theory on his own.”

  Baley said, with a touch of irritation, “Might there not be a bright young man, fresh out of the university, who is cleverer than anyone yet realizes, who–”

  “No, Mr. Baley, no. I would have known such a young man. He would have passed through my laboratories. He would have worked with me. At the moment, no such young man exists. Eventually, one will; perhaps many will. At the moment, none!”

  “If you died, then, the new science dies with you?”

  “I am only a hundred and sixty-five years old. That’s metric years, of course, so it is only a hundred and twenty-four of your Earth years, more or less. I am still quite young by Auroran standards and there is no medical reason why my life should be considered even half over. It is not entirely unusual to reach an age of four hundred years–metric years. There is yet plenty of time to teach.”

  They had finished eating, but neither man made any move to leave the table. Nor did any robot approach to clear it. It was as though they were transfixed into immobility by the intensity of the back and forth flow of talk.

  Baley’s eyes narrowed. He said, “Dr. Fastolfe, two years ago I was on Solaria. There I was given the clear impression that the Solarians were, on the whole, the most skilled roboticists in all the worlds.”

  “On the whole, that’s probably true.”

  “And not one of them could have done the deed?”

  “Not one, Mr. Baley. Their skill is with robots who are, at best, no more advanced than my poor, reliable Giskard. The Solarians know nothing of the construction of humaniform robots.”

  “How can you be sure of that?”

  “Since you were on Solaria, Mr. Baley, you know very well that Solarians can approach each other with only the greatest of difficulty, that they interact by trimensional viewing–except where sexual contact is absolutely required. Do you think that any of them would dream of designing a robot so human in appearance that it would activate their neuroses? They would so avoid the possibility of approaching him, since he would look so human, that they could make no reasonable use of him.”

  “Might not a Solarian here or there display a surprising tolerance for the human body? How can you be sure?”

  “Even if a Solarian could, which I do not deny, there are no Solarian nationals on Aurora this year.”

  “None?”

  “None! They do not like to be thrown into contact even with Aurorans and, except on the most urgent business, none will come here–or to any other world. Even in the case of urgent business, they will come no closer than orbit and then they deal with us only by electronic communication.”

  Baley said, “In that case, if you are–literally and actually–the only person in all the worlds who could have done it, did you kill Jander?”

  Fastolfe said, “I cannot believe that Daneel did not tell you I have denied this deed.”

  “He did tell me so, but I want to hear it from you.”

  Fastolfe crossed his arms and frowned. He said, through clenched teeth, “Then I’ll tell you so. I did not do it.”

  Baley shook his head. “I believe you believe that statement.”

  “I do. And most sincerely. I am telling the truth. I did not kill Jander.”

  “But if you did not do it, and if no one else can possibly have done it, then–But wait. I am, perhaps, making an unwarranted assumption. Is Jander really dead or have I been brought here under false pretenses?”

  “The robot is really destroyed. It will be quite possible to show him to you, if the Legislature does not bar my access to him before the day is over–which I don’t think they will do.”

  “In that case, if you did not do it, and if no one else could possibly have done it, and if the robot is actually dead–who committed the crime?”

  Fastolfe sighed. “I’m sure Daneel told you what I have maintained at the inquiry–but you want to hear it from my own lips.”

  “That is right, Dr. Fastolfe.”

  “Well, then, no one committed the crime. It was a spontaneous event in the positronic flow along the brain paths that set up the mental freeze-out in Jander.”

  “Is that likely?”

  “No, it is not. It is extremely unlikely–but if I did not do it, then that is the only thing that can have happened.”

  “Might it not be argued that there is a greater chance that you are lying than that a spontaneous mental freeze-out took place?”

  “Many do so argue. But I happen to know that I did not do it and that leaves only the spontaneous event as a possibility.”

  “And you have had me brought here to demonstrate–to prove–that the spontaneous event did, in fact, take place?”

  “Yes.”

  “But how does one go about proving the spontaneous event? Only by proving it, it seems, can I save you, Earth, and myself.”

  “In order of increasing importance, Mr. Baley?”

  Baley looked annoyed. “Well, then, you, me, and Earth.”

  “I’m afraid,” said Fastolfe, “that after considerable thought, I have come to the conclusion that there is no way of obtaining such a proof.”

  17.

  BALEY STARED AT Fastolfe in horror. “No way?”

  “No way. None.” And then, in a sudden fit of apparent abstraction, he seized the spicer and said, “You know, I am curious to see if I can still do the triple genuflection.”

  He tossed the spicer into the air with a calculated flip of the wrist. It somersaulted and, as it came down, Fastolfe caught the narrow end on the side of his right palm (his thumb tucked down). It went up slightly and swayed and was caught on the side of the left palm. It went up again in reverse and was caught on the side of the right palm and then again on the left palm. After this third genuflection, it was lifted with sufficient force to produce a flip. Fastolfe caught it in his right fist, with his left hand nearby, palm upward. Once the spicer was caught, Fastolfe displayed his left hand and there was a fine sprinkling of salt in it.

  Fastolfe said, “It is a childish display to the scientific mind and the effort is totally disproportionate to the end, which is, of course, a pinch of salt, but the good Auroran host is proud of being able to put on a display. There are some experts who can keep the spicer in the air for a minute and a half, moving their hands almost more rapidly than the eye can follow.

  “Of course,” he added thoughtfully, “Daneel can perform such actions with greater skill and speed than any human. I have tested him in this manner in order to check on the workings of his brain paths, but it would be totally wrong to have him display such talents in public. It would needlessly humiliate human spicists–a popular term for them, you understand, though you won’t find it in dictionaries.”

  Baley grunted.

  Fastolfe sighed. “But we must get back to business.”

  “You brought me through several parsecs of space for that purpose.”

  “Indeed, I did.–Let us proceed!”

  Baley said, “Was there a reason for that display of yours, Dr. Fastolfe?”

  Fastolfe said, “Well, we seem to have come to an impasse. I’ve brought you here to do something that can’t be done. Your face was rather eloquent and, to tell you the truth, I felt no better. It seemed, therefore, that we could use a breathing space. And now–let us proceed.”

  “On the impossible task?”

  “Why should it be impossible for you, Mr. Baley? Your reputation is that of an achiever of the impossible.”

  “The hyperwave drama? You believe that foolish distortion of what happened on Solaria?”

  Fastolfe spread his arms. “I have no other hope.”

  Baley said, “And I have no choice. I must continue to try; I cannot return to Earth a failure. That has been made clear to me.–Tell me, Dr. Fastolfe, how could Jander have been killed? What sort of manipulation of his mind would have been required?”

  “Mr. Baley, I don’t know how I could possibly explain that, even to another roboticist, which you certainly are not, and even if I were prepared to publish my theories, which I certainly am not. However, let me see if I can’t explain something.–You know, of course, that robots were invented on Earth.”

  “Very little concerning robotics is dealt with on Earth–”

  “Earth’s strong antirobot bias is well-known on the Spacer worlds.”

  “But the Earthly origin of robots is obvious to any person on Earth who thinks about it. It is well-known that hyperspatial travel was developed with the aid of robots and, since the Spacer worlds could not have been settled without hyperspatial travel, it follows that robots existed before settlement had taken place and while Earth was still the only inhabited planet. Robots were therefore invented on Earth by Earthpeople.”

  “Yet Earth feels no pride in that, does it?”

  “We do not discuss it,” said Baley shortly.

  “And Earthpeople know nothing about Susan Calvin?”

  “I have come across her name in a few old books. She was one of the early pioneers in robotics.”

  “Is that all you know of her?”

  Baley made a gesture of dismissal. “I suppose I could find out more if I searched the records, but I have had no occasion to do so.”

  “How strange,” said Fastolfe. “She’s a demigod to all Spacers, so much so that I imagine that few Spacers who are not actually roboticists think of her as an Earthwoman. It would seem a profanation. They would refuse to believe it if they were told that she died after having lived scarcely more than a hundred metric years. And yet you know her only as an early pioneer.”

  “Has she got something to do with all this, Dr. Fastolfe?”

  “Not directly, but in a way. You must understand that numerous legends cluster about her name. Most of them are undoubtedly untrue, but they cling to her, nonetheless. One of the most famous legends–and one of the least likely to be true–concerns a robot manufactured in those primitive days that, through some accident on the production lines, turned out to have telepathic abilities–”

  “What!”

  “A legend! I told you it was a legend–and undoubtedly untrue! Mind you, there is some theoretical reason for supposing this might be possible, though no one has ever presented a plausible design that could even begin to incorporate such an ability. That it could have appeared in positronic brains as crude and simple as those in the prehyperspatial era is totally unthinkable. That is why we are quite certain that this particular tale is an invention. But let me go on anyway, for it points out a moral.”

  “By all means, go on.”

  “The robot, according to the tale, could read minds. And when asked questions, he read the questioner’s mind and told the questioner what he wanted to hear. Now the First Law of Robotics states quite clearly that a robot may not injure a human being or, through inaction, allow a person to come to harm, but to robots generally that means physical harm. A robot who can read minds, however, would surely decide that disappointment or anger or any violent emotion would make the human being feeling those emotions unhappy and the robot would interpret the inspiring of such emotions under the heading of ‘harm.’ If, then, a telepathic robot knew that the truth might disappoint or enrage a questioner or cause that person to feel envy or unhappiness, he would tell a pleasing lie, instead. Do you see that?”

  “Yes, of course.”

  “So the robot lied even to Susan Calvin herself. The lies could not long continue, for different people were told different things that were not only inconsistent among themselves but unsupported by the gathering evidence of reality, you see. Susan Calvin discovered she had been lied to and realized that those lies had led her into a position of considerable embarrassment. What would have disappointed her somewhat to begin with had now, thanks to false hopes, disappointed her unbearably.–You never heard the story?”

  “I give you my word.”

  “Astonishing! Yet it certainly wasn’t invented on Aurora, for it is equally current on all the worlds.–In any case, Calvin took her revenge. She pointed out to the robot that, whether he told the truth or told a lie, he would equally harm the person with whom he dealt. He could not obey the First Law, whatever action he took. The robot, understanding this, was forced to take refuge in total inaction. If you want to put it colorfully, his positronic pathways burned out. His brain was irrecoverably destroyed. The legend goes on to say that Calvin’s last word to the destroyed robot was ‘Liar!’”

  Baley said, “And something like this, I take it, was what happened to Jander Panell. He was faced with a contradiction in terms and his brain burned out?”

  “It’s what appears to have happened, though that is not as easy to bring about as it would have been in Susan Calvin’s day. Possibly because of the legend, roboticists have always been careful to make it as difficult as possible for contradictions to arise. As the theory of positronic brains has grown more subtle and as the practice of positronic brain design has grown more intricate, increasingly successful systems have been devised to have all situations that might arise resolve into nonequality, so that some action can always be taken that will be interpreted as obeying the First Law.”

  “Well, then, you can’t burn out a robot’s brain. Is that what you’re saying? Because if you are, what happened to Jander?”

  “It’s not what I’m saying. The increasingly successful systems I speak of are never completely successful. They cannot be. No matter how subtle and intricate a brain might be, there is always some way of setting up a contradiction. That is a fundamental truth of mathematics. It will remain forever impossible to produce a brain so subtle and intricate as to reduce the chance of contradiction to zero. Never quite to zero. However, the systems have been made so close to zero that to bring about a mental freeze-out by setting up a suitable contradiction would require a deep understanding of the particular positronic brain being dealt with–and that would take a clever theoretician.”

  “Such as yourself, Dr. Fastolfe?”

  “Such as myself. In the case of humaniform robots, only myself.”

  “Or no one at all,” said Baley, heavily ironic.

  “Or no one at all. Precisely,” said Fastolfe, ignoring the irony. “The humaniform robots have brains–and, I might add, bodies–constructed in conscious imitation of the human being. The positronic brains are extraordinarily delicate and they take on some of the fragility of the human brain, naturally. Just as a human being may have a stroke, through some chance event within the brain and without the intervention of any external effect, so a humaniform brain might, through chance alone–the occasional aimless drifting of positrons–go into mental freeze.”

  “Can you prove that, Dr. Fastolfe?”

  “I can demonstrate it mathematically, but of those who could follow the mathematics, not all would agree that the reasoning was valid. It involves certain suppositions of my own that do not fit into the accepted modes of thinking in robotics.”

  “And how likely is spontaneous mental freeze-out?”

  “Given a large number of humaniform robots, say a hundred thousand, there is an even chance that one of them might undergo spontaneous mental freeze-out in an average Auroran lifetime. And yet it could happen much sooner, as it did to Jander, although then the odds would be very greatly against it.”

  “But look here, Dr. Fastolfe, even if you were to prove conclusively that a spontaneous mental freeze-out could take place in robots generally, that would not be the same as proving that such a thing happened to Jander in particular at this particular time.”

  “No,” admitted Fastolfe, “you are quite right.”

  “You, the greatest expert in robotics, cannot prove it in the specific case of Jander.”

 
