Baley wondered if he ought to question Giskard, in order to confirm the conclusions he reached from his conversation with Daneel—and, without much hesitation, decided not to. Giskard's simple and rather unsubtle mind would be of no use. He would “Yes, sir” and “No, sir” to the end. It would be like questioning a recording.
Well, then, Baley decided, he would continue with Daneel, who was at least capable of responding with something approaching subtlety.
He said, “Daneel, let us consider the case of Jander Panell, which I assume, from what you have said so far, is the first case of roboticide in the history of Aurora. The human being responsible—the killer—is, I take it, not known.”
“If,” said Daneel, “one assumes that a human being was responsible, then his identity is not known. In that, you are right, Partner Elijah.”
“What about the motive? Why was Jander Panell killed?”
“That, too, is not known.”
“But Jander Panell was a humaniform robot, one like yourself and not one like, for instance, R. Gis—I mean, Giskard.”
“That is so. Jander was a humaniform robot like myself.”
“Might it not be, then, that no case of roboticide was intended?”
“I do not understand, Partner Elijah.”
Baley said, a little impatiently, “Might not the killer have thought this Jander was a human being, that the intention was homicide, not roboticide?”
Slowly, Daneel shook his head. “Humaniform robots are quite like human beings in appearance, Partner Elijah, down to the hairs and pores in our skin. Our voices are thoroughly natural, we can go through the motions of eating, and so on. And yet, in our behavior there are noticeable differences. There may be fewer such differences with time and with refinement of technique, but as yet they are many. You—and other Earthmen not used to humaniform robots—may not easily note these differences, but Aurorans would. No Auroran would mistake Jander—or me—for a human being, not for a moment.”
“Might some Spacer, other than an Auroran, make the mistake?”
Daneel hesitated. “I do not think so. I do not speak from personal observation or from direct programmed knowledge, but I do have the programming to know that all Spacer worlds are as intimately acquainted with robots as Aurora is—some, like Solaria, even more so—and I deduce, therefore, that no Spacer would miss the distinction between human and robot.”
“Are there humaniform robots on the other Spacer worlds?”
“No, Partner Elijah, they exist only on Aurora so far.”
“Then other Spacers would not be intimately acquainted with humaniform robots and might well miss the distinctions and mistake them for human beings.”
“I do not think that is likely. Even humaniform robots will behave in robotic fashion in certain definite ways that any Spacer would recognize.”
“And yet surely there are Spacers who are not as intelligent as most, not as experienced, not as mature. There are Spacer children, if nothing else, who would miss the distinction.”
“It is quite certain, Partner Elijah, that the—roboticide—was not committed by anyone unintelligent, inexperienced, or young. Completely certain.”
“We're making eliminations. Good. If no Spacer would miss the distinction, what about an Earthman? Is it possible that—”
“Partner Elijah, when you arrive in Aurora, you will be the first Earthman to set foot on the planet since the period of original settlement was over. All Aurorans now alive were born on Aurora or, in a relatively few cases, on other Spacer worlds.”
“The first Earthman,” muttered Baley. “I am honored. Might not an Earthman be present on Aurora without the knowledge of Aurorans?”
“No!” said Daneel with simple certainty.
“Your knowledge, Daneel, might not be absolute.”
“No!” came the repetition, in tones precisely similar to the first.
“We conclude, then,” said Baley with a shrug, “that the roboticide was intended to be roboticide and nothing else.”
“That was the conclusion from the start.”
Baley said, “Those Aurorans who concluded this at the start had all the information to begin with. I am getting it now for the first time.”
“My remark, Partner Elijah, was not meant in any pejorative manner. I know better than to belittle your abilities.”
“Thank you, Daneel. I know there was no intended sneer in your remark. —You said just a while ago that the roboticide was not committed by anyone unintelligent, inexperienced, or young and that this is completely certain. Let us consider your remark—”
Baley knew that he was taking the long route. He had to. Considering his lack of understanding of Auroran ways and of their manner of thought, he could not afford to make assumptions and skip steps. If he were dealing with an intelligent human being in this way, that person would be likely to grow impatient and blurt out information—and consider Baley an idiot into the bargain. Daneel, however, as a robot, would follow Baley down the winding road with total patience.
That was one type of behavior that gave away Daneel as a robot, however humaniform he might be. An Auroran might be able to judge him a robot from a single answer to a single question. Daneel was right as to the subtle distinctions.
Baley said, “One might eliminate children, perhaps also most women, and many male adults by presuming that the method of roboticide involved great strength— that Jander's head was perhaps crushed by a violent blow or that his chest was smashed inward. This would not, I imagine, be easy for anyone who was not a particularly large and strong human being.” From what Demachek had said on Earth, Baley knew that this was not the manner of the roboticide, but how was he to tell that Demachek herself had not been misled?
Daneel said, “It would not be possible at all for any human being.”
“Why not?”
“Surely, Partner Elijah, you are aware that the robotic skeleton is metallic in nature and much stronger than human bone. Our movements are more strongly powered, faster, and more delicately controlled. The Third Law of Robotics states: ‘A robot must protect its own existence.’ An assault by a human being could easily be fended off. The strongest human being could be immobilized. Nor is it likely that a robot can be caught unaware. We are always aware of human beings. We could not fulfill our functions otherwise.”
Baley said, “Come now, Daneel. The Third Law states: ‘A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.’ The Second Law states: ‘A robot must obey the orders given it by a human being, except where such orders would conflict with the First Law.’ And the First Law states: ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’ A human being could order a robot to destroy himself—and a robot would then use his own strength to smash his own skull. And if a human being attacked a robot, that robot could not fend off the attack without harming the human being, which would violate First Law.”
Daneel said, “You are, I suppose, thinking of Earth's robots. On Aurora—or on any of the Spacer worlds— robots are regarded more highly than on Earth, and are, in general, more complex, versatile, and valuable. The Third Law is distinctly stronger in comparison to the Second Law on Spacer worlds than it is on Earth. An order for self-destruction would be questioned and there would have to be a truly legitimate reason for it to be carried through—a clear and present danger. And in fending off an attack, the First Law would not be violated, for Auroran robots are deft enough to immobilize a human being without hurting him.”
“Suppose, though, that a human being maintained that, unless a robot destroyed himself, he—the human being—would be destroyed? Would not the robot then destroy himself?”
“An Auroran robot would surely question a mere statement to that effect. There would have to be clear evidence of the possible destruction of a human being.”
“Might not a human being be sufficiently subtle to so arrange matters as to make it seem to a robot that the human being was indeed in great danger? Is it the ingenuity that would be required that makes you eliminate the unintelligent, inexperienced, and young?”
And Daneel said, “No, Partner Elijah, it is not.”
“Is there an error in my reasoning?”
“None.”
“Then the error may lie in my assumption that he was physically damaged. He was not, in actual fact, physically damaged. Is that right?”
“Yes, Partner Elijah.”
(That meant Demachek had had her facts straight, Baley thought.)
“In that case, Daneel, Jander was mentally damaged. Roblock! Total and irreversible!”
“Roblock?”
“Short for robot-block, the permanent shutdown of the functioning of the positronic pathways.”
“We do not use the word ‘roblock’ on Aurora, Partner Elijah.”
“What do you say?”
“We say ‘mental freeze-out.’ ”
“Either way, it is the same phenomenon being described.”
“It might be wise, Partner Elijah, to use our expression or the Aurorans you speak to may not understand; conversation may be impeded. You stated a short while ago that different words make a difference.”
“Very well. I will say ‘freeze-out.’ —Could such a thing happen spontaneously?”
“Yes, but the chances are infinitesimally small, roboticists say. As a humaniform robot, I can report that I have never myself experienced any effect that could even approach mental freeze-out.”
“Then one must assume that a human being deliberately set up a situation in which mental freeze-out would take place.”
“That is precisely what Dr. Fastolfe's opposition contends, Partner Elijah.”
“And since this would take robotic training, experience, and skill, the unintelligent, the inexperienced, and the young cannot have been responsible.”
“That is the natural reasoning, Partner Elijah.”
“It might even be possible to list the number of human beings on Aurora with sufficient skill and thus set up a group of suspects that might not be very large in number.”
“That has, in actual fact, been done, Partner Elijah.”
“And how long is the list?”
“The longest list suggested contains only one name.”
It was Baley's turn to pause. His brows drew together in an angry frown and he said, quite explosively, “Only one name?”
Daneel said quietly, “Only one name, Partner Elijah. That is the judgment of Dr. Han Fastolfe, who is Aurora's greatest theoretical roboticist.”
“But what is, then, the mystery in all this? Whose is the one name?”
R. Daneel said, “Why, that of Dr. Han Fastolfe, of course. I have just stated that he is Aurora's greatest theoretical roboticist and, in Dr. Fastolfe's professional opinion, he himself is the only one who could possibly have maneuvered Jander Panell into total mental freeze-out without leaving any sign of the process. However, Dr. Fastolfe also states that he did not do it.”
“But that no one else could have, either?”
“Indeed, Partner Elijah. There lies the mystery.”
“And what if Dr. Fastolfe—” Baley paused. There would be no point in asking Daneel if Dr. Fastolfe was lying or was somehow mistaken, either in his own judgment that no one but he could have done it or in the statement that he himself had not done it. Daneel had been programmed by Fastolfe and there would be no chance that the programming included the ability to doubt the programmer.
Baley said, therefore, with as close an approach to mildness as he could manage, “I will think about this, Daneel, and we will talk again.”
“That is well, Partner Elijah. It is, in any case, time for sleep. Since it is possible that, on Aurora, the pressure of events may force an irregular schedule upon you, it would be wise to seize the opportunity for sleep now. I will show you how one produces a bed and how one manages the bedclothes.”
“Thank you, Daneel,” muttered Baley. He was under no illusion that sleep would come easily. He was being sent to Aurora for the specific purpose of demonstrating that Fastolfe was innocent of roboticide—and success in that was required for Earth's continued security and (much less important but equally dear to Baley's heart) for the continued prospering of Baley's own career—yet, even before reaching Aurora, he had discovered that Fastolfe had virtually confessed to the crime.
8
Baley did sleep—eventually, after Daneel demonstrated how to reduce the field intensity that served as a form of pseudo-gravity. This was not true antigravity and it consumed so much energy that the process could only be used at restricted times and under unusual conditions.
Daneel was not programmed to be able to explain the manner in which this worked and, if he had, Baley was quite certain he would not have understood it. Fortunately, the controls could be operated without any understanding of the scientific justification.
Daneel said, “The field intensity cannot be reduced to zero—at least, not by these controls. Sleeping under zero-gravity is not, in any case, comfortable, certainly not for those inexperienced in space travel. What one needs is an intensity low enough to give one a feeling of freedom from the pressure of one's own weight, but high enough to maintain an up-down orientation. The level varies with the individual. Most people would feel most comfortable at the minimum intensity allowed by the control, but you might find that, on first use, you would wish a higher intensity, so that you might retain the familiarity of the weight sensation to a somewhat greater extent. Simply experiment with different levels and find the one that suits you.”
Lost in the novelty of the sensation, Baley found his mind drifting away from the problem of Fastolfe's affirmation/denial, even as his body drifted away from wakefulness. Perhaps the two were one process.
He dreamed he was back on Earth (of course), moving along an Expressway but not in one of the seats. Rather, he was floating along beside the high-speed strip, just over the head of the moving people, gaining on them slightly. None of the ground-bound people seemed surprised; none looked up at him. It was a rather pleasant sensation and he missed it upon waking.
After breakfast the following morning—
Was it morning actually? Could it be morning—or any other time of day—in space?
Clearly, it couldn't. He thought awhile and decided he would define morning as the time after waking, and he would define breakfast as the meal eaten after waking, and abandon specific timekeeping as objectively unimportant. —For him, at least, if not for the ship.
After breakfast, then, the following morning, he studied the news sheets offered him only long enough to see that they said nothing about the roboticide on Aurora and then turned to those book-films that had been brought to him the previous day (“wake period”?) by Giskard.
He chose those whose titles sounded historical and, after viewing through several hastily, he decided that Giskard had brought him books for adolescents. They were heavily illustrated and simply written. He wondered if that was Giskard's estimate of Baley's intelligence—or, perhaps, of his needs. After some thought, Baley decided that Giskard, in his robotic innocence, had chosen well and that there was no point in brooding over a possible insult.
He settled down to viewing with greater concentration and noted at once that Daneel was viewing the book-film with him. Actual curiosity? Or just to keep his eyes occupied?
Daneel did not once ask to have a page repeated. Nor did he stop to ask a question. Presumably, he merely accepted what he read with robotic trust and did not permit himself the luxury of either doubt or curiosity.
Baley did not ask Daneel any questions concerning what he read, though he did ask for instructions on the operation of the print-out mechanism of the Auroran viewer, with which he was not familiar.
Occasionally, Baley stopped to make use of the small room that adjoined his room and could be used for the various private physiological functions, so private that the room was referred to as “the Personal,” with the capital letter always understood, both on Earth and—as Baley discovered when Daneel referred to it—on Aurora. It was just large enough for one person—which made it bewildering to a City-dweller accustomed to huge banks of urinals, excretory seats, washbasins, and showers.
In viewing the book-films, Baley did not attempt to memorize details. He had no intention of becoming an expert on Auroran society, nor even of passing a high school test on the subject. Rather, he wished to get the feel of it.
He noticed, for instance, even through the hagiographic attitude of historians writing for young people, that the Auroran pioneers—the founding fathers, the Earthpeople who had first come to Aurora to settle in the early days of interstellar travel—had been very much Earthpeople. Their politics, their quarrels, every facet of their behavior had been Earthish; what happened on Aurora was, in some ways, similar to the events that took place when the relatively empty sections of Earth had been settled a couple of thousand years before.
Of course, the Aurorans had no intelligent life to encounter and to fight, no thinking organisms to puzzle the invaders from Earth with questions of treatment, humane or cruel. There was precious little life of any kind, in fact. So the planet was quickly settled by human beings, by their domesticated plants and animals, and by the parasites and other organisms that were adventitiously brought along. And, of course, the settlers brought robots with them.
The first Aurorans quickly felt the planet to be theirs, since it fell into their laps with no sense of competition, and they had called the planet New Earth to begin with. That was natural, since it was the first extrasolar planet—the first Spacer world—to be settled. It was the first fruit of interstellar travel, the first dawn of an immense new era. They quickly cut the umbilical cord, however, and renamed the planet Aurora after the Roman goddess of the dawn.
It was the World of the Dawn. And so, from the start, the settlers self-consciously declared themselves the progenitors of a new kind. All previous history of humanity was a dark Night and only for the Aurorans on this new world was the Day finally approaching.