by Isaac Asimov
“That was the conclusion from the start.”
Baley said, “Those Aurorans who concluded this at the start had all the information to begin with. I am getting it now for the first time.”
“My remark, Partner Elijah, was not meant in any pejorative manner. I know better than to belittle your abilities.”
“Thank you, Daneel. I know there was no intended sneer in your remark.–You said just a while ago that the roboticide was not committed by anyone unintelligent, inexperienced, or young and that this is completely certain. Let us consider your remark–”
Baley knew that he was taking the long route. He had to. Considering his lack of understanding of Auroran ways and of their manner of thought, he could not afford to make assumptions and skip steps. If he were dealing with an intelligent human being in this way, that person would be likely to grow impatient and blurt out information–and consider Baley an idiot into the bargain. Daneel, however, as a robot, would follow Baley down the winding road with total patience.
That was one type of behavior that gave away Daneel as a robot, however humaniform he might be. An Auroran might be able to judge him a robot from a single answer to a single question. Daneel was right as to the subtle distinctions.
Baley said, “One might eliminate children, perhaps also most women, and many male adults by presuming that the method of roboticide involved great strength–that Jander’s head was perhaps crushed by a violent blow or that his chest was smashed inward. This would not, I imagine, be easy for anyone who was not a particularly large and strong human being.” From what Demachek had said on Earth, Baley knew that this was not the manner of the roboticide, but how was he to tell that Demachek herself had not been misled?
Daneel said, “It would not be possible at all for any human being.”
“Why not?”
“Surely, Partner Elijah, you are aware that the robotic skeleton is metallic in nature and much stronger than human bone. Our movements are more strongly powered, faster, and more delicately controlled. The Third Law of Robotics states: ‘A robot must protect its own existence.’ An assault by a human being could easily be fended off. The strongest human being could be immobilized. Nor is it likely that a robot can be caught unaware. We are always aware of human beings. We could not fulfill our functions otherwise.”
Baley said, “Come, now, Daneel. The Third Law states: ‘A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.’ The Second Law states: ‘A robot must obey the orders given it by a human being, except where such orders would conflict with the First Law.’ And the First Law states: ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’ A human being could order a robot to destroy himself–and a robot would then use his own strength to smash his own skull. And if a human being attacked a robot, that robot could not fend off the attack without harming the human being, which would violate First Law.”
Daneel said, “You are, I suppose, thinking of Earth’s robots. On Aurora–or on any of the Spacer worlds–robots are regarded more highly than on Earth and are, in general, more complex, versatile, and valuable. The Third Law is distinctly stronger in comparison to the Second Law on Spacer worlds than it is on Earth. An order for self-destruction would be questioned and there would have to be a truly legitimate reason for it to be carried through–a clear and present danger. And in fending off an attack, the First Law would not be violated, for Auroran robots are deft enough to immobilize a human being without hurting him.”
“Suppose, though, that a human being maintained that, unless a robot destroyed himself, he–the human being–would be destroyed? Would not the robot then destroy himself?”
“An Auroran robot would surely question a mere statement to that effect. There would have to be clear evidence of the possible destruction of a human being.”
“Might not a human being be sufficiently subtle to arrange matters in such a way as to make it seem to a robot that that human being was indeed in great danger? Is it the ingenuity that would be required that makes you eliminate the unintelligent, inexperienced, and young?”
And Daneel said, “No, Partner Elijah, it is not.”
“Is there an error in my reasoning?”
“None.”
“Then the error may lie in my assumption that he was physically damaged. He was not, in actual fact, physically damaged. Is that right?”
“Yes, Partner Elijah.”
(That meant Demachek had had her facts straight, Baley thought.)
“In that case, Daneel, Jander was mentally damaged. Roblock! Total and irreversible!”
“Roblock?”
“Short for robot-block, the permanent shutdown of the functioning of the positronic pathways.”
“We do not use the word ‘roblock’ on Aurora, Partner Elijah.”
“What do you say?”
“We say ‘mental freeze-out.’”
“Either way, it is the same phenomenon being described.”
“It might be wise, Partner Elijah, to use our expression or the Aurorans you speak to may not understand; conversation may be impeded. You stated a short while ago that different words make a difference.”
“Very well. I will say ‘freeze-out.’–Could such a thing happen spontaneously?”
“Yes, but the chances are infinitesimally small, roboticists say. As a humaniform robot, I can report that I have never myself experienced any effect that could even approach mental freeze-out.”
“Then one must assume that a human being deliberately set up a situation in which mental freeze-out would take place.”
“That is precisely what Dr. Fastolfe’s opposition contends, Partner Elijah.”
“And since this would take robotic training, experience, and skill, the unintelligent, the inexperienced, and the young cannot have been responsible.”
“That is the natural reasoning, Partner Elijah.”
“It might even be possible to list the number of human beings on Aurora with sufficient skill and thus set up a group of suspects that might not be very large in number.”
“That has, in actual fact, been done, Partner Elijah.”
“And how long is the list?”
“The longest list suggested contains only one name.”
It was Baley’s turn to pause. His brows drew together in an angry frown and he said, quite explosively, “Only one name?”
Daneel said quietly, “Only one name, Partner Elijah. That is the judgment of Dr. Han Fastolfe, who is Aurora’s greatest theoretical roboticist.”
“But what is, then, the mystery in all this? Whose is the one name?”
R. Daneel said, “Why, that of Dr. Han Fastolfe, of course. I have just stated that he is Aurora’s greatest theoretical roboticist and, in Dr. Fastolfe’s professional opinion, he himself is the only one who could possibly have maneuvered Jander Panell into total mental freeze-out without leaving any sign of the process. However, Dr. Fastolfe also states that he did not do it.”
“But that no one else could have, either?”
“Indeed, Partner Elijah. There lies the mystery.”
“And what if Dr. Fastolfe–” Baley paused. There would be no point in asking Daneel if Dr. Fastolfe was lying or was somehow mistaken, either in his own judgment that no one but he could have done it or in the statement that he himself had not done it. Daneel had been programmed by Fastolfe and there would be no chance that the programming included the ability to doubt the programmer.
Baley said, therefore, with as close an approach to mildness as he could manage, “I will think about this, Daneel, and we will talk again.”
“That is well, Partner Elijah. It is, in any case, time for sleep. Since it is possible that, on Aurora, the pressure of events may force an irregular schedule upon you, it would be wise to seize the opportunity for sleep now. I will show you how one produces a bed and how one manages the bedclothes.”
“Thank you, Daneel,” muttered Baley. He was under no illusion that sleep would come easily. He was being sent to Aurora for the specific purpose of demonstrating that Fastolfe was innocent of roboticide–and success in that was required for Earth’s continued security and (much less important but equally dear to Baley’s heart) for the continued prospering of Baley’s own career–yet, even before reaching Aurora, he had discovered that Fastolfe had virtually confessed to the crime.
8.
BALEY DID SLEEP–eventually, after Daneel demonstrated how to reduce the field intensity that served as a form of pseudo-gravity. This was not true antigravity and it consumed so much energy that the process could only be used at restricted times and under unusual conditions.
Daneel was not programmed to be able to explain the manner in which this worked and, if he had, Baley was quite certain he would not have understood it. Fortunately, the controls could be operated without any understanding of the scientific justification.
Daneel said, “The field intensity cannot be reduced to zero–at least, not by these controls. Sleeping under zero-gravity is not, in any case, comfortable, certainly not for those inexperienced in space travel. What one needs is an intensity low enough to give one a feeling of freedom from the pressure of one’s own weight, but high enough to maintain an up-down orientation. The level varies with the individual. Most people would feel most comfortable at the minimum intensity allowed by the control, but you might find that, on first use, you would wish a higher intensity, so that you might retain the familiarity of the weight sensation to a somewhat greater extent. Simply experiment with different levels and find the one that suits.”
Lost in the novelty of the sensation, Baley found his mind drifting away from the problem of Fastolfe’s affirmation/denial, even as his body drifted away from wakefulness. Perhaps the two were one process.
He dreamed he was back on Earth (of course), moving along an Expressway but not in one of the seats. Rather, he was floating along beside the high-speed strip, just over the heads of the moving people, gaining on them slightly. None of the ground-bound people seemed surprised; none looked up at him. It was a rather pleasant sensation and he missed it upon waking.
After breakfast the following morning–Was it morning actually? Could it be morning–or any other time of day–in space?
Clearly, it couldn’t. He thought awhile and decided he would define morning as the time after waking, and he would define breakfast as the meal eaten after waking, and abandon specific timekeeping as objectively unimportant.–For him, at least, if not for the ship.
After breakfast, then, the following morning, he studied the news sheets offered him only long enough to see that they said nothing about the roboticide on Aurora and then turned to those book-films that had been brought to him the previous day (“wake period”?) by Giskard.
He chose those whose titles sounded historical and, after viewing through several hastily, he decided that Giskard had brought him books for adolescents. They were heavily illustrated and simply written. He wondered if that was Giskard’s estimate of Baley’s intelligence–or, perhaps, of his needs. After some thought, Baley decided that Giskard, in his robotic innocence, had chosen well and that there was no point in brooding over a possible insult.
He settled down to viewing with greater concentration and noted at once that Daneel was viewing the book-film with him. Actual curiosity? Or just to keep his eyes occupied?
Daneel did not once ask to have a page repeated. Nor did he stop to ask a question. Presumably, he merely accepted what he read with robotic trust and did not permit himself the luxury of either doubt or curiosity.
Baley did not ask Daneel any questions concerning what he read, though he did ask for instructions on the operation of the print-out mechanism of the Auroran viewer, with which he was not familiar.
Occasionally, Baley stopped to make use of the small room that adjoined his room and could be used for the various private physiological functions, so private that the room was referred to as “the Personal,” with the capital letter always understood, both on Earth and–as Baley discovered when Daneel referred to it–on Aurora. It was just large enough for one person–which made it bewildering to a City-dweller accustomed to huge banks of urinals, excretory seats, washbasins, and showers.
In viewing the book-films, Baley did not attempt to memorize details. He had no intention of becoming an expert on Auroran society, nor even of passing a high school test on the subject. Rather, he wished to get the feel of it.
He noticed, for instance, even through the hagiographic attitude of historians writing for young people, that the Auroran pioneers–the founding fathers, the Earthpeople who had first come to Aurora to settle in the early days of interstellar travel–had been very much Earthpeople. Their politics, their quarrels, every facet of their behavior had been Earthish; what happened on Aurora was, in ways, similar to the events that took place when the relatively empty sections of Earth had been settled a couple of thousand years before.
Of course, the Aurorans had no intelligent life to encounter and to fight, no thinking organisms to puzzle the invaders from Earth with questions of treatment, humane or cruel. There was precious little life of any kind, in fact. So the planet was quickly settled by human beings, by their domesticated plants and animals, and by the parasites and other organisms that were adventitiously brought along. And, of course, the settlers brought robots with them.
The first Aurorans quickly felt the planet to be theirs, since it fell into their laps with no sense of competition, and they had called the planet New Earth to begin with. That was natural, since it was the first extrasolar planet–the first Spacer world–to be settled. It was the first fruit of interstellar travel, the first dawn of an immense new era. They quickly cut the umbilical cord, however, and renamed the planet Aurora after the Roman goddess of the dawn.
It was the World of the Dawn. And so did the settlers from the start self-consciously declare themselves the progenitors of a new kind. All previous history of humanity was a dark Night and only for the Aurorans on this new world was the Day finally approaching.
It was this great fact, this great self-praise, that made itself felt over all the details: all the names, dates, winners, losers. It was the essential.
Other worlds were settled, some from Earth, some from Aurora, but Baley paid no attention to that or to any of the details. He was after the broad brushstrokes and he noted the two massive changes that took place and pushed the Aurorans ever farther away from their Earthly origins. These were first, the increasing integration of robots into every facet of life and second, the extension of the life-span.
As the robots grew more advanced and versatile, the Aurorans grew more dependent on them. But never helplessly so. Not like the world of Solaria, Baley remembered, on which a very few human beings were in the collective womb of very many robots. Aurora was not like that.
And yet they grew more dependent.
Viewing as he did for intuitive feel–for trend and generality–every step in the course of human/robot interaction seemed to depend on dependence. Even the manner in which a consensus of robotic rights was reached–the gradual dropping of what Daneel would call “unnecessary distinctions”–was a sign of the dependence. To Baley, it seemed not that the Aurorans were growing more humane in their attitude out of a liking for the humane, but that they were denying the robotic nature of the objects in order to remove the discomfort of having to recognize the fact that human beings were dependent upon objects of artificial intelligence.
As for the extended life-span, that was accompanied by a slowing of the pace of history. The peaks and troughs smoothed out. There was a growing continuity and a growing consensus.
There was no question but that the history he was viewing grew less interesting as it went along; it became almost soporific. For those living through it, this had to be good. History was interesting to the extent that it was catastrophic and, while that might make absorbing viewing, it made horrible living. Undoubtedly, personal lives continued to be interesting for the vast majority of Aurorans and, if the collective interaction of lives grew quiet, who would mind?
If the World of the Dawn had a quiet sunlit Day, who on that world would clamor for storm?
–Somewhere in the course of his viewing, Baley felt an indescribable sensation. If he had been forced to attempt a description, he would have said it was that of a momentary inversion. It was as though he had been turned inside out–and then back as he had been–in the course of a small fraction of a second.
So momentary had it been that he almost missed it, ignoring it as though it had been a tiny hiccup inside himself.
It was only perhaps a minute later, suddenly going over the feeling in retrospect, that he remembered the sensation as something he had experienced twice before: once when traveling to Solaria and once when returning to Earth from that planet.
It was the “Jump,” the passage through hyperspace that, in a timeless, spaceless interval, sent the ship across the parsecs and defeated the speed-of-light limit of the Universe. (No mystery in words, since the ship merely left the Universe and traversed something which involved no speed limit. Total mystery in concept, however, for there was no way of describing what hyperspace was, unless one made use of mathematical symbols which could, in any case, not be translated into anything comprehensible.)
If one accepted the fact that human beings had learned to manipulate hyperspace without understanding the thing they manipulated, then the effect was clear. At one moment, the ship had been within microparsecs of Earth and, at the next moment, it was within microparsecs of Aurora.
Ideally, the Jump took zero-time–literally zero–and, if it were carried through with perfect smoothness, there would not, could not be any biological sensation at all. Physicists maintained, however, that perfect smoothness required infinite energy so that there was always an “effective time” that was not quite zero, though it could be made as short as desired. It was that which produced that odd and essentially harmless feeling of inversion.