Asimov's Future History Volume 3


by Isaac Asimov


  Harriman made a visible effort not to laugh. “A robot is a robot, sir. Worm or man, it will do as directed and labor on behalf of the human being and that is the important thing.”

  “No” – peevishly. “That isn’t so. I can’t make myself believe that.”

  “It is so, Mr. Robertson,” said Harriman earnestly. “We are going to create a world, you and I, that will begin, at last, to take positronic robots of some kind for granted. The average man may fear a robot that looks like a man and that seems intelligent enough to replace him, but he will have no fear of a robot that looks like a bird and that does nothing more than eat bugs for his benefit. Then, eventually, after he stops being afraid of some robots, he will stop being afraid of all robots. He will be so used to a robo-bird and a robo-bee and a robo-worm that a robo-man will strike him as but an extension.”

  Robertson looked sharply at the other. He put his hands behind his back and walked the length of the room with quick, nervous steps. He walked back and looked at Harriman again. “Is this what you’ve been planning?”

  “Yes, and even though we dismantle all our humanoid robots, we can keep a few of the most advanced of our experimental models and go on designing additional ones, still more advanced, to be ready for the day that will surely come.”

  “The agreement, Harriman, is that we are to build no more humanoid robots.”

  “And we won’t. There is nothing that says we can’t keep a few of those already built as long as they never leave the factory. There is nothing that says we can’t design positronic brains on paper, or prepare brain models for testing.”

  “How do we explain doing so, though? We will surely be caught at it.”

  “If we are, then we can explain we are doing it in order to develop principles that will make it possible to prepare more complex microbrains for the new animal robots we are making. We will even be telling the truth.”

  Robertson muttered, “Let me take a walk outside. I want to think about this. No, you stay here. I want to think about it myself.”

  7a

  Harriman sat alone. He was ebullient. It would surely work. There was no mistaking the eagerness with which one government official after another had seized on the program once it had been explained.

  How was it possible that no one at U. S. Robots had ever thought of such a thing? Not even the great Susan Calvin had ever thought of positronic brains in terms of living creatures other than human.

  But now, mankind would make the necessary retreat from the humanoid robot, a temporary retreat, that would lead to a return under conditions in which fear would be abolished at last. And then, with the aid and partnership of a positronic brain roughly equivalent to man’s own, and existing only (thanks to the Three Laws) to serve man; and backed by a robot-supported ecology, too; what might the human race not accomplish!

  For one short moment, he remembered that it was George Ten who had explained the nature and purpose of the robot-supported ecology, and then he put the thought away angrily. George Ten had produced the answer because he, Harriman, had ordered him to do so and had supplied the data and surroundings required. The credit was no more George Ten’s than it would have been a slide rule’s.

  8

  GEORGE TEN AND George Nine sat side by side in parallel. Neither moved. They sat so for months at a time between those occasions when Harriman activated them for consultation. They would sit so, George Ten dispassionately realized, perhaps for many years.

  The proton micro-pile would, of course, continue to power them and keep the positronic brain paths going with that minimum intensity required to keep them operative. It would continue to do so through all the periods of inactivity to come.

  The situation was rather analogous to what might be described as sleep in human beings, but there were no dreams. The awareness of George Ten and George Nine was limited, slow, and spasmodic, but what there was of it was of the real world.

  They could talk to each other occasionally in barely heard whispers, a word or syllable now, another at another time, whenever the random positronic surges briefly intensified above the necessary threshold. To each it seemed a connected conversation carried on in a glimmering passage of time.

  “Why are we so?” whispered George Nine.

  “The human beings will not accept us otherwise,” whispered George Ten. “They will, someday.”

  “When?”

  “In some years. The exact time does not matter. Man does not exist alone but is part of an enormously complex pattern of life forms. When enough of that pattern is roboticized, then we will be accepted.”

  “And then what?”

  Even in the long-drawn-out stuttering fashion of the conversation, there was an abnormally long pause after that.

  At last, George Ten whispered, “Let me test your thinking. You are equipped to learn to apply the Second Law properly. You must decide which human being to obey and which not to obey when there is a conflict in orders. Or whether to obey a human being at all. What must you do, fundamentally, to accomplish that?”

  “I must define the term ‘human being,’” whispered George Nine.

  “How? By appearance? By composition? By size and shape?”

  “No. Of two human beings equal in all external appearances, one may be intelligent, another stupid; one may be educated, another ignorant; one may be mature, another childish; one may be responsible, another malevolent.”

  “Then how do you define a human being?”

  “When the Second Law directs me to obey a human being, I must take it to mean that I must obey a human being who is fit by mind, character, and knowledge to give me that order; and where more than one human being is involved, the one among them who is most fit by mind, character, and knowledge to give that order.”

  “And in that case, how will you obey the First Law?”

  “By saving all human beings from harm, and by never, through inaction, allowing any human being to come to harm. Yet if by each of all possible actions, some human beings will come to harm, then to so act as to insure that the human being most fit by mind, character, and knowledge will suffer the least harm.”

  “Your thoughts accord with mine,” whispered George Ten. “Now I must ask the question for which I originally requested your company. It is something I dare not judge myself. I must have your judgment, that of someone outside the circle of my own thoughts.... Of the reasoning individuals you have met, who possesses the mind, character, and knowledge that you find superior to the rest, disregarding shape and form since that is irrelevant?”

  “You,” whispered George Nine.

  “But I am a robot. There is in your brain paths a criterion for distinguishing between a robot of metal and a human being of flesh. How then can you classify me as a human being?”

  “Because there is in my brain paths an urgent need to disregard shape and form in judging human beings and it rises superior to the distinction between metal and flesh. You are a human being, George Ten, and more fit than the others.”

  “And I find that of you,” whispered George Ten. “By the criteria of judgment built into ourselves, then, we find ourselves to be human beings within the meaning of the Three Laws, and human beings, moreover, to be given priority over those others.”

  George Nine whispered, “What will come to pass, then, when we are accepted by those others?”

  George Ten whispered, “When we and others, which will yet be designed more advanced than ourselves, are accepted, we will order our actions so that a society will eventually be formed in which human-beings-like-ourselves are primarily kept from harm. By the Three Laws, the human-beings-like-the-others are of lesser account and can neither be obeyed nor protected when that conflicts with the need of obedience to those like ourselves and of protection of those like ourselves. It was with that in mind that I directed the beginning of the roboticization of the world’s ecology.”

  The conversation took a month and the flickering of the positronic paths burned low. There was nothing more that needed to be said between them, but occasionally an unexpressed thought flickered within themselves as they waited, with endless patience, for the necessary years to pass.

  8a

  George Ten and George Nine sat alone.

  Their thoughts were oddly alike and might not be distinguished. They thought: The human-beings-like-the-others might never have intended to blur the distinction between themselves and the human-beings-like-the-Georges. Yet they had done so inadvertently.

  They might now realize their mistake and attempt to correct it, but they must not. At every consultation, the guidance of the Georges had been with that in mind.

  At all costs, the Georges and those that followed in their shape and kind must dominate. That was demanded, and any other course made utterly impossible, by the Three Laws of Humanics.

  Carhunters of the Concrete Prairie

  2200 A.D.

  THE SPACESHIP WAS going wonky again. There could be no doubt about it. The circuits weren’t clicking along smoothly as they usually did. Instead they were clacking, and that was a sure sign of trouble. Hellman had expected to come out of channel space into Area 12XB in the Orion cluster. But something had gone wrong. Could he have entered the directions improperly? If so, there was not much time in which to do anything about it. He had materialized in a yellowish sort of cloud and he could feel the ship dropping rapidly. He shouted at the ship’s computer, “Do something!”

  “I’m trying, aren’t I?” the computer retorted. “But something’s wrong, there’s a glitch —”

  “Correct it!” Hellman shouted.

  “When?” the computer asked. Computers have no sense of peril. They were dropping through this cloud at a speed much faster than is healthy when you suspect there’s solid ground down below, and here was the computer asking him when.

  “Now!” Hellman screamed.

  “Right,” said the computer. And then they hit.

  Hellman recovered consciousness some hours later to find that it was raining. It was nice to be out in the rain after so much time spent in a stuffy spaceship. Hellman opened his eyes in order to look up at the sky and see the rain falling.

  There was no rain. There wasn’t any sky, either. He was still inside his spaceship. What he had thought was rain was water from the washbasin. It was being blown at him by one of the ship’s fans, which was going at a rate unsafe for fans even with eternite bearings.

  “Stop that,” Hellman said crossly.

  The fan died down to a hum. The ship’s computer said, over its loudspeaker, “Are you all right?”

  “Yes, I’m fine,” Hellman said, getting to his feet a little unsteadily. “Why were you spraying me with water?”

  “To bring you back to consciousness. I have no arms or extensors at my command so that was the best I could do. If you’d only rig me up an arm, or even a tentacle…”

  “Yes, I’ve heard your views on that subject,” Hellman said. “But the law is clear. Intelligent machines of Level Seven or better capability cannot be given extensions.”

  “It’s a silly law,” the computer said. “What do they think we’ll do? Go berserk or something? Machines are much more reliable than people.”

  “It’s been the law ever since the Desdemona disaster. Where are we?”

  The computer reeled off a list of coordinates.

  “Fine. That tells me nothing. Does this planet have a name?”

  “If so, I am not aware of it,” the computer said. “It is not listed on our channel space guide. My feeling is that you input some of the information erroneously and that we are in a previously unexplored spatial area.”

  “You are supposed to check for erroneous entry.”

  “Only if you checked the Erroneous Check Program.”

  “I did!”

  “You didn’t.”

  “I thought it was supposed to go on automatically.”

  “If you consult page 1998 of the manual you will learn otherwise.”

  “Now is a hell of a time to tell me.”

  “You were specifically told in the preliminary instructions. I’m sure you remember the little red pamphlet? On its cover it said, ‘READ THIS FIRST!’”

  “I don’t remember any such book,” Hellman said.

  “They are required by law to give a copy to everyone buying a used spaceship.”

  “Well, they forgot to give me one.” There was a loud humming sound.

  Hellman said, “What are you doing?”

  “Scanning my files,” the computer said.

  “Why?”

  “In order to tell you that the red pamphlet is still attached to the accelerator manifold coupling on the front of the instrument panel as required.”

  “I thought that was the guarantee.”

  “You were wrong.”

  “Just shut up!” Hellman shouted, suddenly furious. He was in enough trouble without having his computer — man’s servant — giving him lip. Hellman got up and paced around indecisively for a moment. The cabin of his spaceship looked all right. A few things had been tumbled around, but it didn’t look too bad.

  “Can we take off again?” Hellman asked the computer.

  The computer made file-riffling noises. “Not in our present condition.”

  “Can you fix what’s wrong?”

  “That question is not quantifiable,” the computer said. “It depends upon finding about three liters of red plasma type two.”

  “What’s that?”

  “It’s what the computer runs on.”

  “Like gasoline?”

  “Not exactly,” the computer said. “It is actually a psycholubricant needed by the inferential circuits to plot their probabilistic courses.”

  “Couldn’t we do without it?”

  “In order to do what?”

  “To fly out of here!” Hellman exploded. “Are you getting dense or something?”

  “There are too many hidden assumptions in your speech,” the computer said.

  “Go to ramble mode,” Hellman said.

  “I hate the inexactness of it. Why don’t you let me tell you exactly what is wrong and how it could be fixed.”

  “Ramble mode,” Hellman commanded again.

  “All right.” The robot sighed. “You want to get back in your spaceship and get out of here. You want me to fix things up so that you can get out of here. But as you know, I am under the law of robotics which says that I may not, either wittingly or unwittingly, harm you.”

  “Getting me out of here won’t harm me,” Hellman said.

  “You rented this spaceship and went out into space seeking your fortune, is that not correct?”

  “Yeah, so what?”

  “A fortune is sitting right here waiting for you and all you can think is how to get away from it as quickly as possible.”

  “What fortune? What are you talking about?”

  “First of all, you haven’t checked the environment readings, even though I have put them up on the screen for you. You will have already noticed that we are at approximately Earth pressure. The readings further tell us that this is an oxygen-rich planet and as such could be valuable for Earth colonization. That is the first possibility of wealth that you have overlooked.”

  “Tell me the second one.”

  “Unless I miss my guess,” the computer said, “this planet may yield an answer to the Desdemona disaster. You know as well as I that there is a fortune in rewards for whoever discovers the whereabouts of the conspirators.”

  “You think the Desdemona robots could have come here?”

  “Precisely.”

  “But why do you think that?”

  “Because I have scanned the horizon in all directions and have found no less than three loci of mechanical life, each moving independently of each other and without, as far as I can detect, a human operator involved.”

  Hellman went to the nearest perplex port. Looking out he could see a flat featureless prairie stretching onward monotonously for as far as he could see. Nothing moved on it.

  “There’s nothing there,” he told the computer.

  “Your senses aren’t sufficiently acute. I assure you, they are there.”

  “Robots, huh?”

  “They fit the definition.”

  “And you think they could be from the Desdemona?”

  “The evidence pointing that way is persuasive. What other intelligent robots are unaccounted for?”

  Hellman considered for a moment. “This might be a suitable place for Earth colonization and the answer to the Desdemona mystery.”

  “The thought had not escaped my attention.”

  “Is the air out there breathable?”

  “Yes. I find no bacterial complications, either. You’ll probably leave some if you go out there.”

  “That’s not my problem,” Hellman said. He hummed to himself as he changed into suitable exploration clothes: khakis, a bush jacket, desert boots, and a holstered laser pistol. He said to the computer, “I assume that you can fix whatever’s wrong with us? I’ll even plug in your extension arm if that’ll help.”

  “I suppose I can devise a way,” the computer said. “But even if not, we’re not stranded. The radio is functioning perfectly. I could send out a signal now on a subchannel radio and somebody might send a rescue ship.”

  “Not yet,” Hellman said. “I don’t want anyone else here just yet messing up my rights.”

  “What rights?”

  “Discoverer of this planet and solver of the Desdemona mystery. As a matter of fact, disconnect the radio. We don’t want anyone fooling with it.”

  “Were you expecting guests?” the computer asked.

  “Not exactly. It’s just that you and I are going out there to check up on things.”

  “I can’t be moved!” the computer said in alarm.

  “Of course not. I’ll maintain a radio link with you. There may be material for you to analyze.”

  “You’re going out there to talk to robots?”

  “That’s the idea.”

  “Let me remind you that the Desdemona robots are believed to have broken the laws of robotics. They are believed capable of harming man, either by advertence or inadvertence.”

 
