by Isaac Asimov
“You seem certain,” said Justice Shane, in renewed ill-temper, “that judgment on this point will be in your favor.”
“Not at all, Your Honor. If it is not, we simply turn the truck about. I have made no presumptions concerning your decision.”
The judge nodded. “The request on the part of the Defense is granted.”
The crate was carried in on a large dolly and the two men who handled it opened it. The courtroom was immersed in a dead silence.
Susan Calvin waited as the thick slabs of celluform went down, then held out one hand. “Come, Easy.”
The robot looked in her direction and held out its large metal arm. It towered over her by two feet but followed meekly, like a child in the clasp of its mother. Someone giggled nervously and choked it off at a hard glare from Dr. Calvin.
Easy seated itself carefully in a large chair brought by the bailiff, which creaked but held.
Defense said, “When it becomes necessary, Your Honor, we will prove that this is actually Robot EZ-27, the specific robot in the employ of Northeastern University during the period of time with which we are concerned.”
“Good,” His Honor said. “That will be necessary. I, for one, have no idea how you can tell one robot from another.”
“And now,” said Defense, “I would like to call my first witness to the stand. Professor Simon Ninheimer, please.”
The clerk hesitated, looked at the judge. Justice Shane asked, with visible surprise, “You are calling the plaintiff as your witness?”
“Yes, Your Honor.”
“I hope that you’re aware that as long as he’s your witness, you will be allowed none of the latitude you might exercise if you were cross-examining an opposing witness.”
Defense said smoothly, “My only purpose in all this is to arrive at the truth. It will not be necessary to do more than ask a few polite questions.”
“Well,” said the judge dubiously, “you’re the one handling the case. Call the witness.”
Ninheimer took the stand and was informed that he was still under oath. He looked more nervous than he had the day before, almost apprehensive.
But Defense looked at him benignly.
“Now, Professor Ninheimer, you are suing my clients in the amount of $750,000.”
“That is the – uh – sum. Yes.”
“That is a great deal of money.”
“I have suffered a great deal of harm.”
“Surely not that much. The material in question involves only a few passages in a book. Perhaps these were unfortunate passages, but after all, books sometimes appear with curious mistakes in them.”
Ninheimer’s nostrils flared. “Sir, this book was to have been the climax of my professional career! Instead, it makes me look like an incompetent scholar, a perverter of the views held by my honored friends and associates, and a believer of ridiculous and – uh – outmoded viewpoints. My reputation is irretrievably shattered! I can never hold up my head in any – uh – assemblage of scholars, regardless of the outcome of this trial. I certainly cannot continue in my career, which has been the whole of my life. The very purpose of my life has been – uh – aborted and destroyed.”
Defense made no attempt to interrupt the speech, but stared abstractedly at his fingernails as it went on.
He said very soothingly, “But surely, Professor Ninheimer, at your present age, you could not hope to earn more than – let us be generous – $150,000 during the remainder of your life. Yet you are asking the court to award you five times as much.”
Ninheimer said, with an even greater burst of emotion, “It is not in my lifetime alone that I am ruined. I do not know for how many generations I shall be pointed at by sociologists as a – uh – a fool or maniac. My real achievements will be buried and ignored. I am ruined not only until the day of my death, but for all time to come, because there will always be people who will not believe that a robot made those insertions –”
It was at this point that Robot EZ-27 rose to his feet. Susan Calvin made no move to stop him. She sat motionless, staring straight ahead. Defense sighed softly.
Easy’s melodious voice carried clearly. It said, “I would like to explain to everyone that I did insert certain passages in the galley proofs that seemed directly opposed to what had been there at first –”
Even the Prosecuting Attorney was too startled at the spectacle of a seven-foot robot rising to address the court to be able to demand the stopping of what was obviously a most irregular procedure.
When he could collect his wits, it was too late. For Ninheimer rose in the witness chair, his face working.
He shouted wildly, “Damn you, you were instructed to keep your mouth shut about –”
He ground to a choking halt, and Easy was silent, too. Prosecution was on his feet now, demanding that a mistrial be declared.
Justice Shane banged his gavel desperately. “Silence! Silence! Certainly there is every reason here to declare a mistrial, except that in the interests of justice I would like to have Professor Ninheimer complete his statement. I distinctly heard him say to the robot that the robot had been instructed to keep its mouth shut about something. There was no mention in your testimony, Professor Ninheimer, as to any instructions to the robot to keep silent about anything!”
Ninheimer stared wordlessly at the judge. Justice Shane said, “Did you instruct Robot EZ-27 to keep silent about something? And if so, about what?”
“Your Honor –” began Ninheimer hoarsely, and couldn’t continue.
The judge’s voice grew sharp. “Did you, in fact, order the inserts in question to be made in the galleys and then order the robot to keep quiet about your part in this?”
Prosecution objected vigorously, but Ninheimer shouted, “Oh, what’s the use? Yes! Yes!” And he ran from the witness stand. He was stopped at the door by the bailiff and sank hopelessly into one of the last rows of seats, head buried in both hands.
Justice Shane said, “It is evident to me that Robot EZ-27 was brought here as a trick. Except for the fact that the trick served to prevent a serious miscarriage of justice, I would certainly hold attorney for the Defense in contempt. It is clear now, beyond any doubt, that the plaintiff has committed what is to me a completely inexplicable fraud since, apparently, he was knowingly ruining his career in the process –”
Judgment, of course, was for the defendant.
Dr. Susan Calvin had herself announced at Dr. Ninheimer’s bachelor quarters in University Hall. The young engineer who had driven the car offered to go up with her, but she looked at him scornfully.
“Do you think he’ll assault me? Wait down here.”
Ninheimer was in no mood to assault anyone. He was packing, wasting no time, anxious to be away before the adverse conclusion of the trial became general knowledge.
He looked at Calvin with a queerly defiant air and said, “Are you coming to warn me of a countersuit? If so, it will get you nothing. I have no money, no job, no future. I can’t even meet the costs of the trial.”
“If you’re looking for sympathy,” said Calvin coldly, “don’t look for it here. This was your doing. However, there will be no countersuit, neither of you nor of the university. We will even do what we can to keep you from going to prison for perjury. We aren’t vindictive.”
“Oh, is that why I’m not already in custody for forswearing myself? I had wondered. But then,” he added bitterly, “why should you be vindictive? You have what you want now.”
“Some of what we want, yes,” said Calvin. “The university will keep Easy in its employ at a considerably higher rental fee. Furthermore, certain underground publicity concerning the trial will make it possible to place a few more of the EZ models in other institutions without danger of a repetition of this trouble.”
“Then why have you come to see me?”
“Because I don’t have all of what I want yet. I want to know why you hate robots as you do. Even if you had won the case, your reputation would have been ruined. The money you might have obtained could not have compensated for that. Would the satisfaction of your hatred for robots have done so?”
“Are you interested in human minds, Dr. Calvin?” asked Ninheimer, with acid mockery.
“Insofar as their reactions concern the welfare of robots, yes. For that reason, I have learned a little of human psychology.”
“Enough of it to be able to trick me?”
“That wasn’t hard,” said Calvin, without pomposity. “The difficult thing was doing it in such a way as not to damage Easy.”
“It is like you to be more concerned for a machine than for a man.” He looked at her with savage contempt.
It left her unmoved. “It merely seems so, Professor Ninheimer. It is only by being concerned for robots that one can truly be concerned for twenty-first-century man. You would understand this if you were a roboticist.”
“I have read enough robotics to know I don’t want to be a roboticist!”
“Pardon me, you have read a book on robotics. It has taught you nothing. You learned enough to know that you could order a robot to do many things, even to falsify a book, if you went about it properly. You learned enough to know that you could not order him to forget something entirely without risking detection, but you thought you could order him into simple silence more safely. You were wrong.”
“You guessed the truth from his silence?”
“It wasn’t guessing. You were an amateur and didn’t know enough to cover your tracks completely. My only problem was to prove the matter to the judge and you were kind enough to help us there, in your ignorance of the robotics you claim to despise.”
“Is there any purpose in this discussion?” asked Ninheimer wearily.
“For me, yes,” said Susan Calvin, “because I want you to understand how completely you have misjudged robots. You silenced Easy by telling him that if he told anyone about your own distortion of the book, you would lose your job. That set up a certain potential within Easy toward silence, one that was strong enough to resist our efforts to break it down. We would have damaged the brain if we had persisted.
“On the witness stand, however, you yourself put up a higher counterpotential. You said that because people would think that you, not a robot, had written the disputed passages in the book, you would lose far more than just your job. You would lose your reputation, your standing, your respect, your reason for living. You would lose the memory of you after death. A new and higher potential was set up by you – and Easy talked.”
“Oh, God,” said Ninheimer, turning his head away. Calvin was inexorable. She said, “Do you understand why he talked? It was not to accuse you, but to defend you! It can be mathematically shown that he was about to assume full blame for your crime, to deny that you had anything to do with it. The First Law required that. He was going to lie – to damage himself – to bring monetary harm to a corporation. All that meant less to him than did the saving of you. If you really understood robots and robotics, you would have let him talk. But you did not understand, as I was sure you wouldn’t, as I guaranteed to the defense attorney that you wouldn’t. You were certain, in your hatred of robots, that Easy would act as a human being would act and defend itself at your expense. So you flared out at him in panic – and destroyed yourself.”
Ninheimer said with feeling, “I hope some day your robots turn on you and kill you!”
“Don’t be foolish,” said Calvin. “Now I want you to explain why you’ve done all this.”
Ninheimer grinned a distorted, humorless grin. “I am to dissect my mind, am I, for your intellectual curiosity, in return for immunity from a charge of perjury?”
“Put it that way if you like,” said Calvin emotionlessly. “But explain.”
“So that you can counter future anti-robot attempts more efficiently? With greater understanding?”
“I accept that.”
“You know,” said Ninheimer, “I’ll tell you – just to watch it do you no good at all. You can’t understand human motivation. You can only understand your damned machines because you’re a machine yourself, with skin on.”
He was breathing hard and there was no hesitation in his speech, no searching for precision. It was as though he had no further use for precision.
He said, “For two hundred and fifty years, the machine has been replacing Man and destroying the handcraftsman. Pottery is spewed out of molds and presses. Works of art have been replaced by identical gimcracks stamped out on a die. Call it progress, if you wish! The artist is restricted to abstractions, confined to the world of ideas. He must design something in mind – and then the machine does the rest.
“Do you suppose the potter is content with mental creation? Do you suppose the idea is enough? That there is nothing in the feel of the clay itself, in watching the thing grow as hand and mind work together? Do you suppose the actual growth doesn’t act as a feedback to modify and improve the idea?”
“You are not a potter,” said Dr. Calvin.
“I am a creative artist! I design and build articles and books. There is more to it than the mere thinking of words and of putting them in the right order. If that were all, there would be no pleasure in it, no return.
“A book should take shape in the hands of the writer. One must actually see the chapters grow and develop. One must work and rework and watch the changes take place beyond the original concept even. There is taking the galleys in hand and seeing how the sentences look in print and molding them again. There are a hundred contacts between a man and his work at every stage of the game and the contact itself is pleasurable and repays a man for the work he puts into his creation more than anything else could. Your robot would take all that away.”
“So does a typewriter. So does a printing press. Do you propose to return to the hand illumination of manuscripts?”
“Typewriters and printing presses take away some, but your robot would deprive us of all. Your robot takes over the galleys. Soon it, or other robots, would take over the original writing, the searching of the sources, the checking and cross-checking of passages, perhaps even the deduction of conclusions. What would that leave the scholar? One thing only – the barren decisions concerning what orders to give the robot next! I want to save the future generations of the world of scholarship from such a final hell. That meant more to me than even my own reputation and so I set out to destroy U. S. Robots by whatever means.”
“You were bound to fail,” said Susan Calvin.
“I was bound to try,” said Simon Ninheimer.
Calvin turned and left. She did her best to feel no pang of sympathy for the broken man.
She did not entirely succeed.
First Law
2035 A.D.
MIKE DONOVAN LOOKED at his empty beer mug, felt bored, and decided he had listened long enough. He said, loudly, “If we’re going to talk about unusual robots, I once knew one that disobeyed the First Law.”
And since that was completely impossible, everyone stopped talking and turned to look at Donovan.
Donovan regretted his big mouth at once and changed the subject. “I heard a good one yesterday,” he said, conversationally, “about –”
MacFarlane in the chair next to Donovan’s said, “You mean you knew a robot that harmed a human being?” That was what disobedience to First Law meant, of course.
“In a way,” said Donovan. “I say I heard one about –”
“Tell us about it,” ordered MacFarlane. Some of the others banged their beer mugs on the table.
Donovan made the best of it. “It happened on Titan about ten years ago,” he said, thinking rapidly. “Yes, it was in twenty-five. We had just recently received a shipment of three new-model robots, specially designed for Titan. They were the first of the MA models. We called them Emma One, Two and Three.” He snapped his fingers for another beer and stared earnestly after the waiter. Let’s see, what came next?
MacFarlane said, “I’ve been in robotics half my life, Mike. I never heard of an MA serial order.”
“That’s because they took the MA’s off the assembly lines immediately after – after what I’m going to tell you. Don’t you remember?”
“No.”
Donovan continued hastily. “We put the robots to work at once. You see, until then, the Base had been entirely useless during the stormy season, which lasts eighty percent of Titan’s revolution about Saturn. During the terrific snows, you couldn’t find the Base if it were only a hundred yards away. Compasses aren’t any use, because Titan hasn’t any magnetic field.
“The virtue of these MA robots, however, was that they were equipped with vibro-detectors of a new design so that they could make a beeline for the Base through anything, and that meant mining could become a through-the-revolution affair. And don’t say a word, Mac. The vibro-detectors were taken off the market also, and that’s why you haven’t heard of them.” Donovan coughed. “Military secret, you understand.”
He went on. “The robots worked fine during the first stormy season, then at the start of the calm season, Emma Two began acting up. She kept wandering off into corners and under bales and had to be coaxed out. Finally she wandered off Base altogether and didn’t come back. We decided there had been a flaw in her manufacture and got along with the other two. Still, it meant we were shorthanded, or short-roboted anyway, so when toward the end of the calm season, someone had to go to Kornsk, I volunteered to chance it without a robot. It seemed safe enough; the storms weren’t due for two days and I’d be back in twenty hours at the outside.
“I was on the way back – a good ten miles from Base – when the wind started blowing and the air thickening. I landed my air car immediately before the wind could smash it, pointed myself toward the Base and started running. I could run the distance in the low gravity all right, but could I run a straight line? That was the question. My air supply was ample and my suit heat coils were satisfactory, but ten miles in a Titanian storm is infinity.
“Then, when the snow streams changed everything to a dark, gooey twilight, with even Saturn dimmed out and the sun only a pale pimple, I stopped short and leaned against the wind. There was a little dark object right ahead of me. I could barely make it out but I knew what it was. It was a storm pup; the only living thing that could stand a Titanian storm, and the most vicious living thing anywhere. I knew my space suit wouldn’t protect me, once it made for me, and in the bad light, I had to wait for a point-blank aim or I didn’t dare shoot. One miss and he would be at me.