The Rest of the Robots


by Isaac Asimov


  'The laws, Professor Goodfellow, are not simple ones. Robots may not be used on public thoroughfares or within public edifices. They may not be used on private grounds or within private structures except under certain restrictions that usually turn out to be prohibitive. The university, however, is a large and privately owned institution that usually receives preferential treatment. If the robot is used only in a specific room for only academic purposes, if certain other restrictions are observed and if the men and women having occasion to enter the room cooperate fully, we may remain within the law.'

  'But all that trouble just to read proof?'

  'The uses would be infinite, Professor. Robotic labor has so far been used only to relieve physical drudgery. Isn't there such a thing as mental drudgery? When a professor capable of the most useful creative thought is forced to spend two weeks painfully checking the spelling of lines of print and I offer you a machine that can do it in thirty minutes, is that picayune?'

  'But the price———'

  'The price need not bother you. You cannot buy EZ-27. U.S. Robots does not sell its products. But the university can lease EZ-27 for a thousand dollars a year—considerably less than the cost of a single microwave spectrograph continuous-recording attachment.'

  Goodfellow looked stunned. Lanning followed up his advantage by saying, 'I only ask that you put it up to whatever group makes the decisions here. I would be glad to speak to them if they want more information.'

  'Well,' Goodfellow said doubtfully, 'I can bring it up at next week's Senate meeting. I can't promise that will do any good, though.'

  'Naturally,' said Lanning.

  The Defense Attorney was short and stubby and carried himself rather portentously, a stance that had the effect of accentuating his double chin. He stared at Professor Goodfellow, once that witness had been handed over, and said, 'You agreed rather readily, did you not?'

  The professor said briskly, 'I suppose I was anxious to be rid of Dr. Lanning. I would have agreed to anything.'

  'With the intention of forgetting about it after he left?'

  'Well———'

  'Nevertheless, you did present the matter to a meeting of the Executive Board of the University Senate.'

  'Yes, I did.'

  'So that you agreed in good faith with Dr. Lanning's suggestions. You weren't just going along with a gag. You actually agreed enthusiastically, did you not?'

  'I merely followed ordinary procedures.'

  'As a matter of fact, you weren't as upset about the robot as you now claim you were. You know the Three Laws of Robotics and you knew them at the time of your interview with Dr. Lanning.'

  'Well, yes.'

  'And you were perfectly willing to leave a robot at large and unattended.'

  'Dr. Lanning assured me———'

  'Surely you would never have accepted his assurance if you had had the slightest doubt that the robot might be in the least dangerous.'

  The professor began frigidly, 'I had every faith in the word———'

  'That is all,' said Defense abruptly.

  As Professor Goodfellow, more than a bit ruffled, stood down, Justice Shane leaned forward and said, 'Since I am not a robotics man myself, I would appreciate knowing precisely what the Three Laws of Robotics are. Would Dr. Lanning quote them for the benefit of the court?'

  Dr. Lanning looked startled. He had been virtually bumping heads with the gray-haired woman at his side. He rose to his feet now and the woman looked up, too—expressionlessly.

  Dr. Lanning said, 'Very well, Your Honor.' He paused as though about to launch into an oration and said, with laborious clarity, 'First Law: a robot may not injure a human being, or, through inaction, allow a human being to come to harm. Second Law: a robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. Third Law: a robot must protect its own existence as long as such protection does not conflict with the First or Second Law.'

  'I see,' said the judge, taking rapid notes. 'These Laws are built into every robot, are they?'

  'Into every one. That will be borne out by any roboticist.'

  'And into Robot EZ-27 specifically?'

  'Yes, Your Honor.'

  'You will probably be required to repeat those statements under oath.'

  'I am ready to do so, Your Honor.'

  He sat down again.

  Dr. Susan Calvin, robopsychologist-in-chief for U.S. Robots, who was the gray-haired woman sitting next to Lanning, looked at her titular superior without favor, but then she showed favor to no human being. She said, 'Was Goodfellow's testimony accurate, Alfred?'

  'Essentially,' muttered Lanning. 'He wasn't as nervous as all that about the robot and he was anxious enough to talk business with me when he heard the price. But there doesn't seem to be any drastic distortion.'

  Dr. Calvin said thoughtfully, 'It might have been wise to put the price higher than a thousand.'

  'We were anxious to place Easy.'

  'I know. Too anxious, perhaps. They'll try to make it look as though we had an ulterior motive.'

  Lanning looked exasperated. 'We did. I admitted that at the University Senate meeting.'

  'They can make it look as if we had one beyond the one we admitted.'

  Scott Robertson, son of the founder of U.S. Robots and still owner of a majority of the stock, leaned over from Dr. Calvin's other side and said in a kind of explosive whisper, 'Why can't you get Easy to talk so we'll know where we're at?'

  'You know he can't talk about it, Mr. Robertson.'

  'Make him. You're the psychologist, Dr. Calvin. Make him.'

  'If I'm the psychologist, Mr. Robertson,' said Susan Calvin coldly, 'let me make the decisions. My robot will not be made to do anything as the price of his well-being.'

  Robertson frowned and might have answered, but Justice Shane was tapping his gavel in a polite sort of way and they grudgingly fell silent.

  Francis J. Hart, head of the Department of English and Dean of Graduate Studies, was on the stand. He was a plump man, meticulously dressed in dark clothing of a conservative cut, and possessing several strands of hair traversing the pink top of his cranium. He sat well back in the witness chair with his hands folded neatly in his lap and displaying, from time to time, a tight-lipped smile.

  He said, 'My first connection with the matter of the Robot EZ-27 was on the occasion of the session of the University Senate Executive Committee at which the subject was introduced by Professor Goodfellow. Thereafter, on the tenth of April of last year, we held a special meeting on the subject, during which I was in the chair.'

  'Were minutes kept of the meeting of the Executive Committee? Of the special meeting, that is?'

  'Well, no. It was a rather unusual meeting.' The dean smiled briefly. 'We thought it might remain confidential.'

  'What transpired at the meeting?'

  Dean Hart was not entirely comfortable as chairman of that meeting. Nor did the other members assembled seem completely calm. Only Dr. Lanning appeared at peace with himself. His tall, gaunt figure and the shock of white hair that crowned him reminded Hart of portraits he had seen of Andrew Jackson.

  Samples of the robot's work lay scattered along the central regions of the table and the reproduction of a graph drawn by the robot was now in the hands of Professor Minott of Physical Chemistry. The chemist's lips were pursed in obvious approval.

  Hart cleared his throat and said, 'There seems no doubt that the robot can perform certain routine tasks with adequate competence. I have gone over these, for instance, just before coming in and there is very little to find fault with.'

  He picked up a long sheet of printing, some three times as long as the average book page. It was a sheet of galley proof, designed to be corrected by authors before the type was set up in page form. Along both of the wide margins of the galley were proof marks, neat and superbly legible.

  Occasionally, a word of print was crossed out and a new word substituted in the margin in characters so fine and regular it might easily have been print itself. Some of the corrections were blue to indicate the original mistake had been the author's, a few in red, where the printer had been wrong.

  'Actually,' said Lanning, 'there is less than very little to find fault with. I should say there is nothing at all to find fault with, Dr. Hart. I'm sure the corrections are perfect, insofar as the original manuscript was. If the manuscript against which this galley was corrected was at fault in a matter of fact rather than of English, the robot is not competent to correct it.'

  'We accept that. However, the robot corrected word order on occasion and I don't think the rules of English are sufficiently hidebound for us to be sure that in each case the robot's choice was the correct one.'

  'Easy's positronic brain,' said Lanning, showing large teeth as he smiled, 'has been molded by the contents of all the standard works on the subject. I'm sure you cannot point to a case where the robot's choice was definitely the incorrect one.'

  Professor Minott looked up from the graph he still held. 'The question in my mind, Dr. Lanning, is why we need a robot at all, with all the difficulties in public relations that would entail. The science of automation has surely reached the point where your company could design a machine, an ordinary computer of a type known and accepted by the public, that would correct galleys.'

  'I am sure we could,' said Lanning stiffly, 'but such a machine would require that the galleys be translated into special symbols or, at the least, transcribed on tapes. Any corrections would emerge in symbols. You would need to keep men employed translating words to symbols, symbols to words. Furthermore, such a computer could do no other job. It couldn't prepare the graph you hold in your hand, for instance.'

  Minott grunted.

  Lanning went on. 'The hallmark of the positronic robot is its flexibility. It can do a number of jobs. It is designed like a man so that it can use all the tools and machines that have, after all, been designed to be used by a man. It can talk to you and you can talk to it. You can actually reason with it up to a point. Compared to even a simple robot, an ordinary computer with a non-positronic brain is only a heavy adding machine.'

  Goodfellow looked up and said, 'If we all talk and reason with the robot, what are the chances of our confusing it? I suppose it doesn't have the capability of absorbing an infinite amount of data.'

  'No, it hasn't. But it should last five years with ordinary use. It will know when it will require clearing, and the company will do the job without charge.'

  'The company will?'

  'Yes. The company reserves the right to service the robot outside the ordinary course of its duties. It is one reason we retain control of our positronic robots and lease rather than sell them. In the pursuit of its ordinary functions, any robot can be directed by any man. Outside its ordinary functions, a robot requires expert handling, and that we can give it. For instance, any of you might clear an EZ robot to an extent by telling it to forget this item or that. But you would be almost certain to phrase the order in such a way as to cause it to forget too much or too little. We would detect such tampering, because we have built-in safeguards. However, since there is no need to clear the robot in its ordinary work, or to do other useless things, the problem does not arise.'

  Dean Hart touched his head as though to make sure his carefully cultivated strands lay evenly distributed and said, 'You are anxious to have us take the machine. Yet surely it is a losing proposition for U.S. Robots. One thousand a year is a ridiculously low price. Is it that you hope through this to rent other such machines to other universities at a more reasonable price?'

  'Certainly that's a fair hope,' said Lanning.

  'But even so, the number of machines you could rent would be limited. I doubt if you could make it a paying proposition.'

  Lanning put his elbows on the table and earnestly leaned forward. 'Let me put it bluntly, gentlemen. Robots cannot be used on Earth, except in certain special cases, because of prejudice against them on the part of the public. U.S. Robots is a highly successful corporation with our extraterrestrial and spaceflight markets alone, to say nothing of our computer subsidiaries. However, we are concerned with more than profits alone. It is our firm belief that the use of robots on Earth itself would mean a better life for all eventually, even if a certain amount of economic dislocation resulted at first.

  'The labor unions are naturally against us, but surely we may expect cooperation from the large universities. The robot, Easy, will help you by relieving you of scholastic drudgery—by assuming, if you permit it, the role of galley slave for you. Other universities and research institutions will follow your lead, and if it works out, then perhaps other robots of other types may be placed and the public's objections to them broken down by stages.'

  Minott murmured, 'Today Northeastern University, tomorrow the world.'

  Angrily, Lanning whispered to Susan Calvin, 'I wasn't nearly that eloquent and they weren't nearly that reluctant. At a thousand a year, they were jumping to get Easy. Professor Minott told me he'd never seen as beautiful a job as that graph he was holding and there was no mistake on the galley or anywhere else. Hart admitted it freely.'

  The severe vertical lines on Dr. Calvin's face did not soften. 'You should have demanded more money than they could pay, Alfred, and let them beat you down.'

  'Maybe,' he grumbled.

  Prosecution was not quite done with Professor Hart. 'After Dr. Lanning left, did you vote on whether to accept Robot EZ-27?'

  'Yes, we did.'

  'With what result?'

  'In favor of acceptance, by majority vote.'

  'What would you say influenced the vote?'

  Defense objected immediately.

  Prosecution rephrased the question. 'What influenced you, personally, in your individual vote? You did vote in favor, I think.'

  'I voted in favor, yes. I did so largely because I was impressed by Dr. Lanning's feeling that it was our duty as members of the world's intellectual leadership to allow robotics to help Man in the solution of his problems.'

  'In other words, Dr. Lanning talked you into it.'

  'That's his job. He did it very well.'

  'Your witness.'

  Defense strode up to the witness chair and surveyed Professor Hart for a long moment. He said, 'In reality, you were all pretty eager to have Robot EZ-27 in your employ, weren't you?'

  'We thought that if it could do the work, it might be useful.'

  'If it could do the work? I understand you examined the samples of Robot EZ-27's original work with particular care on the day of the meeting which you have just described.'

  'Yes, I did. Since the machine's work dealt primarily with the handling of the English language, and since that is my field of competence, it seemed logical that I be the one chosen to examine the work.'

  'Very good. Was there anything on display on the table at the time of the meeting which was less than satisfactory? I have all the material here as exhibits. Can you point to a single unsatisfactory item?'

  'Well———'

  'It's a simple question. Was there one single solitary unsatisfactory item? You inspected it. Was there?'

  The English professor frowned. 'There wasn't.'

  'I also have some samples of work done by Robot EZ-27 during the course of his fourteen-month employ at Northeastern. Would you examine these and tell me if there is anything wrong with them in even one particular?'

  Hart snapped, 'When he did make a mistake, it was a beauty.'

  'Answer my question,' thundered Defense, 'and only the question I am putting to you! Is there anything wrong with the material?'

  Dean Hart looked cautiously at each item. 'Well, nothing.'

  'Barring the matter concerning which we are here engaged, do you know of any mistake on the part of EZ-27?'

  'Barring the matter for which this trial is being held, no.'

  Defense cleared his throat as though to signal end of paragraph. He said, 'Now about the vote concerning whether Robot EZ-27 was to be employed or not. You said there was a majority in favor. What was the actual vote?'

  'Thirteen to one, as I remember.'

  'Thirteen to one! More than just a majority, wouldn't you say?'

  'No, sir!' All the pedant in Dean Hart was aroused. 'In the English language, the word "majority" means "more than half." Thirteen out of fourteen is a majority, nothing more.'

  'But an almost unanimous one.'

  'A majority all the same!'

  Defense switched ground. 'And who was the lone holdout?'

  Dean Hart looked acutely uncomfortable. 'Professor Simon Ninheimer.'

  Defense pretended astonishment. 'Professor Ninheimer? The head of the Department of Sociology?'

  'Yes, sir.'

  'The plaintiff?'

  'Yes, sir.'

  Defense pursed his lips. 'In other words, it turns out that the man bringing the action for payment of $750,000 damages against my client, United States Robots and Mechanical Men, Incorporated, was the one who from the beginning opposed the use of the robot—although everyone else on the Executive Committee of the University Senate was persuaded that it was a good idea.'

  'He voted against the motion, as was his right.'

  'You didn't mention in your description of the meeting any remarks made by Professor Ninheimer. Did he make any?'

  'I think he spoke.'

  'You think?'

  'Well, he did speak.'

  'Against using the robot?'

  'Yes.'

  'Was he violent about it?'

  Dean Hart paused. 'He was vehement.'

  Defense grew confidential. 'How long have you known Professor Ninheimer, Dean Hart?'

  'About twelve years.'

  'Reasonably well?'

  'I should say so, yes.'

  'Knowing him, then, would you say he was the kind of man who might continue to bear resentment against a robot, all the more so because an adverse vote had———'

  Prosecution drowned out the remainder of the question with an indignant and vehement objection of his own. Defense motioned the witness down and Justice Shane called luncheon recess.

  Robertson mangled his sandwich. The Corporation would not founder for loss of three-quarters of a million, but the loss would do it no particular good. He was conscious, moreover, that there would be a much more costly long-term setback in public relations.
