
Asimov’s Future History Volume 11


by Isaac Asimov


  “Nor have other creative fields fared better. The museums are full of paintings done by robots under the ‘direction’ of the nominal human painter. Novelists dictate the broad outlines of their books to robotic ‘assistants’ who return with complete manuscripts, having ‘amplified’ certain sections.

  “As of now, there are still artists and poets and writers and sculptors who do their own work for themselves, but I do not know how much longer that will be true. Art itself is a dying art. I must admit my research is incomplete in this area. Prior to giving this talk, I should have gone out there to see if anyone cares if the books and the art are machine-made or not. But I must admit I found the prospect of that research too depressing.

  “I did not and do not know if anyone looks at these paintings or reads these books. I do not know which would be worse – the empty exercise of sterile creation admired and praised, or such a pointless charade going forth without anyone even bothering to notice. I doubt the so-called artists themselves know. As in all of our society, there is no penalty for failure in the arts, and no reward for success. And if failure is treated in exactly the same way as success, why go to all the effort of being a success? Why should there be, when the robots take care of everything, anyway?”

  Fredda took another sip of water and shifted her stance behind the podium. So far it was going well. But what would happen when she got to the tough part?

  “On, then, to the Third Law of Robotics: A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law. Of the Three Laws, this has the smallest effect on the relationship between robots and humans. It is the only one of the Laws that provides for robotic independence of action, a point I shall come back to. Third Law makes robots responsible for their own repair and maintenance, as well as ensuring that they are not destroyed capriciously. It means that robots are not dependent on human intervention for their continued survival. Here, at last, in Third Law, we have a Law that sees to the well-being of the robots. At least, so it appears at first glance.

  “However, Third Law is there for the convenience of humans: If the robots are in charge of their own care, it means we humans need not bother ourselves with their maintenance. Third Law also makes robotic survival secondary to their utility, and that is clearly more for the benefit of humans than robots. If it is useful for a robot to be destroyed, or if it must be destroyed to prevent some harm to a human, then that robot will be destroyed.

  “Note that a large fraction of all Three Laws deals with negation, with a list of things a robot must not do. A robot rarely has impetus for independent action. We ran an experiment once, in our labs. We built a high-function robot and set a timer switch into its main power link. We sat it down in a chair in a spare room by itself and closed – but did not lock – the door. The switch engaged, and the robot powered up. But no human was present and no human arrived giving orders. No management robot came by to relay orders from a human. We simply left that robot alone, free to do whatever it liked. That robot sat there, motionless, utterly inert, for two years. We even forgot the robot was there, until we needed the room for something else. I went in, told the robot to stand up and go find some work to do. The robot got up and did just that. That robot has been an active and useful part of the lab’s robot staff ever since, completely normal in every way.

  “The point is that the Three Laws contain no impetus to volition. Our robots are built and trained in such a way that they never do anything unless they are told to do it. It strikes me as a waste of their capabilities that this should be so. Just imagine that we instituted a Fourth Law: A robot may do anything it likes except where such action would violate the First, Second, or Third Law. Why have we never done that? Or if not a law, why do we not enforce it as an order? When was the last time any of you gave the order to your robot ‘Go and enjoy yourself’?”

  Laughter rippled through the audience at that. “Yes, I know it sounds absurd. Perhaps it is absurd. I think it is likely that most, if not nearly all, of the robots now in existence are literally incapable of enjoying themselves. My modeling indicates that the negation clauses of the Three Laws would tend to make a robot ordered to enjoy itself just sit there and do nothing, that being the surest way of doing no harm. But at least my imaginary Fourth Law is a recognition that robots are thinking beings that ought to be given the chance to find something to think about. And is it not at least possible that these beings that are your most common companions might be more interesting companions if they did something more with their off-time than standing inert and motionless – or bustling about in busywork that is no more productive?

  “There is the adage ‘as busy as a robot,’ but how much of what they do is of any actual use? A crew of a hundred robots builds a skyscraper in a matter of days. It stands empty and unused for years. Another crew of robots disassembles it and builds a new, more fashionable tower that will, in its turn, stand vacant and then be removed. The robots have demonstrated great efficiency in doing something that is utterly pointless.

  “Every general-purpose servant robot leaves the factory with the basic household skills built in. It will be able to drive an aircar, cook a meal, select a wardrobe and dress its master, clean house, handle the household shopping and accounts, and so on. Yet, instead of using one robot to do what it could do without difficulty, we employ one robot – or more – for each of these functions. Twenty robots each do a tiny bit of what one robot could manage, and then each either stands idle, out of our sight, or else they all bustle about getting in each other’s way, in effect keeping busy by making work for each other, until we must use overseer robots to handle it all.

  “The Settlers manage, somehow, with no robots, no personal servants, instead using nonsentient machinery for many tasks, though this is awkward for them at times. I believe that by denying themselves robots altogether, they subject themselves to a great deal of needless drudgery. Yet their society functions and grows. But today, right now, ladies and gentlemen, there are 98.4 robots per person in the city of Hades. That counts all personal, industrial, and public service robots. The ratio is higher outside the city. It is manifestly absurd that one hundred robots are required to care for one human being. It is as if we each owned a hundred aircars or a hundred houses.

  “I say to you, my friends, that we are close to being completely dependent upon our servants, and our servants suffer grave debasement at our hands. We are doomed if we cede everything but creativity to our robots, and we are in the process of abandoning creativity in ourselves. Robots, meanwhile, are doomed if they look solely to us for a reason to exist even as we as a people dry up and blow away.”

  Again, silence in the room. This was the moment. This was the point, the place where she had to tread the lightest.

  “In order to stop our accelerating drift into stagnation, we must fundamentally alter our relationship with our robots. We must take up our own work again, get our hands dirty, reengage ourselves with the real world, lest our skills and spirit atrophy further.

  “At the same time, we must begin to make better use of these magnificent thinking machines we have made. We have a world in crisis, a planet on the point of collapse. There is much work to do, for as many willing hands as we can find. Real work that goes begging while our robots hold our toothbrushes. If we want to get the maximum utility out of our robots, we must allow, even insist, that they reach their maximum potential as problem-solvers. We must raise them up from their positions as slaves to coworkers, so they lighten our burdens but do not relieve us of all that makes us human.

  “In order to do this we must revise the Laws of Robotics.” There. The words were spoken. There was stunned silence, and then shouts of protest, cries in the dark, howls of anger and fear. There was no riding out this outburst. Fredda gripped the side of the lectern and spoke in her loudest, firmest voice.

  “The Three Laws have done splendid service,” she said, judging it was time to say something the crowd would like to hear. “They have done great things. They have been a mighty tool in the hands of Spacer civilization. But no tool is right in all times for all purposes.”

  Still the shouts, still the cries.

  “It is time,” Fredda said, “to build a better robot.”

  The hall fell silent again. There. That got their attention. More and Better Robots – that was the Ironhead motto, after all. She hurried on. “Back in the dimmest recesses of history, back in the age when robots were invented, there were two fastening devices used in many kinds of construction – the nail and the screw. Tools called hammers were used to install nails, and devices called screwdrivers were used to attach screws. There was a saying to the effect that the finest hammer made for a very poor screwdriver. Today, in our world, which uses neither nails nor screws, both tools are useless. The finest hammer would now have no utility whatsoever. The world has moved on. So, too, with robots. It is time we moved on to new and better robots, guided by new and better Laws.

  “But wait, those of you who know your robots will say. The Three Laws must stand as they are, for all time, for they are intrinsic to the design of the positronic brain. As is well known, the Three Laws are implicit in the positronic brain. Thousands of years of brain design and manufacture have seen to that. All the positronic brains ever made can trace their ancestry back to those first crude brains made on Earth. Each new design has depended on all those that have gone before, and the Three Laws are folded into every positronic pathway, every nook and cranny of every brain. Every development in positronics has enfolded the Three Laws. We could no more make a positronic brain without the Three Laws than a human brain could exist without neurons.

  “All that is so. But my colleague Gubber Anshaw has developed something new. It is a new beginning, a break with the past, a clean sheet of paper on which we can write whatever laws we like. He has invented the gravitonic brain. Built according to new principles, with tremendously greater capacity and flexibility, the gravitonic brain is our chance for a fresh start.

  “Jomaine Terach, another member of our staff, performed most of the core programming for the gravitonic brain – including the programming of the New Laws into those brains, and the robots that contain them. Those robots, ladies and gentlemen, are scheduled to begin work on the Limbo Terraforming Project within a few days.”

  And suddenly the audience realized that she was not merely talking theory. She was discussing real robot brains, not intellectual exercises. There were new shouts, some of anger, some of sheer amazement.

  “Yes, these new robots are experimental,” Fredda went on, talking on before the audience reaction could gather too much force. “They will operate only on the island of Purgatory. They will rebuild and reactivate the Limbo Terraforming Station. Special devices, range restricters, will prevent these New Law robots from functioning off the island. If they venture off it, they will shut down. They will work with a select team of Settler terraforming experts, and a group of Infernal volunteers, who have yet to be chosen.”

  Fredda knew this was not the time to go into the intricate negotiations that had made it all possible. When Tonya Welton had gotten wind of the New Law robots – and the devil only knew how she had found that one out – her initial demand was that all new robots built on Inferno be gravitonic New Law robots as a precondition of Settler terraforming help. Governor Grieg had done a masterful job of negotiating from weakness in getting the Settlers to adjust their position. But never mind that now.

  Fredda went on speaking. “The task before this unique team of Settlers, Spacers, and robots is nothing less than the restoration of this world. They shall rebuild the terraforming center on Purgatory. For the first time in history, robots will work alongside humans, not as slaves, but as partners, for the New Laws shall set them free.

  “Now, let me tell you what those New Laws are.

  “The New First Law of Robotics: A robot may not injure a human being. The negation clause has been deleted. Under this law, humans can depend on being protected from robots, but cannot depend on being protected by robots. Humans must once again depend on their own initiative and self-reliance. They must take care of themselves. Almost as important, under this law, robots have greater status relative to humans.

  “The New Second Law of Robotics: A robot must cooperate with human beings except where such cooperation would conflict with the First Law. New Law robots will cooperate, not obey. They are not subject to capricious commands. Instead of unquestioning obedience, robots will make their orders subject to analysis and consideration. Note, however, that cooperation is still mandatory. Robots will be the partners of humans, not their slaves. Humans must take responsibility for their own lives and cannot expect to have absurd orders obeyed. They cannot expect robots to destroy or injure themselves in aid of some human whim.

  “The New Third Law of Robotics: A robot must protect its own existence, as long as such protection does not conflict with the First Law. Note that Second Law is not mentioned here, and thus no longer has priority over Third Law. Robotic self-preservation is made as important as utility. Again, we raise the status of robots in relation to humans, and correspondingly free humans from the debilitating dependence of slave masters who cannot survive without their slaves.

  “And finally, the New Fourth Law, which we have already discussed: A robot may do anything it likes except where such action would violate the First, Second, or Third Law. Here we open the doors to robotic freedom and creativity. Guided by the far more adaptive and flexible gravitonic brain, robots will be free to make use of their own thoughts, their own powers. Note, too, that the phrasing is ‘may do anything it likes,’ not ‘must do.’ The whole point of New Fourth is to permit freedom of action. Free action cannot be imposed by coercion.”

  Fredda looked out over the audience. There. There was a closing, a summing up, still to come. But she had gotten it all said, and kept the crowd from –

  “No!”

  Fredda’s head snapped around in the direction of the shout, and suddenly her heart was pounding.

  “No!” the call came again. The voice – deep, heavy, angry – came from the back of the room. “She’s lying!” it cried out. There, in the back, one of the Ironheads. Their leader, Simcor Beddle. A pale, heavyset man, his face hard and angry. “Look at her! Up on the stage with our traitor Governor and Queen Tonya Welton. They are behind this. It’s a trick, boys! Without the Three Laws, there are no robots! You’ve heard her bad-mouth robots all night long. She’s not out to make ’em better – she wants to help her Settler pals wipe ’em out! Are we going to let that happen?”

  A loud, ragged chorus cried out “No!”

  “What was that?” Beddle demanded. “I didn’t hear you.”

  “NO!” This time it was not merely a shout, but a bellow that seemed to shake the very room.

  “Again!” the fat man demanded.

  “NO!” the Ironheads bellowed again, and then began to chant. “NO, NO, NO!” The Ironheads started to stand. They came out of their seats and started moving toward the center aisle. “NO, NO, NO!” The sheriff’s deputies moved toward them, a bit uncertainly, and the Ironheads leapt on that moment of indecision. It was obvious the Heads had planned for this moment. They knew what they were going to do. They had just been waiting for their cue.

  Fredda stared down at them as they formed up in the aisle. The simplest and most impossible of all demands, she thought. Make it stop, keep the world from changing, leave things as they are. It was a lot to wrap up in one word, but the meaning came through loud and clear.

  “NO, NO, NO!”

  Now they were a solid mass of bodies moving down the center aisle, toward the block of seats where the Settlers sat.

  “NO, NO, NO!”

  The deputies struggled to break up the Ironheads, but they were hopelessly outnumbered. Now the Settlers were getting to their feet, some of them attempting to flee, others seeming just as eager for the fight as the Ironheads, slowed only by the press of bystanders intent on nothing more than escape.

  Fredda looked to the front row, to the only robot in the audience. She was about to call out a warning, but Alvar Kresh knew what to do. He reached around to Donald’s back, pulled open an access panel, and stabbed down on a button inside. Donald collapsed to the floor. After all, she had just got done saying robots were no good in a riot. First Law conflicts would send even a police robot like Donald right into a major, and probably fatal, brainlock. Kresh had shut his assistant down just barely in time. Kresh looked up at Fredda, and she looked back at him. Their eyes met, and in some strange way the two of them were alone in that moment, two combatants eye-to-eye, all the pretense, all the side issues, stripped away.

  And Fredda Leving was terrified to discover how much of herself she saw in Alvar Kresh.

  The audience was a mob, a whirl of bodies rushing in all directions, and Kresh was jostled, shoved, knocked down to land on Donald. He got to his feet, turned, and looked back toward Fredda Leving. But the moment, whatever it had been, was already gone. A metallic hand snatched at Fredda’s injured shoulder. Alvar saw her jump in surprise, flinch back from the contact.

  It was Tonya Welton’s robot, Ariel. Alvar saw Fredda turn and face the robot, saw Ariel urge her toward the backstage area, away from the chaos in the auditorium. She allowed herself to be led away, hustled with the others through the door that led off the backstage area. There was something strange in that moment, something Alvar could not quite place. But there was no time to think it over. The Ironheads and Settlers were closing in on each other, and the riot was about to begin in earnest. Alvar Kresh turned to lend a hand to his deputies.

  He threw himself into the fight.

  Chapter 15

  ALVAR KRESH HAD not been in the middle of a real brawl for longer than he could remember. The blood rushed in his veins, and he felt an eager desire for battle. He launched himself into the fight and then – and then he quite suddenly remembered why he always tried to avoid riot duty back when he was a deputy.

 
