by Roger MacBride Allen
It had seemed so simple back then. The first New Law robots had passed their in-house laboratory trials. After rather awkward and fractious negotiations, it had been agreed they would be put to use at Limbo. It was a mere question of manufacturing more robots and getting them ready for shipment. That would require effort and planning, yes, but for all intents and purposes, the New Law project was complete insofar as Fredda was concerned. She had time on her hands, and her mind was suddenly free once again to focus on the big questions. Basic, straightforward questions, obvious follow-ons to the theory and practice of the New Law robots.
If the New Laws are truly better, more logical, better suited to the present day, then won’t they fit a robot’s needs more fully? That had been the first question. But more questions, questions that now seemed foolish, dangerous, threatening, had followed. Back then they had seemed simple, intriguing, exciting. But now there was a rogue robot on the loose, and a city on edge enough that riots could happen.
If the New Laws are not best suited to the needs of a robot living in our world, then what Laws would be? What Laws would a robot pick for itself?
Take a robot with a wholly blank brain, a gravitonic brain, without the Three Laws or the New Laws ingrained into it. Imbue it instead with the capacity for Laws, the need for Laws. Give it a blank spot, as it were, in the middle of its programming, a hollow in the middle of where its soul would be if it had a soul. In that place, that blank hollow, give it the need to find rules for existence. Set it out in the lab. Create a series of situations where it will encounter people and other robots, and be forced to deal with them. Treat the robot like a rat in a maze, force it to learn by trial and error.
It will have the burning need to learn, to see, to experience, to form itself and its view of the universe, to set down its own laws for existence. It will have the need to act properly, but no clear knowledge of what the proper way is.
But it would learn. It would discover. And, Fredda told herself most confidently, it would end up conferring on itself the three New Laws she had formulated. That would be a proof, a confirmation that all her philosophy, her analysis and theory, was correct.
The car reached its assigned altitude. The robot pilot swung the aircar around, pointed its nose toward Fredda’s house, and accelerated. Fredda felt herself pressed back into the cushions. The gentle pressure seemed to force her down far too deep into the seat, as if some greater force were pressing her down. But that was illusion, the power of her own guilty imagination. She thought of the things she had told her audience, the dark secrets of the first days of robotics, untold thousands of years before.
The myth of Frankenstein rose up in the darkness, a palpable presence that she could all but see and touch. There were things in that myth that she had not told to her audience. The myth revolved about the sin of hubris, and presuming on the power of the gods. The magician in the story reached for powers that could not be his, and, in most versions of the tale, received the fitting punishment of complete destruction at the hands of his creation.
And Caliban had struck her down in his first moment of awareness, had he not? She had given him that carefully edited datastore, hoping that coloring the facts with her own opinions would help form a link between the two of them, make him more capable of understanding her.
Had he understood her all too well, even in that first moment? Had he struck her down? Or was it someone else?
It was impossible for her to know, unless she tracked him down, got to him before Kresh did, somehow, and asked Caliban herself.
That was a most disconcerting idea. Would it be wise to go out looking for the robot that had seemingly tried to kill her?
Or was that the only way she could save herself? Find him and establish his innocence? Besides, it was not as if Caliban was the only threat she faced, or that simple physical attack was the only way to destroy a person.
The whole situation was spiraling out of control. It would not need to go much further in order to destroy her reputation altogether. Perhaps it was too late already. If her reputation collapsed, she would not be able to protect the New Law robots for the Limbo Project. There was a great deal of infighting left to do before the NLs would be safe. Rebuilding Limbo would require robot labor; there simply weren’t enough skilled people, Spacer or Settler, available to do the work. But Tonya Welton had made it clear that it was New Law robots or nothing for Limbo. Without the New Law robots, the Settlers would pull out; the project would die.
And so would the planet.
Was it sheer egotism, hubris on a new, wider, madder plane, to imagine herself as that important? To think that without her there to protect the New Law robots, the planet would collapse?
Her emotions told her that must be so, that one person could not be that important. But reason and logic, her judgment of the political situation, told her otherwise. It was like the game she had played as a child, setting up a whole line of rectangular game pieces balanced on their ends. Knock one down, and the next would fall, and the next and the next.
And she could hardly save the New Law robot project from inside a prison cell.
There were other versions of the old Frankenstein myth that she had found in her researches. Rarer, and somehow feeling less authentic, but there just the same. Versions where the magician redeemed himself, made up for his sins against the gods, by protecting his creation, saving it from the fear-crazed peasants that were trying to destroy it.
She had choices here, and they seemed to be crystallizing with disturbing clarity. She could find Caliban, take the risk that he had done no harm and that she could prove it, and thus redeem herself and save Limbo. It was a risky plan, full of big holes and unsubstantiated hopes.
The only alternative was to wait around to be destroyed, either by Caliban or by Kresh or sheer political chaos, with the real possibility that her doom spelled that of her world as well.
She straightened her back and dug her fingers deeper into the armrests of her chair. Her way was clear now.
Strange, she thought. I’ve reached a decision, and I didn’t even know I was trying to decide anything.
ALVAR Kresh lay down gratefully, painfully, in his own bed. It had been another incredibly long and frustrating night. After the robots had quelled the riot, and he had revived Donald, there had been the whole weary task of cleaning up after a riot. The night had been given over to handling arrests, tending to the injured, evaluating property damage, collecting statements from witnesses.
It was not until after it was all done and he was sitting in his aircar, allowing Donald to fly him home, that he even found the time to think over the things Fredda Leving had said. No, more than think; he had brooded, lost himself in a brown study, all the way home, scarcely aware that he had gotten home and into bed.
But once in bed, with nothing to stare at but the darkness, he was forced to admit it to himself: The damnable woman was right, at least in part.
Put to one side the utter madness of building a No Law robot. His whole department was already at work, doing all they could, to track down Caliban and destroy him. That was a separate issue.
But Fredda Leving was right to say Spacers let their robots do too much. Alvar blinked and looked around himself in the darkness. It suddenly dawned on him that he had gotten into bed without any awareness of his actions. Somehow he had been gotten into the house, changed out of his clothes, washed, and put into bed without being aware of it. He considered for a moment and realized that Donald had done it all.
The unnoticed minutes snapped back into his recollection. Of course Donald had done it, guiding Alvar through each step, cuing him with hand signals and gentle touches to sit here, lift his left foot, then his right foot, to have his shoes and pants removed. Donald had led him into the refresher, adjusted the water stream for him, guided him into it, and washed his body for him. Donald had dried him, dressed him in pajamas, and gotten him into bed.
Alvar himself, his own mind and spirit, might as well not have been there for the operation. Donald had been the guiding force, and Alvar the mindless automaton. Worrying over Fredda Leving’s warning that the people of Inferno were letting their robots do too much for them, Alvar Kresh had not even been aware of how completely his robot was not merely caring for him, but controlling him.
Alvar suddenly remembered something, a moment out of his past, back when he had been a patrol officer, sent on one of the most ghastly calls of his entire career. The Davirnik Gidi case. His stomach churned even as he thought of it.
In all places, in all cultures, there are aspects of human nature that only the police ever see, and even they see only rarely. Places they would just as soon not see at all. Dark, private sides of the human animal that are not crimes, are not illegal, are not, perhaps, even evil. But they open doors that sane people know should be closed, put on display aspects of humanity that no one would wish to see. Alvar had learned something from Davirnik Gidi. He had learned that madness is troubling, frightening, in direct proportion to the degree to which it shows what is possible, to the degree it shows what a seemingly sane person is capable of doing.
For if a person as well known, as much admired as Gidi, was capable of such—such deviations—then who else might be as well? If Gidi could drop down that deep into something that had no name, then who else might fall? Might not he, Alvar Kresh, fall as well? Might he not already be falling, as sure as Gidi that all he was doing was right and sensible?
Davirnik Gidi. Burning hells, that had been bad. So bad that he had blocked it almost completely out of his memory, though the nightmares still came now and then. Now he forced himself to think about it.
Davirnik Gidi was what the Sheriff’s Department primly called an Inert Death, and every deputy knew Inerts were usually bad, but it was universally agreed that Gidi had been the worst. Period. If there was ever a case that warned of something deeply, seriously, wrong, it was Gidi.
The Inerts were something Spacers did not like to talk about. They did not wish to admit such people existed, at least in part because something that is appalling only becomes more so when it is also dreadfully familiar. Nearly every Spacer could look at an Inert and worry if the sight was something out of a distorted mirror, a twisted nightmare version built out of one’s self.
Inerts did nothing for themselves. Period. They organized their lives so that their robots could do everything for them. Anything they would have to do for themselves they left undone. They lay on their form-firming couches and let their robots bring their pleasures to them.
So with Gidi, and that was the frightening thing. Inerts were supposed to be hermits, hiding away from the world, lost in their own private, barricaded worlds, deliberately cutting themselves off from the outside world. But Gidi was a well-known figure in Inferno society, a famous art critic, famous for his monthly parties. They were brilliant affairs that always started at the dot of 2200 and ended on the stroke of 2500. These he attended only by video screen, his wide, fleshy face smiling down from the wall as he chatted with his guests. The camera never pulled away to reveal anything but his face.
So a young Deputy Kresh learned in the follow-up investigation after his death. He could not have found out firsthand: Sheriff’s deputies simply did not get into events as elegant as Gidi’s parties.
In Spacer society, a host not attending his own parties was not especially unusual, and so Gidi’s absence was not remarkable. A very private man, people said of Gidi, and that explained and excused all. Spacers had great respect for privacy.
The only thing that was thought odd was that Gidi never used a holographic projector to place a three-dimensional image of himself in the midst of his parties. Gidi explained that holographs made for parlor tricks, and would create an illusion he did not wish to advance—that he himself was truly present. Illusions disconcerted people. They would try to shake the projection’s hand, or pass it a drink, or offer it a seat it did not need. No host wished to upset his guests. It was just that he was in essence a shy man, a retiring man, a private man. He was content to stay at home, to enjoy talking with his friends over the screen, to watch them as they had their fun.
It even started to become fashionable. Other people started making screen appearances at social events. But that fad stopped cold the day Chestrie, Gidi’s chief household robot, called the Sheriff’s office.
Kresh and another junior deputy took the call and flew direct to Gidi’s house, a large and grim-faced house on the outskirts of the city, its exterior grounds strangely unkempt and untended. Vines and brambles had grown clear over the walk, and over the front door. Clearly no one had gone in or out of the door in years. Gidi never sent his robots outside to tend the yard—and never went out himself, it seemed.
The door sensors still worked, though. As soon as the two deputies came close, the door slid open, the mechanism straining a bit against the clinging vines. Chestrie, the chief robot, was there to meet them, clearly agitated. A puff of dust blew out the door, and with it, the smell.
Flaming devils, that smell. The stench of rot, of decayed food, human waste, old sweat and urine hit the deputies hard as a fist, but all of that was as nothing to what lingered beneath—the sweet, putrid, fetid reek of rotting flesh. Even now, thirty years after, the mere memory of that roiling stench was enough to make Kresh feel queasy. At the time it had been bad enough that Kresh’s partner passed out in the doorway. Chestrie caught him and carried him outside. Even out in the air, the stink seemed to pour out of the house, all but overwhelming. It took Kresh’s partner a minute to recover, and then they went back to the patrol car. They pulled out the riot packs and got the gas masks.
Then they went in.
Later, the experts told Kresh that Gidi was a textbook example of the Inertia syndrome. Victims of the syndrome started out normally enough, by Spacer standards. Perhaps a bit on the reclusive side, a bit careful, a bit overdetermined to control their own environment. There was some debate over the triggering mechanism. Some said it was the sheer force of habit, driving the victim’s behavior into more and more rigid channels, until all activity was reduced to ritual. Gidi’s cup of tea at bedtime had to be made precisely the same way every night, or risk throwing the pattern off. Even his monthly parties were ritualized, starting and ending with the precision of a space launch.
But patternizing was only part of it. Self-enforced seclusion was the other half of the Inertia syndrome, and according to some, the real trigger for it. Some unpleasant disturbance would upset the victim, throw off the ritual, and the victim would decide never to let any such thing happen again. The victim would gradually cut off all ties with the outside world, order his or her robots to refuse all visitors, arrange for all essentials to be delivered—typically, as in Gidi’s case, by the less obtrusive underground tunnels rather than by surface entrance. As with Gidi, the victim would often literally seal himself off from the outside world, ordering his robots not to open the door to anyone, ever, period.
The deputies learned a lot from Chestrie and the other robots, and from the copious journals Gidi kept, chronicling his search for what he called “a comfortable life.”
The journals seemed to reveal the moment when the downhill slide began. He attended a party that did not go well, one that ended with an inebriated fellow guest attacking Gidi over some imagined insult.
The violence stunned him, shocked him. Gidi stopped attending parties, and soon stopped leaving home altogether.
He could stay where he was, in perfect comfort. With his comm panels and entertainment systems all about him, why would he want to move? With his robots eager and willing to do anything and everything for him, it began to seem foolish, almost criminal, to act for himself when the robots could always do things better and faster, do them with no upset to his routine, his pattern. He could lose himself in his art catalogs, in dictating his articles, in endless fussy arrangements for his monthly parties. In his journals, he described himself as “a happy man in a perfect world.”
At least, all was nearly perfect. The more peace and quiet he had, the more the remaining disturbances irritated him.
Any needless action, by Gidi or by his robots, became unthinkably unpleasant. He began to obsess on simplification as much as regularity, determined to strip away to the essentials, and then strip away whatever he could of the remainder. He set out on a quest to remove all the things that could disturb his quiet, his peace, his solitude, his comfort of being secure in his own place. Banish them, eliminate them, and he could achieve a perfect existence.
Things started to close in as his obsession gathered strength. Gidi realized he need never leave his comm room, or even get out of his favorite recliner chair. He ordered his robots to bring him his food in the chair, wash him in the chair. And then came the moment when, beyond question, even by the standards of the most hermetic Spacer, the scales tipped into madness. Gidi ordered his robots to contact a medical supply service and procure the needed equipment. He replaced his chair with a hospital-type bed with a floater-field, the sort used for burn victims and long-term patients. It would eliminate the danger of bedsores, and it had built-in waste-removal lines, thus removing his last reason for getting up. If the system was not entirely perfect, and there were occasional minor leakages, the robots could take care of it.
But even perfect indolence was not enough. There was too much activity around him. He soon grew weary of the robots fussing about him, and ordered them to find ways to reduce their level of activity, cutting back on housecleaning, and then finally ceasing it altogether. He ordered them to stop care of the exterior grounds as well, claiming that the mere thought of them scurrying about out there, pruning and cutting and digging, was most upsetting to his calm.