Caliban

by Roger MacBride Allen


  “They were never there in the first place,” Terach said. “There are no Laws inherent in the structure of the gravitonic brain. That’s the whole idea. The positronic brain became a dead end precisely because the Three Laws were so tightly woven into it. Because of the inherent nature of the Laws inside the positronic brain, it was almost impossible to consider one element of the brain by itself.

  “The Laws interconnected all the aspects of the brain so thoroughly that any attempt to modify one part of a positronic brain would affect every other part of it in complex and chaotic ways. Imagine that rearranging the furniture in your living room could cause the roof to catch fire, or the paint on the basement walls to change color, and that putting out the fire or repainting could cause the doors to fall off and the furniture to reset to its original configuration. The interior architecture of the positronic brain is just about that interconnected. In any sort of deep-core programming or redesign, anything beyond the most trivial sort of potential adjustment was hopelessly complex. By leaving the gravitonic brain with a clean structure, by deliberately not making the Three Laws integral to every pathway and neural net, it became far easier to program a new pattern onto a blank brain.”

  Jomaine looked up and saw the anger and disgust on Alvar Kresh’s face. Clearly the very idea of tampering with the Three Laws was the depths of perversion so far as he was concerned. “All right,” the Sheriff said, trying to keep his voice even. “But if there are no Laws built into the gravitonic brains, how do these damned New Laws get in there? Do you write them down on a piece of paper and hope that the robot thinks to read them over before going out to attack a few people?”

  “No.” Jomaine swallowed hard. “No, no, sir. There is nothing casual or superficial about the way a Law set--either Law set--is embedded into a gravitonic brain. The difference is that the Law set is embedded centrally, at key choke points of the brain’s topology, if you will. It is embedded not just once, but many times, with elaborate redundancy, at each of several hundred sites. The topology is rather complex, but suffice it to say that no cognitive or action-inductive processing can go on in a gravitonic brain without passing through a half dozen of these Law-support localities. The difference is that in a modern positronic brain, the Laws are written millions, even billions, of times, across the pseudocortex, just as there are billions of copies of your DNA written, one copy in each cell of your brain. The difference is that your brain can function fairly well even if a large number of cells are damaged, and your body will not break down if the DNA in a few cells fails to copy properly.

  “In a positronic brain, the concept of redundancy is taken to an extreme. All of the copies must agree at all times, and the diagnostic systems run checks constantly. If a few, or even one, of the billions of redundant copies of the embedded Three Laws do not produce identical results compared to the majority state, that can force a partial, perhaps even a complete, shutdown.” Jomaine could see in Kresh’s face that he was losing him.

  “Forgive me,” Jomaine said. “I did not mean to lecture at you. But it is the existence of these billions of copies of the Laws that is so crippling to positronic brain development. An experimental brain cannot really be experimental, because the moment it shifts into a nonstandard processing state, five billion microcopies of the Three Laws jump in to force it back into an approved mode.”

  “I see the difficulty,” Donald said. “I must confess that I find the concept of a robot with your modified Three Laws rather distressing. But even so, I can see why your gravitonic brains do not have this inflexibility problem, because the Laws are not so widely distributed. But isn’t it riskier to run with fewer backups and copies?”

  “Yes, it is. But the degree of risk involved is microscopic. Statistically speaking, your brain, Donald, is not likely to have a major Three Laws programming failure for a quadrillion years. A gravitonic brain with only a few hundred levels of redundancy is likely to have a Law-level programming failure sooner than that. Probably it can’t go more than a billion or two years between failures.

  “Of course, either brain type will wear out in a few hundred years, or perhaps a few thousand at the outside, with special maintenance. Yes, the positronic brain is millions of times less likely to fail. But even if the chance of being sucked into a black hole is millions of times lower than the chance of being struck by a meteor, both are so unlikely that they might as well be impossible for all the difference it makes in our everyday lives. There is no increase in the practical danger with a gravitonic brain.”

  “That is a comforting argument, Dr. Terach, but I cannot agree that the danger levels can be treated as equivalent. If you were to view the question in terms of a probability ballistics analysis--”

  “All right, Donald,” Kresh interrupted. “We can take it as read that nothing could be as safe as a positronic brain robot. But let’s forget about theory here, Terach. You’ve told me how the New Laws or Three Laws can be embedded into a gravitonic brain. What about Caliban? What about your splendid No Law rogue robot? Did you just leave the embedding step out of the manufacturing process on his brain?”

  “No, no. Nothing that simple. There are matrices of paths meant to contain the Laws, which stand astride all the volitional areas of the gravitonic brain. In effect, they make the connection between the brain’s subtopologic structures. If those matrices are left blank, the connections aren’t complete and the robot would be incapable of action. We couldn’t leave the matrices blank. Besides, there would be no point to it. Caliban was--was--an experiment. Never meant to leave the lab. Fredda was going to install a perimeter restriction device on him the night it, ah, happened. But he was powered up prematurely, before the restricter was installed.”

  “What, Doctor, was the nature of the experiment?” Donald asked.

  “To find out what laws a robot would choose for itself. Fredda believed--we believed--that a robot given no other Law-level instruction than to seek after a correct system of living would end up reinventing her New Laws. Instead of laws, she--we--embedded Caliban’s matrices with the desire, the need, for such laws. We gave him a very detailed, but carefully edited, on-board datastore that would serve as a source of information and experience to help him in guiding his actions. He was to be run through a series of laboratory situations and simulations that would force him to make choices. The results of those choices would gradually embed themselves in the Law matrices, and thus write themselves in as the product of his own action.”

  “Were you not at all concerned at the prospect of having a lawless robot in the labs?” Donald asked.

  Jomaine nodded, conceding the point. “We knew there was a certain degree of risk to what we were doing. We were very careful about designing the matrices, about the whole process. We even built a prototype before Caliban, a sessile testbed unit, and gave it to Gubber to test in a double-blind setup.”

  “Double-blind?” Kresh asked.

  “Gubber did not know about the Caliban project. No one did, besides Fredda and myself. All Gubber knew was that we wanted him to display a series of situation simulations--essentially holographic versions of the same situations we wanted Caliban to confront--to the sessile free-matrix testbed unit, alongside a normally programmed Three Law sessile testbed. We would have preferred using a New Law robot, of course, because those were the Laws we wanted Caliban to come up with on his own. Unfortunately we hadn’t received any sort of approval for lab tests of New Law robots at that point, so that was no go.

  “But the main test was to see if an un-Lawed brain could absorb and lock down a Law set. Gubber did not know which was which, or even that the two were supposed to be different. Afterwards he performed a standard battery of tests on the two units and found that the results were essentially identical. The sessile No Law robot had absorbed and integrated the Three Laws, just as predicted.”

  “What happened to the testbed units?” Donald asked.

  “The No Law, free-matrix unit was destroyed when the test was over. I suppose the Three Law unit was converted into a full robot and put to use somehow.”

  “What goes into converting a sessile unit?”

  “Oh, that is quite simple. A sessile is basically a fully assembled robot, except that the legs are left off the torso while it is hooked to the test stand and the monitor instruments are installed. Basically just plug the legs in and off it goes.

  “At any rate, Fredda intended Caliban as a final grand demonstration that a rational robot would select her Laws as a guide for life.”

  “Wait a moment,” Kresh said, rather sharply. “You’re telling me this is what was supposed to happen. What is happening? What is Caliban doing out there?”

  Jomaine shrugged. “Who knows? In theory, he should be doing exactly what I’ve just described--using his experience to codify his own laws for living.”

  Kresh reached out his hands and placed them flat on the table, tapping his right index finger on its surface. He did not speak for half a minute, but when he did, all the masks were off. The calm, the courtesy, were gone, and only the anger remained in his steel-cold voice.

  “In other words, this robot that assaulted and nearly killed its creator in its first moment of awakening, this robot that threw a man across a warehouse and committed arson and refused to follow orders and fled from repeated police searches--this robot is out there trying to find good rules for living? Flaming devils, what, exactly, are the laws he has formulated so far? ‘A robot shall savagely attack people, and will not, through inaction, prevent a person from being attacked?’”

  Jomaine Terach closed his eyes and folded his hands in his lap. Let it be over. Let me wake up and know this is all a nightmare. “I do not know, Sheriff. I do not know what happened. I do not know what went wrong.”

  “Do you know who attacked Fredda Leving?”

  “No, sir. No, I do not. But I cannot believe it was Caliban.”

  “And why is that? Every scrap of evidence points to him.”

  “Because I wrote his basal programming. He was not--is not--just a blank slate. He has no built-in Laws. Neither do you and I. But his innate personality is far more grounded in reason, in purpose, than any human’s could be. You or I would be far more likely than he to lash out blindly in a random attack. And if I had made a mistake big enough to cause Caliban to attack Fredda like that, that mistake would have cascaded into every other part of his behavioral operant system. He would have seized up for good before he reached the door to the lab.”

  “Then who was it?”

  “You have the access recorder records. Look there. It is one of us on that list. That’s all I can tell you for certain.”

  “Access recorder?”

  Jomaine looked up in surprise. They hadn’t known about the recorder! Of course. Why should they even think about such things? With the endless wealth of Spacer society, and the omnipresent robots to serve as watchkeepers, theft was almost unknown, and security systems even rarer. If he had not assumed they knew and let it slip, they never would have known. If he had kept his mouth shut about it, they would have had no way of knowing he had been at the lab that night, just about the time of the attack...

  But it was too late to hold back. Now they would know what to ask about. There was nothing for it but to charge on. They would get the access records, and that would be that. “It’s a Settler security device,” he said. “Tonya Welton insisted that Fredda install it because Leving Labs had access to Limbo Project material. It records the date, time, and identity of every person who passes in or out of the lab. It works on a face-recognition system. Humans only. It was programmed to ignore robots. Too many of them.”

  Kresh turned toward Donald 111, but the robot spoke before the Sheriff had a chance. “I have already dispatched a technical team to the labs, sir. We should have the data from the access recorder within half an hour.”

  “Very good. Now, why don’t you save us some time and effort, and tell us yourself whatever that recorder will tell us about your movements.”

  Jomaine was rattled. He had made a major mistake telling them about the recorder. But damnation! Now that they knew that much, there was no point in hiding anything else. “There is very little to tell. I had left a notepack in my lab. I noticed it was missing when I sat down to get some work done at home. I live quite near the lab, and I walked over to collect it. I entered through the main door. I think I called out to see if anyone was around, and there was no answer. I went to my lab, got the notepack, and then left my lab by one of its side doors. That’s all.”

  “That’s your story.”

  “Yes, it is.”

  “Why didn’t you send a robot to get the notepack?” Kresh said. “Seems to me like an errand suited to a robot.”

  “I suppose I could have sent Bertran, but that would have been more trouble than it was worth. I couldn’t quite recall which notepack the data I wanted was in, or where I had left it. Sometimes I can’t even recall which pack I need. I have to put my eyes on it to be sure. My lab is often a bit of a jumble, and there are notepacks all over the place. I find that if I just stand and look at a room for a minute, I remember where the thing I’m looking for is. A robot can’t do that for me.”

  Jomaine had the uncomfortable sense that he was babbling, going on and on, but there seemed to be no way out but forward with more of the same. “Bertran would have brought me a half dozen notepacks to be sure I had the right one, which seemed a bit silly. I knew that I would be able to find the notepack myself the moment I stepped into the lab. And sure enough, I did.”

  “That seems like a rather overexplained set of reasons for why it was easier to do it yourself.”

  Jomaine glared at Kresh. “Yes, I suppose it does. But bear in mind that all of us down at Leving Labs have been hearing Fredda’s theories about excessive dependence on robots for some time now. We’ve all developed a bit of a fetish about doing things for ourselves.”

  Kresh grunted. “I know how that can be,” he said. “All right. You’ve filled in quite a few blanks for us, Terach. You’re free to go--for now. But if I were you, I’d work on the assumption that you and I are going to have other little chats in the future, about other questions that will come up. And the better your memory is when that happens, the better you and I will both like it. Do I make myself clear?”

  Jomaine Terach looked Sheriff Alvar Kresh straight in the eye and nodded. “Oh, yes,” he said. “There is nothing in the world clearer to me than that.”

  JOMAINE Terach stumbled out of Government Tower into the thin light of morning. He felt a pang of guilt for betraying Fredda’s confidence, but little more than that. What good were petty little secrets when a whole world was turning upside down in panic? The debts he owed to the good of society, and to himself, far outweighed his obligation to Fredda. Besides, you could not know. There might be some key to it all buried deep, hidden in his words where he could not even see it. Maybe Kresh could find that key and turn it in the lock. Maybe, just maybe, by talking, he had saved them all.

  Jomaine snorted in disgust. High and mighty talk for a man who had spilled his guts. There was another explanation, one that did not come out quite so noble.

  Maybe, just maybe, he was a coward at heart.

  He hailed an aircab and headed toward home.

  “THE access recorder data, sir,” Donald said, handing him a notepack.

  “Thank you, Donald,” Kresh said. He skimmed over the data once or twice, then studied it in greater detail. Damnation! Why hadn’t he had this data days before? It provided him something he had not had until this moment--a nice, tidy list of suspects. Suspect humans, at least. Terach had said the thing did not record the comings and goings of robots.

  “Sir, was it wise to let Jomaine Terach go free?” Donald asked. “I do not think we can consider his interrogation to be complete, and he did confess to several violations of the robot manufacture statutes.”

  “Hmmmm?” Kresh said absently. “Oh, Terach. It’s a bit of a gamble, but if we want this case to get anywhere, I think we had to set him free--at least for now. And the same for Anshaw when we’re done with him. Neither of them has much of anyplace to go. I don’t regard them as flight risks. But I’m counting on at least one of them panicking. If one or both of them does, it is damned likely they will make some sort of mistake--and their mistakes could make our jobs a lot easier. Now go and bring Anshaw in.”

  “Yes, sir.” Donald went through the door, down to the holding cells.

  Alvar Kresh stood up and paced the interrogation room. He was eager, anxious. Things had shifted suddenly. He could not explain how, or why, exactly, but nonetheless they had. The access recorder data was part of it, but not all of it. All it did was suggest certain things. It would be up to Kresh to prove them. He sensed that he was suddenly on the verge of answers, knocking on the door of a solution to this whole nightmare fiasco. All he had to do was press, push, bear down, and it would come.

  Gubber Anshaw. Kresh dropped the notepack onto the table and thought about Anshaw. The interrogation that had been put off, delayed, pushed back, forgotten, lost in the chaotic shuffle of events again and again. And now, with the access recorder data in his hand, with the fact of Ariel’s presence at Anshaw’s home last night, it was suddenly clear that this was the interrogation that could break this case wide open. This was the man who knew things.

  Alvar Kresh paced twice more up and down the room, but then forced himself to sit down and wait.

  The door opened, and Donald ushered in Gubber Anshaw.

  Alvar Kresh waited for Anshaw to sit down in the chair on the opposite side of the table. Then he set his hands palm-down on the table, leaned forward, and looked the robotics designer in the eye.

 
