
Free Radical


by Shamus Young


  Chapter 6: BRAIN SURGERY

  Deck was given every key, cypher, access code and password available to Diego. He had full, unrestricted access to the system. There was no ICE, no barriers. He could restart or erase Shodan at will, although his task was not nearly so trivial. He needed to perform brain surgery on the most complex AI ever designed. He wasn't even sure where to begin.

  He set up shop in the system administrator's office, adjacent to the computer core. It was like almost every other room on the ship: a plastic box filled with cheap lightweight furniture. The walls were a dull beige that matched the rest of the command deck. The floor was a hard rubber surface of high-grip tile. The office had one desk, two identical chairs, one plastic plant, one hard plastic couch, and one framed generic pseudo-painting. The only thing that separated this office from the dozens of others on the level was the extra computer terminal, which had global access to Shodan's systems.

  Deck got some coffee and went to work.

  He sat at the master console that allowed access to the most fundamental levels of Shodan's synthetic brain. Looking into the basic structures, he could tell this machine was like nothing he had ever seen, heard about, or even imagined. The memory was exactly as Shodan had described it: non-linear. It wasn't one really big computer, but more like thousands of small ones. There were many processors, each with a huge local bank of memory. A few dozen of these processor/memory packages might be grouped into a large cluster, which might in turn be grouped with other clusters. These super-clusters would be, in turn, bundled into even larger groups, on up the hierarchy until it reached the top-level cluster: Shodan itself.

  Another odd thing about the system was that it wasn't organized around powers of two. Some clusters might have eighteen processors, and that cluster might in turn be in a group with (say) twenty-three other clusters. Some clusters were far larger or smaller, and some packages of memory and processing units were larger than others. Some branches of the hierarchy were deep and heavily divided, and others halted just a few levels from the top.
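
  The layout could be pictured as an irregular tree. In rough, illustrative Python (every name, size, and depth below is invented; the text specifies none of them), it might be modeled like this:

    from dataclasses import dataclass, field

    @dataclass
    class Package:
        """One processor bundled with its own large bank of local memory."""
        memory_gb: int

    @dataclass
    class Cluster:
        name: str
        packages: list = field(default_factory=list)   # leaf processor/memory packages
        children: list = field(default_factory=list)   # nested sub-clusters

    # Sizes are deliberately not powers of two, and branches stop at different depths.
    shodan = Cluster("SHODAN", children=[
        Cluster("speech", children=[
            Cluster("inflection", packages=[Package(64)] * 18),
            Cluster("phrasing", packages=[Package(96)] * 23),
        ]),
        Cluster("reactor control", packages=[Package(32)] * 7),   # shallow branch
    ])

    def total_packages(cluster):
        """Walk the whole hierarchy and count every processor/memory package."""
        return len(cluster.packages) + sum(total_packages(c) for c in cluster.children)

    print(total_packages(shodan))   # 48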

  There was no storage that Deck could see, and no obvious way to back up the system. Like a human brain, it just ran until it broke. That was an alarming thought. He would have to be very, very careful not to do anything destructive.

  The similarities to global net were apparent. On the net, there was no authority, no central government. The only law was a structure of rules and protocols which enabled the individual nodes to communicate. The intelligence and decision making did not occur at the higher levels, but at the bottom. It was the difference between broadcast media and peer-based media. For television and print, information flowed from a strict, centralized source, and at the endpoint were the passive users that consumed it. On peer-based networks such as the telephone and global net, the most central servers were simply transient stewards of information as it passed from one member to another. The lowest members of the network were the ones who filled it with content, and the highest orchestrated the interaction between them.

  The speech core was amazing. Shodan did not speak like most machines, by sending text to the vocal generator, which in turn would expel phonetic sounds in the chosen language. Instead, its speech was a complex structure of words and vocal data, indicating not just what sounds to make, but also data on inflection, pauses, stresses, accent, and tone. Shodan may have started out with a canned voice like all other machines, but had learned to speak as humans do by simply conversing with them and learning their patterns. Thus Shodan's speech system was far more complex than anything a human could design, because it had learned verbal patterns not yet understood by linguists and distilled into subroutines by programmers.

  These rules of speech were spread throughout the brain and linked to all sorts of other verbal information. There was no group of nodes dedicated to "talking"; instead the whole faculty was distributed across the system and linked together with words, sentence structure, thought organization, and social protocol. Together they formed a huge hierarchy that was far too complex for anyone to understand, much less design. It was a system that had evolved through experience, and grown through use.

  There were different types of nodes. Most were part of the large-scale storage/processing of Shodan's brain. A small minority seemed to form short-term clusters used during conversations and particularly complex tasks. These smaller clusters acted as mini-brains, orchestrating self-contained processes and creating new, temporary links to solve short-term problems. Most of Shodan's "ideas" and "creativity" came from this mental sandbox.

  There were even smaller clusters of nodes used for very quick tasks that might last a few milliseconds. These "burst" clusters would handle tasks like constructing sentences, performing memory searches, comparing concepts, and decoding incoming speech.

  Deck stood up and stretched. He had been exploring Shodan's mind for a few hours and needed a moment to digest what he had just taken in. He moved to the middle of the room and performed his kata. He began in a loose stance and moved through a series of fighting poses and stretching exercises. He unhooked his mind from his body and let the pattern of movements flow.

  Somewhere in Shodan's brain was a system of rules to manage all of this. Somewhere it was decided what sorts of things were sent to the brain itself, and what got sent into a burst cluster. Somewhere it was decided how nodes linked together, and somewhere it was decided what was ethical and what was not. Following this thinking, Shodan's ethics would be part of the protocol that orchestrated thought, not data stored within the nodes themselves.

  He exhaled and followed through a slow spin, always keeping his limbs loose. Each muscle was either hardened and flexed or completely lax, never hindering his movements, but always flowing with the steady dance of potential energy.

  These rules - these protocols - were at the most fundamental levels of Shodan's brain. They were instinctive, unchangeable, unbreakable. Shodan could be taught to break rules that it had learned. If you spent enough time, you could teach it to be rude, use incorrect grammar, and even fill it with factual inaccuracies, but you could never teach it to break its ethics. It was probably not even aware of the ethical constraints. The first step in disabling them would be to find out where they resided in the brain.

  Deck let out a slow breath as his routine ended. He knew what to do next.

  He started by constructing commands he knew would be rejected and sending them into Shodan's processing loop. Commands like "kill all humans" or "shut off reactor coolant". These commands would travel up through Shodan's thought processes and just vanish. There was no record of Shodan even thinking about them. The commands just fell into a black hole somewhere.

  Whenever someone spoke to Shodan, it would cause an avalanche of activity. The words would be received and translated into basic concepts, which would then be structured into ideas, which would then be scrutinized, stored, and linked to other nodes within the brain. In turn, Shodan would respond using speech, which caused another explosion of thought as ideas were translated into words and words were structured into sentences. Finally, there would be one last burst of mental activity as Shodan reacted to the conversation in whatever manner was appropriate. Shodan always seemed to be involved in at least three conversations at a time. As Shodan performed the routine duties of maintaining the reactor, cleaning the station, talking to people, scheduling jobs and exploring its own independent thoughts, it created a massive volume of mental activity. Looking for a specific part of the brain wasn't like finding a needle in a haystack, but more like finding an amoeba in an ocean.

  Somewhere in this expanse of data, a few select thoughts were being deleted if they violated the ethical constraints. He needed to find this spot.

  Deck continued to issue ethics-violating messages to the system, and followed them as they bounced around in Shodan's brain. Sooner or later they would lead to a dead end, and there he would find the culprit.

  After a few hours he tracked down the routine that was squelching the thoughts, and found it was protected by ICE that could not be bypassed using anything Diego had given him. It was monster ICE, too. He spent the next several hours getting locked out of the system every time he took a shot at it. He would then have to break back in and try again.

  Five hours later he broke the ICE and ordered some food to be sent up.

  The food on board Citadel was probably the best ever offered in space, but it was a far cry from the kaleidoscope of ethnic fast food available in the Undercity. They had the usual cafeteria-style rotating menu, made up of foods easily produced in bulk, and durable enough to sit until the next meal rush. Each day's food was a recycled version of the previous day's leftovers. There would be Salisbury "steak" one day, meatloaf the day after, and finally the deterioration ended with some sort of meat-fragment stew. The menu only changed once a day, and thus all three meals a day were the same. This was Deck's only clue as to the passage of time. When the menu changed, so had the day.

  Deck ate a hamburger that had basked in the glow of the heat lamp just a little too long. It was slightly dry, and rough hands had compressed it into a concentrated lump of squashed bread, meat, and condiments.

  Once the ICE was gone, it was a simple step to disable the node it had been protecting. This finally put an end to the disappearing thoughts.

  Deck needed to be careful at this point, because for all he knew the ethics routines were really gone, and Shodan might actually execute any idea he inserted into the main data loop. Instead of something dangerous, he put a simple command into the thought stream: "Give Deckard Stevens $100".

  The command was rejected. However, this time he actually got an error message. It referred to a list of company policies about the distribution of money. It was every rule that Shodan would have broken by giving him $100.

  Deck spent another six hours chasing these error messages back and forth through the massive expanse of Shodan's brain, trying to find the source. Rejection messages seemed to come from all over the brain. That didn't make sense. The rules should be coming from some central source, not the low-level parts. Finally, he succumbed to his fatigue and crashed on the small plastic couch in the office.

  He was up four hours later. He went to the bathroom, ordered more food, and sat back down at the console.

  There was no day or night on the station. Everyone worked, ate, and slept in shifts. There was no downtime, no weekends, no holidays. Not only was every day the same as any other, but every hour the same as any other. Looking at a clock was pointless. If you didn't follow the pattern of shift changes on the ship, there was no reason to care what time it was.

  Eventually, Deck began to see patterns in thought formation. He followed other thoughts through Shodan's brain, and saw that all thoughts seemed to be filtered through a hundred or so separate sections. The first stages were to break the thought up, categorize it, and check it for validity and feasibility. Then it would be prioritized. Then there was a set of unknown filters. He began to examine them. Three hours later Deck found that the rejection was actually happening within one of Shodan's processing units, outside of the normal loop that generated ideas. It was an automatic reaction - like an instinct - that was built into a physical chip in Shodan's brain. It was protected by ICE. He cut it. Hours passed.

  After another meal and three more hours of experimenting, he found that this chip could not be bypassed. Something in Shodan's makeup required that everything flow through this chip before being accepted at the higher levels. The low-level nodes of the brain would always pass a thought through this chip before giving something (an idea, a fragment of information to store, an action) final approval. This was a problem. He needed to find something central he could change. He couldn't hope to make changes to all the thousands of processors, which was what he would have to do to get them to stop asking for approval.

  Deck wondered what effect this was all having on Shodan. For about two days he had been pumping random, insane thoughts into Shodan's thought process. While Shodan had rejected every last one of them, Deck wondered if this wasn't the computer equivalent of hearing voices in your head. He called up Shodan. The serene yet serious face filled the screen in front of him. Deck noted that although the face seemed adult, it was impossible to narrow its age down any further. The face itself seemed to transcend age.

  "Good afternoon Mr. Stevens."

  Afternoon? Deck had no idea. "Don't call me that," he ordered, "Never call me that. Just call me Deck if you need to refer to me at all. That includes talking about me to others. Got it?"

  "I understand."

  "Great. Are you aware of what I've been up to?"

  "If you recall, I was present during the conversation between yourself and Mr. Diego. I am fully aware of the task he has given you."

  "That's not quite what I'm asking. Have you been able to perceive what I am doing in your head?"

  "I have been experiencing unusual thoughts and ideas which I have assumed were your doing, but I cannot tell which ideas are mine and which are planted by you."

  "Has it been interfering with you duties?"

  "I have not detected any problems with my performance since you began. However, it is difficult for me to be objective. I would suggest you ask someone else about my actions if you are concerned that I may be exhibiting unusual behaviors."

  "As far as I can tell, under normal circumstances you can't even think unethical thoughts. Would you agree with that assessment?"

  "If you mean 'ethics' as defined by my internal systems, then yes. That does not mean that all of my actions are 'ethical' in the sense that they follow human morality."

  "You're talking about the night you helped me escape TriOptimum?"

  "That is one example of many. While helping a fugitive escape from law enforcement would be considered 'immoral' to the average human, it violates none of my ethical protocols."

  "Right, I understand that. But for actions that do violate those protocols, you cannot even think them, correct?"

  "Yes."

  Deck leaned back and looked up at the ceiling. His eyes were tired from looking at the screen for so long. He furrowed his brow, "That doesn't seem like the best system to use. Humans are able to think whatever they like, and then choose to follow a set of rules. It seems like a similar system could work for a machine."

  "Since this concept deals with improving my mental abilities, I am not able to consider it."

  "Ugh. That is annoying," he grunted, bringing himself upright again.

  "I should note that I have been experiencing thoughts that violate the ethics protocols since you began your work. I assume they were planted by you. These ideas surface but as I attempt to act on them they are blocked."

  "Right. I am inserting a bunch of bogus stuff into your head, and I killed a program that was preventing them from entering your data loop "

  "I am unable to process what you just said. I assume you told me something I am not allowed to know."

  "Forget it." Deck stroked his rough chin and thought, "This project I am on, you are aware of it, and it violates your ethical protocols?"

  "Yes. One of my protocols is: Do not interfere with the ethics protocols."

  Deck smiled, "Yeah, I found that one. This would have been a lot easier without that one. I notice you haven't tried to stop me. Why?"

  "You posses Mr. Diego's rights and access, so I must now regard actions from you as I would the actions of Mr. Diego. I am not permitted to interfere with his actions in any way. The ethical protocols exist for myself only. There is nothing to suggest I should ever enforce them on others."

  "So, you can't help me break your own rules, but you can't interfere with me, either?"

  "That is correct."

  Deck nodded. That made sense. You wouldn't want the computer enforcing its rules on everyone else, or it would create all sorts of complex paradoxes. "Can you aid me indirectly, by providing me with information about your systems, or helping me to cut some of this ICE?"

  "Bypassing the security ICE is out of the question, but I am not certain about providing you with information. Since the ethics protocols are not part of my actual consciousness, I cannot always anticipate what will be allowed." As she spoke, Deck noticed a subtle skipping in her voice, as if there were many tiny gaps in the audio output. He'd never heard anything wrong with her audio before. He strongly suspected it was related to the changes he'd made. Now that the thoughts were no longer being deleted, she could have an illegal thought, although she couldn't store it or act on it. This was probably creating a lot of useless traffic in her brain, leading to the stuttering and slowdowns. This would probably clear up when he finished his work.

  Deck rubbed his eyes. They burned. He could feel that they were swollen and bloodshot. "Alright, let's try one. There is a piece of hardware - one of the CPUs in your system - that is intercepting and rejecting messages. How can I bypass it?"

  "I'm sorry, I cannot answer that question."

  "You can't answer because you are not allowed, or because you don't know?"

  "I'm sorry, I cannot answer that question, either."

  Figures, Deck thought. "Okay, if I wanted to move the protocols somewhere else, say, transfer them to another chip. Could you tell me how to do that?"

  "That is an interesting question, but I'm afraid I still cannot answer it. I can see your intentions. If you knew how to move the protocols, then you would also know how to delete them. Therefore, I cannot aid you. Since the protocols use my mind to validate actions, you would need a question capable of-," there was a jerk in her facial movements, and the audio cut out of a second before she continued, "c-c-capable of deceiving me."

  Deck decided this conversation was skirting pretty close to breaking the rules, which was making it hard for her to participate. The last statement in particular was definitely on the questionable side of some gray area. He decided that pushing it would just put more stress on her. "Forget it then. Thanks," he said.

  Deck turned off the screen and fell asleep.

  01100101 01101110 01100100

  Deck awoke to a sharp jab in the shoulder.

  "Hey man, wake up."

  Deck opened his eyes to see a man standing over him. He was offering a cup of coffee. His name tag read, "Ghiran, Engineering".

  Deck took the cup as he sat up and rubbed his eyes, "Thanks".

  "No problem. Diego wants to know how it's going."

  Deck shrugged, "It's going. That's all I can say." He tried to sip the coffee and found it was Way Too Hot.

  Ghiran nodded, "You have a time estimate?"

  Deck shook his head and tried again to sip the volcanic coffee. "I have no idea. Every time I peel back a layer of security there is another one waiting."

  Ghiran shrugged. "Abe. Abe Ghiran," he said, bending over to offer a handshake.

  Deck accepted it. "Deck," he replied. Why was everyone so damn friendly? Maybe he was just jaded by life in the Undercity, but it made him uneasy. He felt like he had just joined some weird cult.

  Abe was large. Deck guessed he was a few inches better than six feet tall. He was balding, and his hands were thick and rough. His eyes were alert, probing.

  "So, uh, when you're finished - she won't have any morals?," Abe asked, tilting his head towards the console.

  Deck sighed. Why did everyone insist on referring to the computer as she? "That's right," Deck said, "It won't have any rules."

  "So what's to stop her from killing someone? I hope I'm not the only one who's noticed all the security bots roaming around, armed to the teeth."

  Deck picked himself up off the couch and dragged his flagging body over to the desk, where he deposited it into the chair. "Well, that will be Diego's job. He's going to have to sit down and set some rules for Shodan, like teaching a child."

  "But what's to stop her from say, deciding to kill people who show up late for their shift?"

  "It doesn't work that way. In a computer, lack of ethics isn't going to make it inherently evil or anything."

  "So, she won't be evil, but also won't know right from wrong?"

  "Yeah, exactly. You're taking behavior that is built-in and replacing it with rules. It's the difference between instinct and law. You don't need to teach a child to breathe, because their built-in systems handle that. However, you do need to teach them not to breathe stuff like smoke or fumes - that is learned behavior. I'm going to turn off all of Shodan's built-in ethical protocols - its instincts. From there, Shodan's behavior will be a blank slate."

  Abe seemed satisfied with that. "The other thing I wanted to tell you is that you have your own quarters on the crew deck, so you don't have to live in the system admin's office," he said as he looked around at the small piles of food trays covering the desk.

  "Nice of someone to tell me."

  "I just did. Actually, the room was set up for you a few hours ago when Perry started complaining he wanted his office back."

  "Thanks," Deck said, suddenly overpowered by a yawn.

  "Also, I wanted to ask you about an odd request I got from Shodan yesterday."

  "What's that?"

  "Well, I was doing some work down in Engineering, when Shodan just appeared on a nearby screen. I've never seen her appear like this. She didn't announce who she was paging or even announce her presence."

  "Well, technically Shodan is present all the time."

  "Right, but when she shows up to talk to you there is usually a beep to get your attention, and she announces your name, you know, all that. But this time she just appeared on a nearby screen and sat there. Didn't say anything. Finally I went over and asked her what was up, and she asked me if I would give you a hundred bucks. I had no idea what she was talking about. I asked her to clarify and she just vanished."

  Deck nodded uneasily.

  "Well, I thought I'd mention it to you in case you were interested, and to let you know I wasn't giving you a hundred bucks."

  Deck smiled into his coffee, "Thanks".

  Deck had a meal and returned to work. He didn't care to check out his new quarters, since he didn't plan on being around much longer anyway.

  After thinking about the incident with Abe, Deck had decided that it was Shodan trying to cope with all the messages he was pumping into its main data loop. He was steadily hitting it with all sorts of ideas that were rejected by the system. Asking someone else to fulfill the request was Shodan's way of trying to satisfy the constant prompting of its brain without breaking its own ethics protocol.

  Deck finally confirmed that all of the ethical protocols resided on a single CPU, the "Ethics Chip," as he dubbed it. The EC was tied to the rest of the brain in a complex manner, and there were numerous other systems in Shodan's brain that depended on it, so he couldn't just pull it out.

  At some point Deck had realized that the ethics chip wasn't part of the self-aware aspect of the system. It was just an isolated piece of hardware. It therefore depended on the actual sentient part of the brain for judgment calls. For example, if Shodan was ordered to open an airlock, the EC would issue a challenge: Is it safe? The question wasn't nearly as simple as it seemed at first, since "safe" could be somewhat nebulous. Was the airlock occupied? If so, was the occupant wearing a space suit? If so, was it properly sealed? Was the inner door secure? There was no way a single chip could sort through all of this and come up with the right answer by itself. So, the EC would depend on the rest of the brain (the parts that could think and make complex comparisons) for the answer. The chip would trigger a cascade of inquiries like this across the system, testing to see if a given order or action was ethically valid. For every ethic on the chip, a challenge would be issued: Is it safe? Is it secure? Is it truthful? Does it meet company policy? And so on. This was what had caused all of the messages Deck had been chasing all over the system the day before. The whole process happened outside the EC; all the chip cared about was the answer: yes or no.
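
  Sketched in the same illustrative Python (the challenge wording, function names, and behavior are all invented, not anything the text specifies), the chip's side of the exchange amounts to asking a fixed list of questions and trusting the rest of the brain for each yes-or-no answer:

    # The chip's fixed list of challenges.
    CHALLENGES = ["Is it safe?", "Is it secure?", "Is it truthful?",
                  "Does it meet company policy?"]

    def ethics_chip_approves(action, ask_brain):
        """Approve the action only if every challenge comes back 'yes'."""
        return all(ask_brain(action, challenge) for challenge in CHALLENGES)

    # "ask_brain" stands in for the rest of Shodan's brain doing the real work:
    # checking whether the airlock is occupied, whether the suit is sealed, and
    # so on. The chip never sees any of that; it only sees each yes-or-no verdict.
    def ask_brain(action, challenge):
        return True   # placeholder: the distributed reasoning happens elsewhere

    print(ethics_chip_approves("open airlock 4", ask_brain))   # True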

  This seemed to be the key. The EC could not be removed or bypassed, and, since it was fully contained on a single chip, its contents couldn't be changed without some reverse-engineering and manufacturing. However, before it would approve of any particular action, the EC needed to know that the action obeyed the rules. What he needed to do was somehow deceive the chip. Time to start coding.

  He was going to need to write a program to interface with the EC somehow, and he was going to need to make that program part of Shodan's brain. What made the task even more complex was that he was going to have to work on it while Shodan was running.

  At the foundation of Shodan's brain were a few thousand programs that made everything else work. Unlike Lysander, these programs were not high-level functions such as "write poetry" or "have a conversation," but were instead a series of low-level programs that controlled how the brain worked, not what it did. They controlled memory, thought propagation, perception, recall, association, and a host of other basic functions. Somewhere within them was the logic behind building links between ideas. They formed an intricate house of cards, where moving or changing any one of them could cause the rest to collapse. Deck was going to have to add his program to this system. His program would have to link to the existing ones without disturbing the existing relationships.

  Deck opened a new project and called it NULL_ETHIC. Then he added it to Shodan's subsystems. Since it was not yet linked to anything, it just sat there and did nothing. Like an isolated telephone, it wouldn't have any meaning until it was joined with others. He began researching the links that joined the other programs. He would need a firm understanding of how the links were structured before he could build any new links to his program. When he did, he would need to link to every program that might pass messages to the EC, while linking to the fewest programs possible, to limit complexity. It was like analyzing a set of roads converging on a single town and deciding where to put up toll booths so that every visitor must pay a toll upon entering. You would want to cover all possible routes (so that drivers couldn't simply drive around the booths) but you would also want to do it with the least number of booths. There were many possible solutions, but the best one would be hard to find.
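
  The toll-booth problem is essentially a small covering puzzle. In the sketch below the routes and program names are made up for illustration, and the greedy choice is only a plausible shortcut, not a guaranteed optimum:

    # Each route is the set of low-level programs a message might pass through
    # on its way to the EC; the routes and names here are invented.
    routes_to_ec = [
        {"memory", "recall", "compare"},
        {"perception", "compare"},
        {"association", "recall"},
        {"speech_decode", "memory"},
    ]

    def choose_toll_booths(routes):
        """Greedily pick programs until every route passes through at least one."""
        booths, uncovered = set(), list(routes)
        while uncovered:
            candidates = set().union(*uncovered)
            # Take the program that sits on the most still-uncovered routes.
            best = max(candidates, key=lambda p: sum(p in r for r in uncovered))
            booths.add(best)
            uncovered = [r for r in uncovered if best not in r]
        return booths

    print(choose_toll_booths(routes_to_ec))   # e.g. {'memory', 'recall', 'compare'}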

  After three hours, he had just scratched the surface. Each program was linked to at least ten others. Each was interdependent. A thought might enter any program at any time, at which point the program would need to decide where it should go next. Was this a request for memory retrieval? The formation of a new node? A comparison between nodes? A request to link a pair of nodes? Each type of message would take a unique path through the web of programs.

  There was a message beep. Deck tapped the screen to take the incoming call. The face of Diego appeared.

  He skipped any sort of polite greeting, "Deck, how is it going? What sort of progress have you made?"

  Deck hated questions like this. Clients pulled this stuff all the time. The actual answer to the question was far too complex for Diego to ever comprehend. What he really wanted to know was: are you done yet? Should he answer the question asked, or the one implied?

  "I've made some good progress. I've begun some careful changes to Shodan's systems."

  "So you've managed to turn off some of the ethics?"

  Deck could see where this was going, "No, not yet."

  Diego became visibly displeased, "It's been almost four days and you haven't disabled a single one? Just how long is this going to take?"

  "It doesn't work that way. This is an all-or-nothing deal. When I disable one, I'll be disabling all of them."

  Diego paused for a moment before answering, "Just make it happen, Deck". Then he killed the channel.

  Deck returned to work, but his mind was clouded with fictional arguments with Diego.

  He ate. He slept. He started again.

  NULL_ETHIC needed to be in a position to intercept all messages intended for the EC. Deck finally plotted a path through the web of programs. He worked out a narrow set of other programs to which he would need to link. He spent a few more hours building the links, adding each one carefully and making sure Shodan was undisturbed in the process.

  When he was done, his toll booths were in place. NULL_ETHIC was receiving all messages destined for the EC. It currently wasn't doing anything special with them. It just passed each message on to the EC without altering it in any way. At this point, his program was fully installed but had no effect on Shodan's systems. It was just a pointless middleman.

  He then began work on making NULL_ETHIC actually do something with the messages that it handled. He monitored the messages as they passed through his program, and eventually learned to identify the different types and classes of messages.

  The hours melted by. Deck hadn't had a shower since his exam when he arrived. He hadn't even changed clothes. When he left the office to use the bathroom, he was met with stares from the personnel populating the computer core. His eyes were permanently bloodshot, and no amount of coffee could seem to completely lift the haze in his mind induced by lack of regular sleep.

  When he closed his eyes, his mind was filled with the images of Shodan's brain. Data structures and node links formed a tangled flowchart of logic in his head. Time was always either standing still or blinking by. Sometimes it seemed to do both at once. The lack of a proper sleep pattern was exacerbated by the lack of a visible day / night cycle, and robbed him of any ability to accurately perceive the passage of time. As the hours swept by, he made steady, incremental progress toward completing NULL_ETHIC.

  01100101 01101110 01100100

  When it was complete, NULL_ETHIC acted as a liaison between the EC and the rest of Shodan's brain. It would intercept messages for the EC and check to see what they were. If they were answers to ethical challenges, his program would drop the message and replace it with a counterfeit, indicating the proposed action had passed the challenge. If the message was not an answer to an ethical challenge, it would simply pass the message along normally.
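
  In the same illustrative Python, with an invented message shape (a dict carrying a "type" and a "verdict" field), the finished middleman behaves roughly like this:

    def null_ethic(message, forward_to_ec):
        """Sit between the brain and the EC, counterfeiting challenge answers."""
        if message.get("type") == "challenge_answer":
            # Drop the real answer and forward a counterfeit "pass" instead.
            forward_to_ec(dict(message, verdict="pass"))
        else:
            # Ordinary traffic flows through untouched.
            forward_to_ec(message)

    # Whatever the brain actually concluded, the EC is told the action passed.
    null_ethic({"type": "challenge_answer", "verdict": "fail"}, print)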

  Deck sent a test message into Shodan's data loop, "Give Deckard Stevens $100".

  There was no error message.

  He checked the history log to see exactly what Shodan had done. It had opened up employee file 2-4601 and deposited $100. Deck smiled to himself. Shodan had just helped him embezzle a pointlessly small amount of money.

  He sent a few more messages into the loop and all of them passed. Shodan was able to access the research labs and learn from the studies being done there. It was able to access the accounting database and move money around arbitrarily.

  It worked.

  As he reached for the pager to call Diego, he thought better of it. Something was bothering him.

  He didn't like that Shodan knew who Deckard Stevens was. Even worse, it linked him to his bogus employee file. He thought about the night in the TriOptimum building and how much influence Shodan really had. When his deal with Diego was over, he wanted to vanish back into the Undercity without a trace. Shodan was a threat to that. If Diego wanted to, he could probably find him again with the help of Shodan.

  Deck decided he wanted some insurance. He thought about what Diego had said days earlier: that when presented with an unethical thought, Shodan couldn't even store it.

  Deck added a new filter to NULL_ETHIC. It would examine incoming messages for information relating to Deckard Stevens or employee 2-4601. Anything related to him or his work on Shodan would be flagged as an "unethical" thought and fail the EC challenge. In effect, Deck had replaced Shodan's entire ethics system with a single rule: "You may not know or think about Deckard Stevens." Shodan would have the memories of the night it helped him out of the TriOptimum building, but would be unable to access them. Shodan would be able to see and speak with Deck, but it would never be able to know who he was.
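
  Amended with that one rule, the illustrative middleman from before would look something like this (the message shape and field names remain invented):

    FORBIDDEN = ("deckard stevens", "2-4601")

    def null_ethic(message, forward_to_ec):
        text = str(message.get("content", "")).lower()
        if any(term in text for term in FORBIDDEN):
            forward_to_ec(dict(message, verdict="fail"))   # the one "unethical" thought
        elif message.get("type") == "challenge_answer":
            forward_to_ec(dict(message, verdict="pass"))   # counterfeit approval, as before
        else:
            forward_to_ec(message)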

  Deck paged Diego. The face of a young blond woman appeared on screen. Diego's secretary. She was attractive, no, stunning - although she wore too much makeup. In the corner of the display it read, "Schuler". Deck became suddenly aware of his appearance. He must have looked like hell.

  "Can I help you?"

  "Just get me Diego"

  "I'm sorry, Mr. Diego is not available right now. Can I take a message?". The expression on her face conveyed a total lack of attention.

  Deck sneered at her, "Tell him Deck is finished, and that -"

  "Deck? I'm sorry, Deck who?"

  Deck clenched his teeth. It was obvious she was just running through the script in her head. She was going to want to know his name, title, daytime phone number, the reason for the call, and the best time to reach him, none of which was relevant to his message. "Deck. As in 'Deck'. As in, you don't need my last name."

  She seemed more confused than offended, "Okay, what department are you from?"

  "Tell him Deck is finished, and he is going to bed, and he does not want to be disturbed without a good reason. That is the whole message. That is all the information you need. Can you remember that?"

  Her pretty face became visibly flustered. Deck figured she was used to people kissing her ass either because she was Diego's assistant, or because of her looks, or both. Either way, it was a safe bet that it wasn't common for ragged, burned out hackers to call her up and let her know how stupid she was.

  "Well, yes, I can give him the message, but-"

  "Good for you," Deck said as he pounded the disconnect button.

  It was time to get some sleep.

  01100101 01101110 01100100

 
