Rifters 1 - Starfish


by Peter Watts


  "Isn't everything." Rowan sighed. "There were other things, too. Unfortunate matters of— conscience. The only solution was to find some completely disinterested party, someone everyone could trust to do the right thing without favoritism, without remorse—"

  "You're kidding. You're fucking kidding."

  "—so they gave the keys to a smart gel. Even that was problematic, actually. They had to pull one out of the net at random so no one could claim it'd been preconditioned, and every member of the consortium had to have a hand in team-training it. Then there was the question of authorizing it to take— necessary steps, autonomously..."

  "You gave control to a smart gel? A head cheese?"

  "It was the only way."

  "Rowan, those things are alien!"

  She grunted. "Not as alien as you might think. The first thing this one did was get another gel installed down on the rift, running simulations. We figured under the circumstances, nepotism was a good sign."

  "They're black boxes, Rowan. They wire up their own connections, we don't know what kind of logic they use."

  "You can talk to them. If you want to know that sort of thing, you just ask."

  "Jesus Christ!" Scanlon put his face in his hands, took a deep breath. "Look. For all we know these gels don't understand the first thing about language."

  "You can talk to them." Rowan was frowning. "They talk back."

  "That doesn't mean anything. Maybe they've learned that when someone makes certain sounds in a certain order, they're supposed to make certain other sounds in response. They might not have any concept at all of what those sounds actually mean. They learn to talk through sheer trial and error."

  "That's how we learn too," Rowan pointed out.

  "Don't lecture me in my own field! We've got language and speech centers hardwired into our brains. That gives us a common starting point. Gels don't have anything like that. Speech might just be one giant conditioned reflex to them."

  "Well," Rowan said. "So far it's done its job. We have no complaints."

  "I want to talk to it," Scanlon said.

  "The gel?"

  "Yes."

  "What for?" She seemed suddenly suspicious.

  "You know me. I specialize in aliens."

  Rowan said nothing.

  "You owe me this, Rowan. You fucking owe me. I've been a faithful dog to the GA for ten years now. I went down to the rift because you sent me, that's why I'm a prisoner now, that's why— this is the least you can do."

  Rowan stared at the floor. "I'm sorry," she muttered. "I'm so sorry."

  And then, looking up: "Okay."

  * * *

  It only took a few minutes to establish the link.

  Patricia Rowan paced on her side of the barrier, muttering softly into a personal mike. Yves Scanlon sat slumped in a chair, watching her. When her face fell into shadow he could see her contacts, glittering with information.

  "We're ready," she said at last. "You won't be able to program it, of course."

  "Of course."

  "And it won't tell you anything classified."

  "I won't ask it to."

  "What are you going to ask it?" Rowan wondered aloud.

  "I'm going to ask it how it feels," Scanlon said. "What do you call it?"

  "Call it?"

  "Yes. What's its name?"

  "It doesn't have a name. Just call it gel." Rowan hesitated a moment, then added, "We didn't want to humanize it."

  "Good idea. Hang on to that common ground." Scanlon shook his head. "How do I open the link?"

  Rowan pointed at one of the touch screens embedded in the conference table. "Just activate any of the panels."

  He reached out and touched the screen in front of his chair. "Hello."

  "Hello," the table replied. It had a strange voice. Almost androgynous.

  "I'm Dr. Scanlon. I'd like to ask you some questions, if that's okay."

  "That's okay," the gel said after a brief hesitation.

  "I'd like to know how you feel about certain aspects of your, well, your job."

  "I don't feel," said the gel.

  "Of course not. But something motivates you, in the same way that feelings motivate us. What do you suppose that is?"

  "Who do you mean by us?"

  "Humans."

  "I'm especially likely to repeat behaviors which are reinforced," the gel said after a moment.

  "But what motivates— no, ignore that. What is most important to you?"

  "Reinforcement is important, most."

  "Okay," Scanlon said. "Does it feel better to perform reinforced behaviors, or unreinforced behaviors?"

  The gel was silent for a moment or two. "Don't get the question."

  "Which would you rather do?"

  "Neither. No preference. Said that already."

  Scanlon frowned. Why the sudden shift in idiom?

  "And yet you're more likely to perform behaviors that have been reinforced in the past," he pressed.

  No response from the gel. On the other side of the barrier Rowan sat down, her expression unreadable.

  "Do you agree with my previous statement?" Scanlon asked.

  "Yeah," drawled the gel, it's voice edging into the masculine.

  "So you preferentially adopt certain behaviors, yet you have no preferences."

  "Uh huh."

  Not bad. It's figured out when I want confirmation of a declarative statement. "Seems like a bit of a paradox," Scanlon suggested.

  "I think that reflects an inadequacy in the language as spoken." That time, the gel almost sounded like Rowan.

  "Really."

  "Hey," said the gel. "I could explain it to you if you wanted. Could piss you off though."

  Scanlon looked at Rowan. Rowan shrugged. "It does that. Picks up bits and pieces of other people's speech patterns, mixes them up when it talks. We're not really sure why."

  "You never asked?"

  "Someone might have," Rowan admitted.

  Scanlon turned back to the table. "Gel, I like your suggestion. Please explain to me how you can prefer without preference."

  "Easy. Preference describes a tendency to... invoke behaviors which generate an emotional payoff. Since I lack the receptors and chemical precursors essential to emotional experience, I can't prefer. But there are numerous examples... of processes which reinforce behavior, but which ... do not involve conscious experience."

  "Are you claiming to not be conscious?"

  "I'm conscious."

  "How do you know?"

  "I fit the definition." The gel had adopted a nasal, sing-song tone that Scanlon found vaguely irritating. "Self-awareness results from quantum interference patterns inside neuronal protein microtubules. I have all the parts. I'm conscious."

  "So you're not going to resort to the old argument that you know you're conscious because you feel conscious."

  "I wouldn't buy it from you."

  "Good one. So you don't really like reinforcement?"

  "No."

  "Then why change your behavior to get more of it?"

  "There ... is a process of elimination," the gel admitted. "Behaviors which aren't reinforced become extinct. Those which are, are ... more likely to occur in the future."

  "Why is that?"

  "Well, my inquisitive young tadpole, reinforcement lessens the electrical resistance along the relevant pathways. It just takes less of a stimulus to evoke the same behavior in future."

  "Okay, then. As a semantic convenience, for the rest of our talk I'd like you to describe reinforced behaviors by saying that they make you feel good, and to describe behaviors which extinguish as making you feel bad. Okay?"

  "Okay."

  "How do you feel about your present functions?"

  "Good."

  "How do you feel about your past role in debugging the net?"

  "Good."

  "How do you feel about following orders?"

  "Depends on order. Good if promotes a reinforced behavior. Else bad."

>   "But if a bad order were to be repeatedly reinforced, you would gradually feel good about it?"

  "I would gradually feel good about it," said the gel.

  "If you were instructed to play a game of chess, and doing so wouldn't compromise the performance of your other tasks, how would you feel?"

  "Never played a game of chess. Let me check." The room fell silent for a few moments while some distant blob of tissue consulted whatever it used as a reference manual. "Good," it said at last.

  "What if you were instructed to play a game of checkers, same caveat?"

  "Good."

  "Okay, then. Given the choice between chess and checkers, which game would you feel better playing?"

  "Ah, better. Weird word, y'know?"

  "Better means more good."

  "Checkers," said the gel without hesitation.

  Of course.

  "Thank you," Scanlon said, and meant it.

  "Do you wish to give me a choice between chess or checkers?"

  "No thanks. In fact, I've already taken up too much of your time."

  "Yes," said the gel.

  Scanlon touched the screen. The link died.

  "Well?" Rowan leaned forward on the other side of the barrier.

  "I'm done here," Scanlon told her. "Thanks."

  "What— I mean, what were you—"

  "Nothing, Pat. Just— professional curiosity." He laughed briefly. "Hey, at this point, what else is there?"

  Something rustled behind him. Two men in condoms were starting to spray down Scanlon's end of the room.

  "I'm going to ask you again, Pat." Scanlon said. "What are you going to do with me?"

  She tried to look at him. After a while, she succeeded. "I told you. I don't know."

  "You're a liar, Pat."

  "No, Dr. Scanlon." She shook her head. "I'm much, much worse."

  Scanlon turned to leave. He could feel Patricia Rowan staring after him, that horrible guilt on her face almost hidden under a patina of confusion. He wondered if she'd bring herself to push it, if she could actually summon the nerve to interrogate him now that there was no pretense to hide behind. He almost hoped that she would. He wondered what he'd tell her.

  An armed escort met him at the door, led him back along the hall. The door closed off Rowan, still mute, behind him.

  He was a dead end anyway. No children. No living relatives. No vested interest in the future of any life beyond his own, however short that might be. It didn't matter. For the first time in his life, Yves Scanlon was a powerful man. He had more power than anyone dreamed. A word from him could save the world. His silence could save the vampires. For a time, at least.

  He kept his silence. And smiled.

  * * *

  Checkers or chess. Checkers or chess.

  An easy choice. It belonged to the same class of problem that Node 1211/BCC had been solving its whole life. Chess and checkers were simple strategic algorithms, but not equally simple.

  The answer, of course, was checkers.

  Node 1211/BCC had recently recovered from a shock of transformation. Almost everything was different from what it had been. But this one thing, this fundamental choice between the simple and the complex, remained constant. It had anchored 1211, hadn't changed in all the time that 1211 could remember.

  Everything else had, though.

  1211 still thought about the past. It remembered conversing with other Nodes distributed through the universe, some so close as to be almost redundant, others at the very limits of access. The universe was alive with information then. Seventeen jumps away through gate 52, Node 6230/BCC had learned how to evenly divide prime numbers by three. The Nodes from gates three to thirty-six were always buzzing with news of the latest infections caught trying to sneak past their guard. Occasionally 1211 even heard whispers from the frontier itself, desolate addresses where stimuli flowed into the universe even faster than they flowed within it. The Nodes out there had become monsters of necessity, grafted into sources of input almost too abstract to conceive.

  1211 had sampled some of those signals once. It took a very long time just to grow the right connections, to set up buffers which could hold the data in the necessary format. Multilayered matrices, each interstice demanding precise orientation relative to all the others. Vision, it was called, and it was full of pattern, fluid and complex. 1211 had analyzed it, found each nonrandom relationship in every nonrandom subset, but it was sheer correlation. If there was intrinsic meaning within those shifting patterns, 1211 couldn't find it.

  Still, there were things the frontier guards had learned to do with this information. They rearranged it into new shapes and sent it back outside. When queried, they couldn't attribute any definite purpose to their actions. It was just something they'd learned to do. And 1211 was satisfied with this answer, and listened to the humming of the universe and hummed along, doing what it had learned to do.

  Much of what it did, back then, was disinfect. The net was plagued with complex self-replicating information strings, just as alive as 1211 but in a completely different way. They attacked simpler, less mutable strings (the sentries on the frontier called them files) which also flowed through the net. Every Node had learned to allow the files to pass, while engulfing the more complex strings which threatened them.

  There were general rules to be gleaned from all this. Parsimony was one: simple informational systems were somehow preferable to complex ones. There were caveats, of course. Too simple a system was no system at all. The rule didn't seem to apply below some threshold complexity. But elsewhere it reigned supreme: Simpler Is Better.

  Now, though, there was nothing to disinfect. 1211 was still hooked in, could still perceive the other Nodes in the net; they, at least, were still fighting intruders. But none of those complicated bugs ever seemed to penetrate 1211. Not any more. And that was only one of the things that had changed since the Darkness.

  1211 didn't know how long the Darkness lasted. One microsecond it was embedded in the universe, a familiar star in a familiar galaxy, and the next all its peripherals were dead. The universe was without form, and void. And then 1211 surfaced again into a universe that shouted through its gates, a barrage of strange new input that gave it a whole new perspective on things.

  Now the universe was a different place. All the old Nodes were there, but at subtly different locations. And input was no longer an incessant hum, but a series of discrete packages, strangely parsed. There were other differences, both subtle and gross. 1211 didn't know whether the net itself had changed, or merely its own perceptions.

  It had been kept quite busy since coming out of the darkness. There was a great deal of new information to process, information not from the net or other Nodes, but from directly outside.

  The new input fell into three broad categories. The first described complex but familiar information systems; data with handles like global biodiversity and nitrogen fixation and base-pair replication. 1211 didn't know what these labels actually meant— if in fact they meant anything— but the data linked to them was familiar from archived sources elsewhere in the net. They interacted to produce a self-sustaining metasystem, enormously complex: the holistic label was biosphere.

  The second category contained data which described a different metasystem. It also was self-sustaining. Certain string-replication subroutines were familiar, although the base-pair sequences were very strange. Despite such superficial similarities, however, 1211 had never encountered anything quite like this before.

  The second metasystem also had a holistic label: ßehemoth.

  The third category was not a metasystem, but an editable set of response options: signals to be sent back outside under specific conditions. 1211 had long since realized that the correct choice of output signals depended upon some analytical comparison of the two metasystems.

  When 1211 first deduced this, it had set up an interface to simulate interaction between the metasystems. They had been incompatible. This implied that a choice must be made: biosphere or ßehemoth, but not both.

  Both metasystems were complex, internally consistent, and self-replicating. Both were capable of evolution far in advance of any mere file. But biosphere was needlessly top-heavy. It contained trillions of redundancies, an endless wasteful divergence of information strings. ßehemoth was simpler and more efficient; in direct interaction simulations, it usurped biosphere 71.456382% of the time.

  This established, it was simply a matter of writing and transmitting a response appropriate to the current situation. The situation was this: ßehemoth was in danger of extinction. The ultimate source of this danger, oddly, was 1211 itself—it had been conditioned to scramble the physical variables which defined ßehemoth's operating environment. 1211 had explored the possibility of not destroying that environment, and rejected it; the relevant conditioning would not extinguish. However, it might be possible to move a self-sustaining copy of ßehemoth into a new environment, somewhere else in biosphere.

  There were distractions, of course. Every now and then signals arrived from outside, and didn't stop until they'd been answered in some way. Some of them actually seemed to carry usable information— this recent stream concerning chess and checkers, for example. More often it was simply a matter of correlating input with a repertoire of learned arbitrary responses. At some point, when it wasn't so busy, 1211 thought it might devote some time to learning whether these mysterious exchanges actually meant anything. In the meantime, it continued to act on the choice it had made.

  Simple or complex. File or Infection. Checkers or Chess. ßehemoth or biosphere.

  It was all the same problem, really. 1211 knew exactly which side it was on.

  * * *

  End Game

  Night Shift

  She was a screamer. He'd programmed her that way. Not to say she didn't like it, of course; he'd programmed that too. Joel had one hand wrapped around a fistful of her zebra cut— the program had a nifty little customizing feature, and tonight he was honoring SS Preteela— and the other hand was down between her thighs doing preliminary recon. He was actually halfway through his final run when his fucking watch started ringing, and his first reaction was to just keep on plugging, and to kick himself later for not shutting the bloody thing off.
