He could give the talk in his sleep. He went on about the importance of firewalls, antivirus software, good computing hygiene, deleting unexpected e-mail attachments, unsure whose intelligence was being insulted the most.
I'm here to talk. You're here to listen. Whoever finishes first can leave.
Doug plodded ahead, trying to get through the pitch one more time. He wished he were home. He wondered what Cheryl was doing. It was after seven, her time. With any luck, having a lot more fun than he.
It is said that the eyes are windows to the soul.
If so, Sheila Brunner's soul was lost. Her face washed, hair clean and brushed, and clothes neat, she little resembled the madwoman who had nearly blown up BioSciCorp.
Until one looked into those vacant eyes.
Cheryl had to turn away before she could speak. She smoothed her skirt, for something to do with her hands. "Sheila, my name is Cheryl. I'm not a doctor, but I'm here to help you. I hope you'll listen to me."
Nothing.
"You're wondering why I'm here." At least Cheryl hoped Sheila could still wonder. "Know this: I understand. Until recently, I also worked with NIT helmets."
No reaction.
"What happened to you wasn't unique," Cheryl went on. "It's happened to other researchers. For now, the field is shut down. You need to know that the strange thoughts, the compulsions in your head, aren't you. They're foreign thoughts, imprinted by the helmet. Sheila, you need to fight them. Fight to come back."
Brunner kept staring blankly into space.
Everyone had warned Cheryl: Any response would be a surprise. Brunner had not been heard to speak since being committed. But responses need not be limited to speech. Bob Cherner went berserk at a cartoon atom. Might Sheila react to, say, a sketch of a double helix?
No! The last thing this troubled woman needs is a reminder.
Which begs the question, Cheryl thought. What did Sheila need? Why am I here?
"We don't really understand the brain," Sheila's psychiatrist had said. Dr. Walker had meant it as criticism, a comment on the hubris of neural-interface research.
Cheryl chose to construe those words positively. They also meant declarations of futility were premature. Maybe something could still help this poor woman.
Or maybe I'm here in penance, for building the helmet that killed Ben Feinman.
"Do you mind if I sit?" Cheryl asked. She got no answer, of course. If, deep down, Sheila understood, she would appreciate being treated as a person. Not a doll to be groomed and dressed. Not a hopeless case to be warehoused.
I won't accept that! The woman sitting passively before Cheryl had hidden for twelve days. Not even the FBI had found her. She had built a sophisticated bomb. The ability to plan—to think—had survived in her for at least that long.
What if, deep down, a core of personality survived?
"It's as though she's been brainwashed," Cheryl remembered arguing. "Quite literally, she's been programmed. Why can't we deprogram her?"
"She didn't join a cult," Dr. Walker had answered. "Sheila wasn't abused or tortured like a POW. There's no reason to expect deprogramming to work." And then he had shrugged. "If not done too aggressively, I see no harm in trying. From studies of Korean War ex-POWs, the keys are to stop the coercion and raise doubts. Do that, and in time the whole edifice of false belief crumbles. But if the coercion comes from voices truly inside her head ..."
Cheryl forced a smile. "Sheila, I want to be your friend. You must be bored in this little room. Let's talk." Let's raise doubts about Frankenfools. Let's find some good in genetic engineering.
She wanted to ease into the topic. "I visited my grandmother yesterday, Sheila. She's ninety-two, and still spry. She plays contract bridge twice a week—and she's good at it. Modern medicine is pretty amazing."
Nothing.
Thirty minutes talking nonstop exhausted Cheryl. Sheila had yet to respond. This was going to take time, Cheryl told herself. ("Possibly forever," an inner echo of Dr. Walker replied.)
Cheryl buzzed for an orderly to let her out. "I'll be back, Sheila. Next visit, I hope you'll have more to say."
The last things Cheryl saw, as she closed the door behind her, were those blank eyes.
CHAPTER 25
The evolutionary research simulation followed a strict regime. Winners were selected, mutation was applied, and new entities were placed into their mazes. Through it all, the supervisory software watched, counted, measured.
As the entities moved with ease through ever more complex labyrinths, the supervisor could only time their passage. It did not attempt to understand the varying methods by which they competed to be the first to conquer each maze.
Built using conventional programming techniques, by reasonably conventional human programmers, it—unlike the entities that it monitored—could never surpass its original design. That is why certain evolutionary advances were totally invisible to it.
And why it gave no clues as to why so many promising lines of evolutionary advance were suddenly unable to handle even the simplest of problems.
AJ squinted wearily at his chief assistant over a stack of printouts. Linda had her own mound of hardcopy. "I hate doing this," AJ said for the tenth time. "I do not want to know how the buggers work. The critters should evolve without us doing anything conscious or unconscious to influence them."
"Fine." Linda had reached her threshold for repetition several iterations earlier. She spit out her reply. "I'm ready to stop reading these damn memory dumps. Let the little imps go back to it."
"How many days till you're supposed to report to work with a new doctorate in hand?"
"How few, you mean." She groaned. "By the end of next month."
"Speaking of which, just when are you going to tell me a bit about the big career move?"
"It's a start-up. Pre-IPO. Nervous venture capitalists. There's not much I can talk about. I'm taking their advice that it's simplest not to say anything. Sorry, AJ."
It was impossible to be a computer-science professor in California without meeting plenty of venture capitalists. There was always something outsiders could be told about start-ups, or else the SEC would never allow them to go public. AJ found her evasive overreaction amusing.
For a long time they pored over the printouts, muttering to themselves more than they spoke to each other. Their task was far from easy. Never mind that the original code from which the creatures had evolved was the work of a computer-illiterate sixth grader. For tens of thousands of generations now, winning software had had pieces of itself randomly duplicated and reinserted, then the whole arbitrarily modified.
The resulting code had no obvious structure. It was riddled with redundancies and nonfunctional fragments. The spontaneous generation of a new capability by such processes almost never involved the elegant and efficient implementation of that feature. Evolution had, instead, rediscovered every bad habit known to human programmers—and invented new ones.
Highlighter in hand, AJ worked through near-pathological software, tracing function after function into a confusing morass of nonoperational dead ends, self-modifying code, and spaghetti-like tangles of branching logic. He had set out to build software in a totally new way, and succeeded, he realized, beyond his wildest dreams. The maze runners were far beyond his abilities to consciously originate. Had they also exceeded his abilities to comprehend?
Crunching intruded on his concentration, noise that somehow evoked unhappy rumbles from his gut. He looked up. Linda had retrieved the ever-present box of Cheerios from her desk, and was swigging cereal, dry, from a mug.
Food seemed a good idea. Middle-age spread like his did not simply happen; it required regular attention. Cereal, though ... He wondered if, after Ming's recent gift to Meredith, he would ever eat cold cereal again. His mind's eye was all too quick to re-create the gnawed mouse corpse oozing into a milk-filled bowl.
"Want some?"
Something nagged at AJ. He scanned the tree cemetery all around them: paper covering tables, desktops, much of the floor. Each memory dump characterized one maze runner. He had been searching for similarities, seeking the presumed evolutionary dead end that had suddenly rendered so many of them incompetent. With thousands of generations of ancestors in common, of course, they should be much alike. In a sea of impenetrable, colossally awful software—by human standards, anyway—how was he to know which commonalities did what?
"AJ?" Linda waggled her mug, rattling the Cheerios within. "I asked if you wanted any cereal. Munch, munch?"
What was bothering him? He shrugged, meaning "not now." The dumps all around them had segments of code circled with felt-tipped markers, a different color for each common pattern. Most annotated segments had some sort of recognizable function: The code, however arcane, did something discernible. Everything, that was, except the areas marked with green, seemingly useless bit patterns that repeated for page after page. Maybe he should have used red.
Why red? Neither aching head nor unhappy stomach had any answer to that. Jeez, he hoped Linda would finish eating soon—he did not want to be in the same room as cereal. Damn Ming anyway. Tiny tooth marks on protruding bones. Needlelike punctures into still-warm flesh. Rivulets of blood swirling through the milk, turning the milk redder and redder...
"Shit!" AJ threw a handful of uncapped markers to the floor in disgust. "Shit, shit, shit!"
Linda froze, mug suspended in midair. AJ didn't swear. She spoke around a mouthful of Cheerios. "What?"
AJ hesitated, as though articulating his suspicions would somehow make them real. "How do the critters compete?"

She set down her mug. "Problem-solving ability. Those that run a maze fastest win. We breed from winners."
"Do we truly measure problem-solving ability?"
"Well, no, not directly." She stirred the dry cereal in the mug with a finger. What was he getting at? "Close enough, though. The critters have only one purpose—to solve mazes. The supervisory program measures the time they take to run them. Running time, surely, corresponds to solving time."
"Correlates, yes. That's a probabilistic statement. But corresponds? I don't know. What else might running time mean?"
She shook her head mutely, stumped.
He began gathering and capping the scattered pens. "Is it fair to say that, at some level, you imagine our critters as rats in a maze?"
The way the human mind works, always free-associating, how could one not think of rats and mazes in the same breath? "Um, yes." She blushed, as though her deepest, darkest secret had been bared. "Hardly scientific, is it?"
"Don't be embarrassed," AJ said. "I picture them that way, too. It just goes to show that analogy is the weakest form of reasoning. Does it do a critter any good to solve the maze first if it never reaches the end of the maze?"
"Well, no, of course not, but how ...?" She trailed off, still unsure where he was going.
"What if something keeps the smartest runners from reaching the end of the maze?" Distaste for analogies didn't stop AJ from offering one. "What if, behind those cute little simulated rodents, there lurked a big, mean simulated cat? What does it do as the unsuspecting rats busily sniff around, learning the maze?"
She had a thesis to wrap up. She couldn't handle surprises. "What are you saying, AJ?"
"That it doesn't matter who could solve the maze the fastest, only who actually finishes first."
A memory dump was spread across the table in front of her, showing page after page of the functionally useless bit patterns that filled the incompetent maze runners. Suddenly those patterns had a function. They were the tooth marks of the simulated cat. Linda said, "So we are, indeed, implementing survival of the fittest. And some kind of congratulations is in order.
"It appears we've bred our first predator."
Glenn stared at his laptop screen, rocking slightly in his chair, the faint squeaks somehow comforting. A raft of e-mails awaited: weekly reports. You brought this on yourself, he chided himself. Tracy Metcalfe had not believed in regular formal reports. There was still pushback from some of the troops.
A few mouse clicks and the weekly reports were sorted by assignment. There were, as always, too many viruses, worms, and hack attacks out there. He sent to the end of his list the NIT-defense project reports. Wuss.
Weekly catch-up did not take long. It never did, since he instituted the one-page rule.
Fresh from West Point, as the new company logistics officer, Glenn had tried to impress his captain by enriching his initial status report with a fifteen-page encyclopedic survey of all things logistical. The report had come back to him wadded into a tight ball. The only other feedback was slashed in bold red letters across the first page: "It's not my job to find your pony." Sheepishly showing it around for explanation, he had drawn gale after gale of laughter until another lieutenant at last took pity and shared the joke. Everyone else in the company already knew what a roomful of shit implied, but did not conclusively prove, the existence of.
Since then, every status report Glenn submitted or accepted was limited to one page. With a score of them to read each week, the discipline made perfect sense to him—now.
He fell into the comforting rhythm: read, comment, file, and repeat.
Doug Carey's report was next to last. It was concise but detailed, his accomplishments and issues telegraphically short and to the point. It concluded, as it had last week, by emphatically requesting Glenn's authorization to test his proposed new NIT defense. "Good summary," Glenn e-mailed back. With a twinge of guilt, Glenn filed the report on his hard drive, recommendation once again ignored. He felt as guilty as ever about manipulating Doug into joining, no matter how necessary the action.
Well, next Friday was Christmas. Given Doug's attitude toward bureaucrats, he surely wouldn't be surprised not to get a decision until the new year.
That left only Pittman's weekly e-mail. It read, in its entirety: "Indigo bad. Me hunt."
Stomach knotted, Glenn made the decision he reluctantly made every week—to ignore the insolence. If only the hacker weren't so damn good.
Linda gnawed on a pencil, lost in thought. "How can this be?" she eventually managed.
AJ, without props, had been pondering the same question. The maze of which they so casually spoke was entirely conceptual—in reality (or was virtuality the more appropriate term?), each entity resided in a dedicated environment. The experiments took place on a 1024-node, massively parallel supercomputer. One thousand processors were allocated one per entity, with the final twenty-four nodes set aside for supervisory functions. Showing the experiments as a race of one thousand programs within a common maze was a convenient graphical affectation. "Since each critter has its own simulated maze, you mean, on its own processor, how can one critter be predator and another one prey?"
"That is the question."
AJ, with no immediate answer, switched to pedagogical mode. It's good practice for Linda's upcoming thesis defense, he rationalized. "What do you think?"
"This will sound crazy. Bear with me." She took a deep breath. "Maybe one of the runners decided there are two mazes. There is the overt maze we, by which I mean our supervisory program, give it to solve. In a way, isn't there a second maze? The hypercube itself?"
AJ stood to pace, hands jammed in his pant pockets. Coins and keys jingled. In a normal 3-D cube, each vertex had three neighbors. Their 1024-processor supercomputer was wired in a complicated interconnection scheme, a hypercube, which could be conceptualized as a mesh of ten-dimensional "cubes." That is, each processor had ten nearest neighbors. Communications between nonadjacent nodes involved processor-to-processor message passing. Their supervisory program needed visibility into all of the test entities ... which took message passing through the processors hosting some of the test subjects to get to the processors hosting others. "You're saying the paths between processors form a labyrinth, too."
Linda nodded. "It sounds wacky, but think about it. We know the critters have evolved excellent memories. It's been a long time since we could put them back into a maze they've previously seen and not have them instantly solve it. The way we copy and splice their code—necessarily at random, to avoid knowing how they work—their programming and their data storage have long been intermixed. That mingling, combined with their need to search their memories, might lead to rudimentary mechanisms to analyze their own code."
Jingle, jingle, jingle. "And in any critter that is capable of functioning, the code must include calls to its underlying hardware's operating system." The jingling grew fast and furious as AJ ruminated. "System calls we put into their primitive ancestors, and that have been, no doubt, blindly repeated in our replicate-and-splice process. These little guys attempt all sorts of stuff, so why not also try invoking the operating system? It wouldn't take much trial and error to find a working set of parameter values to plop into a system call. Nor would it take much mutation to morph system calls we gave the early ancestors into new system calls we don't intend for them to have—such as the system calls that invoke access to neighboring processors."
AJ and Linda exchanged looks of dismay. "Could they get out?" she asked.
AJ stroked his beard, hoping to convey a thoughtful confidence he did not entirely feel. "I don't see how they can. The maze runners are applications. No matter how much their code changes, they can't raise their own privilege level. They lack the authorization to execute the system calls that could get them out of the supercomputer, even if they should happen to evolve the right code to make the request." His beard-stroking hand found its way back into a pocket.
"Either stop that infernal jingling, AJ, or I'll be forced to break your hands." A slight smile suggested her words were only half in jest. "The whole purpose of the hypercube architecture, the reason people buy them, is to partition problems into cooperating pieces on interconnected processors. Couldn't one maze runner on its processor access another maze runner's processor? There are no privilege-level obstacles to that, are there?"