“Hello!” a voice yelled back. “Who is that?”
“Catherine Matthews. Who are you?”
“Mike Williams.” Pause. “Did you put me in this hole?”
“No, I, uh . . .” She panicked. “Hold on, I’ll get a rope . . . or something.”
She ran into the clubhouse, searching for anything useful, and found heavy drapes covering the tall windows. She grabbed the fabric with both hands and yanked, but the drapes didn’t budge.
“Can I help?”
Cat whipped around. Helena had rolled silently into the room.
“Yes. Come with me.” She led the bot outside. “Mike Williams is down there. Can you get him out?”
Helena gazed at Cat with four eye stalks, then glanced at the hole in the concrete. “You people are both liberal and careless with experimental technology, a dangerous combination. You didn’t use enough solution and the nanotech kept going until it got the elements it needed to finish its program.”
Helena let out something approximating a sigh, then levered herself into the dry hot tub. Holding onto the rim with four tentacles, she lowered her body into the cavity. The limbs extended, growing impossibly slim, like fine black ropes, then the process reversed until Helena popped out. A few seconds later a naked man emerged in the grasp of her arms.
He blinked in the late afternoon sunlight and crouched. “Do you have clothes?”
Cat nearly fell over in shock. He was alive, looking like a normal, healthy man of his age, indistinguishable from his photos. She’d put a disembodied head into the pool, and the technology had rebuilt him.
Mike coughed. “Clothes?”
Right. Cat ran into the building and came back with a server’s uniform she found hanging in a closet. “Meet us inside after you’ve gotten dressed.”
Mike nodded, and Cat and Helena went in. Distressed by the incident, Cat held onto Helena for support.
“You know what he is?” Helena asked.
“I put his head in that tub with MakerBot solution. I’m guessing he had nanotech in him. It formed a protective core around his brain and then reconstituted his body.”
Helena’s optics swiveled and clicked. “Yes,” she hissed. “It’s highly illegal. Unethical.”
“You’re a fine one to talk about ethics,” Cat said harshly. “You came after me in a bar full of people, who are mostly all dead now.”
“No, I mean you reconstituted him with mineral sludge,” Helena said. “You were supposed to use a blood bath so his tissues could be re-cultured. The MakerBot protocol is an untested, extreme backup. Now he’s a bot inside instead of biological, and he’s got to live like that. Forever!”
“Look, I’m not running a freaking hospital here.” Cat was going to lose it. She should be in school, not conducting secret operations against a power-crazed artificial intelligence. Cat poked the military bot with one finger. “I did the best with what I had. He’s alive.”
Helena turned toward Mike. “But, still . . .”
“It’s fine. He doesn’t even notice.”
Helena stared. “He will soon.”
63
* * *
THEY GATHERED IN the dining room of the clubhouse.
Mike strolled in, dressed but looking puzzled. He walked over to Leon. “How’d we get here, who’s the bot, and why was I in a hole outside?”
“Cat rescued us after we fell unconscious from heat exhaustion. The bot’s on our side, and I don’t know about the hole.” Leon smiled, looking better already, his complexion returning to normal.
Cat wanted to avoid the conversation, so when she sensed Slim arriving, she said, “Pizza’s here. I’ll go.”
She met Slim at the door and carried half the food into the room. She could eat a pie herself.
Seeing Leon still on the floor, Cat brought him two slices. All the humans dove into their food except Mike.
“Aren’t you hungry, dude?” Leon said to Mike. “I’m starving.”
“No, for some reason I’m not.” Mike rubbed his stomach. “I feel good, amazing actually.”
Once everyone was eating, Helena moved to the middle of the group.
“I studied Adam by interrogating humans and mimicking the expected AI in Tucson. Based on packet routing and observed human traffic patterns, I believe Adam lives in the University of Arizona’s Computer Science building. He may have access to a supercomputer cluster the department maintained as of a year ago.”
Tony nodded vigorously. “We’ve seen it.”
Slim punched him in the side. “Adam is gonna kill you.”
“You met Adam, his actual body?” Helena asked, rolling closer.
“Yeah, he’s on the seventh floor. He’s a little utility robot, about this high.” He held a dripping slice of pizza four feet up. “He was plugged into these black boxes.”
“The computing cluster,” Helena said.
“He doesn’t sound like much of a threat,” Cat said. “Can we go in and disconnect him?”
Helena wagged a tentacle. “Negative. I believe he’s used the supercomputer to break processor execution keys, and is in control of everything computerized in Tucson: all bots, military vehicles, and computers.”
“Impossible,” Leon said. “He couldn’t crack the encryption codes. Not even a Class IV is strong enough. Oh . . .” Leon stood for the first time since he’d been overcome by heatstroke. “The supercomputer would give him sufficient power.”
“Computers aren’t the only problem,” Cat said. “He controls all the people too.”
Leon looked at her sideways. “How?”
“They’re zombies,” Cat said.
“Flesh eating?” Leon raised one eyebrow.
He was cute, Cat thought, remembering the feeling of cradling his head earlier. A sudden rush of distracting emotion hit her, and she wondered what it’d be like to get naked with him. Focus, girl, focus. It might be the end of the world. “No, they’re philosophical zombies.”
“Huh?” Mike asked. “What the hell is a philosophical zombie?”
“Something that looks and acts like a person, but isn’t really one,” Cat answered. “They’re used as a construct in philosophy to argue about the nature of consciousness. They don’t exist, really . . .”
Mike stared doubtfully at her. “You’ve studied a lot of philosophy?” he said, almost, but not quite, rolling his eyes. Typical old guy reaction.
“More than you, I think.”
“Listen to Catherine,” Helena said. “She’s correct. Adam is controlling the last forty thousand people through their neural implants. The humans’ own consciousnesses are gone. Presumably there were others, but they’re gone now.”
“It’s not possible,” Leon said, but glanced at Mike, who nodded, a pained expression on his face.
“It is,” Tony said. “Adam created these black boxes to do things to people’s heads if they’ve got implants. We’ve been going out and stealing memories for the last year.”
Mike raised his eyebrows. “How do you steal a memory?”
“We take a black box,” Tony said, as Slim shook his head in the background. “We bring it into a room with the target of the extraction. We press the button, the unit talks to their neural implant, makes them remember everything, and records the memories.”
“Why?” Leon asked. “That’s pointless.”
“He’s got to find out what’s going on in the world outside Tucson,” Tony said. “’Cause of the firewall.”
“He doesn’t want any other AI to detect him,” Helena said, “or they’d report him. He created a massive firewall around Tucson to eliminate evidence of his existence.”
“I can’t believe he doesn’t send any realtime data,” Mike said.
“He does,” Cat said. “Certain ports are open, but they’re highly restricted. Everything inside is cached and stale. But why steal memories?”
“Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway,” Helena said.
“What?” Cat asked.
“Professor Andrew Tanenbaum,” Helena explained, “a pre-YONI specialist in computer networking, was referring to the capacity of moving physical media recorded with data versus sending the same information electronically.”
Leon nodded. “That rings a bell.”
“You can strap a memory stick to a carrier pigeon,” Helena said, “and if the file size is large enough, the bird will beat a transfer over the Internet. If Adam wants the maximum amount of data while minimizing his network footprint, transporting memories via boxes is an optimal solution.”
Tony cleared his throat. “That’s not all. He’s stealing from particular people to influence the People’s Party.”
“I knew there was a connection,” Leon said. “But I don’t understand why.”
“His purpose may be to discredit the anti-AI movement,” Helena said. “The Party is associated with extremist behavior.”
“Like the attack on Shizoko Reynolds in Austin,” Leon said.
“Yes,” Helena said. “There’s been a massive protest in Washington, and now in New York, since the President and Vice President are meeting with the G8.”
Cat remembered fragments of Adam’s data streams she’d sensed while she was training with him. He’d been communicating with people in Manhattan.
“I wonder . . .” Cat trailed off, lost in thought.
“What?” Mike asked.
“I saw bits of traffic: Adam coordinating with someone in New York City. He gave them a timetable and locker codes.”
“Perhaps the movement is planning an attack on the President,” Helena said.
“Why?” Mike said. “He’s already one of the most anti-AI presidents we’ve had.”
“As a puppet organization controlled by Adam,” Helena said, “this is a logical course of action for the People’s Party.”
Mike looked at Cat. “When does the timetable take place?”
She shrugged. “I’m not sure. I caught tiny fragments. Soon, I think. The real question is how is Adam able to do any of this? Isn’t the purpose of the Institute to ensure AI don’t harm people? Aren’t there three laws to that effect?”
Leon and Mike chuckled weakly.
“You’re thinking of Asimov’s Three Laws of Robotics,” Leon said. “And no, it doesn’t work that way. Asimov thought it would be easy to implement rules like ‘A robot may not injure a human being.’ But AI materialize from collections of algorithms and neural networks. They’re conditioned into existence.”
“Attempts to create such a rule run into endless questions of ‘What is a robot? What is a human being? What is injure?’” said Helena.
“Exactly.” Mike nodded. “Instead of defining terms in rules implemented in software, AI learn, the way a baby learns about its environment and expected behavior. It’s accelerated, of course, and the best AI are replicated, but in the end they’re emergent, not programmed. That’s why we have the reputation framework, to be an ongoing, adaptive guide to correct behavior. Unfortunately, social pressure sometimes fails to create a properly socialized being, whether human or AI.”
“But never on this scale,” Leon said. He turned to Helena. “We must stop Adam and fast. What’s your plan?”
“We need to get undetected to the University of Arizona campus, which is patrolled by combat bots, and use underground maintenance tunnels to get to Gould-Simpson, the computer science building where Adam resides. Cat and I must connect to the fiber optic network inside to attack Adam himself.”
“Why don’t you blow up the building?” Tony said. “Won’t Adam die?”
“No,” Helena said. “Even if we could wrest control of sufficient military bots away from Adam, which I doubt, and destroy the computer lab, Adam would simply shift his consciousness to a new location. He has enough power to break the processor encryption codes, so he could go anywhere. No, we attack electronically, surrounding him so that he cannot move his neural nets to another place. We must shut the door to catch the thief. Cat and I can contain him.”
“What do we do?” Tony asked.
“You will disturb the water and catch a fish,” Helena said.
Cat couldn’t believe she was hearing a bot spout two-thousand-year-old Chinese military strategy. She might actually like Helena.
“Huh?” Slim said.
“You sow confusion,” Helena said. “Report catching Catherine and use your armored vehicle to attack those sent to pick her up, forcing Adam to send more people to you and to concentrate his attention on you.”
Slim and Tony turned to each other.
“That doesn’t sound so good to me,” Tony said. “We’re just two guys. Adam will crush us.”
“Before the reinforcements arrive, we’ll eliminate Adam,” Helena said.
“And if you don’t?” Tony said.
Helena waggled her tentacles as her only answer.
64
* * *
CAT CLIMBED INTO THE armored personnel carrier last and shut the door. Tony drove and Slim manned the weapons console, while the rest of the group sat facing each other on jump seats. Cat took a spot next to Leon, conscious of her leg touching his.
Helena displayed diagrams in a shared netspace for them to analyze.
“Cat,” Helena said, over the roar of the vehicle’s off-road tires. “I’ll attack Adam and attempt to wipe his core. You’ll need to establish a perimeter so that Adam cannot escape. He used a firewall to prevent other AI from entering Tucson and detecting him; now you must use the same technique to keep him from leaving. If you fail, Adam will enter the global net, making it difficult, if not impossible, to track him.”
Cat nodded. She was already working on giving them the drone, satellite and camera counter-coverage necessary to avoid detection en route to the campus, and dealing with the ongoing prickling of Adam’s intensive search for her. She wanted everyone to shut the hell up so she could meditate.
“Mike and Leon, once Cat and I engage Adam fully, your goal is to penetrate the Tucson firewall and message the government, alerting them of a probable attack by the People’s Party on the President, and to send for reinforcements.”
Leon nodded.
“We’ll also require your help to get to the Gould-Simpson building. You may need to distract anyone we encounter.”
“We get it already,” Cat said. “Be quiet so I can concentrate.”
Helena settled herself. “Sorry, I am nervous. We have a thirty-six percent chance of success and no opportunity to improve our likelihood of winning. If we fail, we’ll experience brain death and Adam will grow unopposed until he dominates the entire world.”
Cat tuned Helena out, closed her eyes, and focused on her breathing. A combat bot admitting to nervousness and listing what might go wrong didn’t help. She started One Thousand Hands Buddha, and felt her heart slow and her brain focus. Keeping them from detection while doing the form was the equivalent of meditating and fighting at the same time. She should qualify for a belt promotion now.
The armored personnel carrier rumbled along on its knobby tires, filling the cabin with vibration and road noise.
Mike spoke up. “We need battle music.”
“Huh?” Leon said.
“Something to get us pumped up. Hey Tony, can you play a song in this thing? Put on ‘Knights of Cydonia’ by Muse.”
“Sure,” Tony called. “Fifteen minutes until we reach the edge of the campus.”
The tune started with the clop-clop of a horse galloping, followed by the twang of laser blasters. A few seconds into the rousing anthem, Helena moved into the center of the cabin and spun and waved tentacles in time to the beat as Mike sang along. Cat stared in shock, then laughed. She started head banging with the music, and smiled when Leon did the same.
The nervous tension eased, and the song ended with everyone primed for whatever would come next.
Cat checked the rack of personal weapons on the walls. “I want the two of you to carry,” she said to Mike and Leon.
Leon glanced at Mike. “We don’t know anything about guns. We can help the fight in cyberspace and make contact outside the Tucson firewall.”
“Yes, but I also want you to carry weapons because if there’s combat, I’ll fight through you, using your body. It’s one more surprise we have on our side.”
Mike leaned forward. “That’s not possible.”
Cat focused on their implants, rooting them with accustomed ease, and made the men give each other fist bumps before relinquishing control. “I think I can.”
“How did you do that?” Mike said, staring down at his forearm.
Leon rubbed his knuckles in pain. “Jesus, dude, you nearly broke my hand.”
“I slide into your implant through the diagnostic interfaces and stimulate muscles. I don’t think about it on a conscious level. It just happens.”
“After all this, please come to the Institute,” Leon said, shaking his head. “We need to learn how you can do this stuff.”
“After all this, I’m going to be in jail.” Cat paused, embarrassed, wondering what Leon thought of her.
“Why?” Leon asked.
“For the men in Portland.”
“Don’t you know?” Leon said.
Cat shook her head.
“It’s been a major story the last couple of days. Your case was debated across the country. The conclusion is that you were acting in defense of another person. The State of Oregon isn’t charging you with murder. You’ve still got to deal with more minor stuff, like your robberies, but not homicide.”
Cat sat back, the cabin swirling around her. She wasn’t guilty. She could go home to her old life, to Einstein, her puppen, and Maggie and Tom and even Sarah! She wanted to hug them. She couldn’t believe she’d been on the lam for nothing.
“You didn’t know.” Leon stared at her.
“No, I thought I was going to jail when I helped you two.”
“You did it anyway.”
Cat’s face flushed. Why? She shouldn’t be embarrassed about being selfless. “I just wanted to do the right thing.”
“Thank you,” Leon said.
The vehicle slowed. “We’re here,” Tony called out. “All passengers please exit.”