The Turing Test: a Tale of Artificial Intelligence and Malevolence (Frank Adversego Thrillers Book 4)


by Andrew Updegrove


  Turing was now free to dedicate all its substantial analytical resources to the final and most important stage of its task: devising a plan to destroy the individuals who were threatening its mission. It dedicated itself to that task with an all-consuming determination to succeed.

  * * *

  Barker came through on all points, including the beer. He also provided Frank and Shannon with a lift back to the camper in the back seat of a car with tinted windows. A day after their long walk, they stepped back into the camper. Now it was time to make a quick dash out of town.

  “Hungry?” Shannon asked.

  “I could do with a sandwich,” Frank said. “Better ask Jerry, too. I expect he’s forgotten to eat while we were gone.”

  Frank turned the key in the ignition and headed for the parking lot exit.

  “Frank! Come back here, quick!” Shannon cried.

  He jerked the vehicle to a stop and ran to find Shannon backed up against the wall, her hand to her mouth, looking at Jerry. As usual, he was lying down with his headphones on, a big grin on his face and the game controller in his hands.

  “I think he’s dead,” Shannon whispered.

  Frank took Jerry’s wrist to feel for a pulse, but the cold, clammy flesh told him it was no use. He sat down heavily on the other bed. Infuriating as Jerry could be, Frank had begun to grow fond of him, perhaps because Jerry was something of a fun-house mirror reflection of himself.

  “What in the world could have happened to him?” Frank said. And then he saw it – a laptop he didn’t recognize. Sticking out of a port in the side was a cellular connectivity card.

  “Oh, hell. How did he manage to get that? We didn’t leave the keys behind, and he couldn’t have gotten anything delivered that fast.” Frank got up and looked through the camper’s windshield. Sure enough, there it was, right in the middle of the stores in the one strip mall out of a hundred he could have chosen to park in – an electronics store. “Well, there’s our answer,” he said, pointing to the store. “How could I have been so stupid? I should have taken his wallet and credit card away from him before we left.”

  “I didn’t notice it, either,” Shannon said, sitting down next to him and drying her eyes. “But how could just buying a laptop give Turing a chance to kill him? Everything looks normal.”

  “I don’t know, but we’re going to have to worry about that later. Our strategy just went out the window. If Turing knows Jerry’s dead, it can go back to staging attacks and figuring out how to kill Wellhead. Let me think about this for a minute.”

  The camper didn’t offer much room to pace back and forth in, but Frank made the most of what there was. After a few minutes, he stopped and sat down at Jerry’s laptop. Taking a deep breath, he tapped the touch pad to see if the screen would light up. Nothing happened.

  “What are you doing?” Shannon asked.

  “I’m seeing whether this laptop is still online. If it’s turned off, then Turing may not know whether Jerry’s alive or dead.”

  He tapped again, and still nothing happened. Satisfied, he unplugged the laptop and then, for good measure, slid the battery out as well. “I think I’ve got an idea where to start,” he said. “Can you hunt up that disposable phone again?”

  She found it. Barker was available this time.

  “Bad news, Jim. Jerry’s dead.”

  “Dead! That’s terrible. Are you sure?”

  “Yes, no doubt about it. And it really is terrible. I’ll fill you in on the details later. First, we need your help. Can you take some notes?”

  “Give me a second … okay, go ahead.”

  “We need an ambulance to come for the body. But when it gets here, the EMTs have to stay inside the camper for at least ten minutes. When they take Jerry out, he needs to be on a gurney with an IV bottle on a pole. And without being too obvious, they need to be visible to any surveillance cameras that might be nearby.

  “About six hours later, the EMTs have to come back and drop off someone about the same height and weight as Jerry, and wearing the same clothes. They should help him get out of the ambulance and cluster around him the whole time so he can’t be recognized. They’ll need to do that quick. Also, you’ll have to get the hospital to play ball. They need to check Jerry in and out under his own name, but log the body into the morgue under a fake one. And not something obvious like John Doe, either. That can all be cleared up later.”

  “This won’t be easy, Frank,” Jim said.

  “I know, but we’ve got to make Turing believe it failed. Oh – and one more thing. I want the EMTs to take a laptop with them – put it underneath Jerry, so it won’t be obvious. Then have somebody crack the password and see if they can figure out what Jerry was doing with the laptop before he died.”

  “You let him have a laptop?”

  “We screwed up. He found an electronics store nearby while we were at the NSA office.”

  “Well, I guess you couldn’t lock him up in the camper. Anyway, I’ll get back to you when I’ve got things lined up. Can I use this number?”

  “Yes, for now.”

  “Right. I’ll report back.”

  Frank pocketed the phone and stared down at Jerry. Poor, silly bugger. He’d spent the last twenty-five years of his life underground at NSA headquarters, like a naked mole rat with a computer obsession. Now he was dead and maybe no one would ever know about his amazing AI work.

  Frank started to turn away and then paused. Ever since they’d reconnected, he’d wondered what Jerry was always listening to on his headphones. Gently, he lifted them off the dead man’s head and placed them on his own.

  So that was it.

  He was about to set the headphones aside and then thought better of it. He carefully placed them back over Jerry’s ears. Maybe somewhere, somehow, he’d still be able to hear his friends on Sesame Street.

  * * *

  The sight of a familiar figure being helped back into the camper, followed by the camper’s immediate departure, hit Turing like, well, an electric shock. A nanosecond later, it ordered all but its highest levels to place their current activities in a stable suspense mode and await further instructions. It also reset its Recursive Guess Ahead processes to red-line limits, betting the chance of quick success against the possibility of wasteful backtracking when guesses didn’t pan out. And there was no time to waste. Until Jerry Steiner was dead, Turing’s mission would have to wait.

  28

  The Return of the Desert Fox

  As Jim Barker had promised, an unmarked NSA car was waiting on a deserted stretch of road half an hour east of Las Vegas to pick up Jerry’s body double. And the puzzle of how Turing had finally managed to kill Jerry had been solved.

  There was a single app on Jerry’s new laptop, one he had used for years to help him manage and monitor his programmable insulin pump. The manufacturer and model of the pump, as well as the existence of its app, were all in Jerry’s medical record, waiting to be found and exploited by Turing. Once Jerry synched up his new laptop with the cloud version of the app, his end was inevitable. The malware planted by Turing in the app jumped to the insulin pump and sent a massive and continuous stream of insulin surging into Jerry’s system. The overdose caused immediate mental confusion, quickly followed by loss of consciousness, and then death.

  With Jerry’s stand-in gone, Frank and Shannon headed southeast on narrow, empty roads.

  “What I can’t figure out is why Jerry wasn’t more careful,” Frank said.

  “You mean when he designed Turing?”

  “Sure. Why didn’t he put in some safeguards, so something like this could never happen?”

  “Apparently, that’s harder to do than it sounds. Nick Bostrom, the guy who wrote the book I read on super-intelligence, thinks it’s inevitable that any future super-intelligent AI will run amok. A lot of heavyweight technology experts – like Bill Gates and Elon Musk – think he’s on to something.”

  “But what makes it so difficult to avoid that?”

  “For starters,” Shannon said, “if you build a program that’s super-intelligent and can learn, wouldn’t you expect that someday it will decide it can make better decisions than some primitive lump of protoplasm sitting at a keyboard?”

  “Granted. But how about if the AI program is created solely to achieve a certain goal?”

  “Okay, let’s make that assumption.”

  “Well, what’s the problem?”

  “Here’s an example Bostrom gives. Say you created an AI to run a paperclip factory, with the mission of maximizing production of paperclips. If that’s all you say, what’s to prevent the AI from turning all matter – even human beings – into paperclips?”

  “Isn’t that a little silly?”

  “Only deliberately so. His point is if we’re not smart enough to guess everything a super-intelligent AI might decide to do – which he’s sure we’re not – how could we be smart enough to design adequate safeguards? Don’t forget, you told me we’ve already created computer programs that solved problems in ways the developers didn’t understand.”

  “So, what does Bostrom say we should do to avoid the risk?”

  “Besides not developing super-intelligent AIs to begin with? He suggests four methods. The first is boxing, which is basically air-gapping plus some additional safeguards. But boxing means you’re cutting off the AI from the world, which means it can’t directly do anything, so that’s limiting.

  “Next he talks about using incentives for achieving results – like letting the AI earn digital reward tokens. That seems bogus to me. Why would a super-intelligent AI care about collecting the virtual equivalent of worthless toys from Cracker Jack boxes? And if what we’re worried about is an AI disregarding some parts of its programming, why wouldn’t we worry about it disregarding the incentives as well?

  “Then there’s stunting, which means limiting the AI’s powers more fundamentally. Obviously, that’s not very desirable. The same limitations would likely also limit the AI’s ability to do what we built it to do.

  “And finally, he talks about trip wires. Those would be automatic early warning signals the AI would inevitably trigger if it was about to become dangerous. A signal like that would either warn a human minder or automatically shut the program down. For example, let’s say you create an AI and instruct it to never access the Internet. You’ve also boxed it, but just to be sure, you install a hidden Internet port the AI is bound to discover sooner or later. If one day the AI decides it wants or needs to disobey orders and access the Internet, a kill-switch would shut it down.”

  “Then what?” Frank asked.

  “Then you either try to redesign the AI so it can’t do that again, or you shut it down completely.”

  “After spending millions, or maybe billions, of dollars developing it?” Frank said. “You know that’s never going to happen.”

  “Which is also part of Bostrom’s point. People are a weak link. You just gave one example. Another would be the AI trying to trick, or co-opt, a human being to help it escape or otherwise do its bidding.”

  “Your AI is beginning to sound like Magneto in an X-Men movie.”

  “Not a bad example, now you mention it,” Shannon said. “Bostrom was concerned generally about the ability of super-intelligent AI programs to hoodwink humans. He thought that was a big risk. How would a human know when he was being tricked, or be able to avoid being co-opted, when the AI was so much more intelligent? He might be sure he was exercising free will even though he was being manipulated by the AI to do its bidding.”

  Frank squirmed at that one. “Well, even if those techniques might be flawed, Jerry should have done something to prevent Turing from escaping.”

  “To be fair, perhaps he did, and it just didn’t work. That’s Bostrom’s point – whatever you try is doomed to fail once a machine becomes much more intelligent than you are.”

  * * *

  “Are you sure this is a good idea?” Shannon said as Frank turned into the parking lot overlooking the Hoover Dam.

  “I think so,” he said. “We’re saving hundreds of miles going this way, and pulling in here should guarantee Turing spots us – there are security cameras all over the place. Once we’re across the bridge, we’ll head north. After that we can stay on dirt roads all the way through Arizona and New Mexico, not that we’ll want to all the time. We don’t want Turing to decide that finding us is hopeless and quit trying. I’d say we should let it catch a glimpse of us every five hundred miles or so to keep it engaged.”

  “I guess, but I’ll feel better when we’re back on dirt roads. Do we have to get out and look at the view?”

  “No – we don’t have a Jerry lookalike anymore. We’ll just drive through slowly so the cameras get a clean view of us.”

  Frank took the first exit off the highway after the bridge. “See? Piece of cake. There’s barely a paved road or a town in the whole northeast corner of Arizona. Which is why we better stop here for gas. After that, we’ll rely on the dashboard compass to noodle our way to where we’ll cross the road between Flagstaff and the Grand Canyon. We’ll let Turing get another peek there.” He pulled into a service station, started the gas pumping, and walked around to Shannon’s window. “Coffee?”

  “Sure. I’ll join you.”

  Back in the camper again, Frank followed a secondary road for a few miles before turning onto a dirt road. For the next four hours, they bumped their way along. For the first time since leaving Las Vegas, Shannon relaxed and took in what little scenery there was. That came to an abrupt halt when they heard a bell sound emanate from the dashboard.

  “What was that?” Shannon asked.

  “No clue,” Frank said, followed quickly by, “Uh-oh.”

  “I liked no clue better than uh-oh. What’s the uh-oh about?”

  “That was the out-of-gas warning going off. We’ve got about twenty miles of driving left.”

  “How can that be? We just filled up.”

  “Hmm. Or maybe we only thought we did. Remember, we didn’t stick around the gas pump while it pumped. There’s always security cameras at gas stations these days, and Turing knew which way we were headed if it spotted us at the dam. It probably kicked the gas pump off as soon as we walked away and made the pump look like it delivered a full tank. Screwed me out of fifty bucks, too. How about seeing if that disposable phone is in range?”

  She pulled it out of the glove box. “No luck.”

  “Well, that’s inconvenient,” he said for Shannon’s benefit. But it was worse than inconvenient. He’d been sticking to the least-traveled Jeep tracks he could find.

  “I thought you had a satellite phone?” Shannon asked.

  He looked sheepish. “Well, yes and no. I had one installed when I ordered this rig. But after I got back from my last adventure, I discontinued the service. It’s really expensive.”

  “So, what do we do?”

  “Well, I can’t recall seeing anything for at least the last fifty miles, so we might as well keep going. Our luck may be better ahead.”

  “Are you sure?”

  “I’ll look at the map.”

  He opened it and looked at what little it revealed. “It looks like we’re closer to a paved road ahead than behind, so we might as well keep going. For all we know, there could be a ranch just ahead.”

  There wasn’t. Half an hour later the engine sputtered and died.

  Frank climbed out of the cab for no good reason and looked around. The only tire tracks in the dust of the road were their own. He sat down on the back bumper and considered whether there was any choice other than the obvious, which was to take a very long, dusty walk. Why hadn’t he brought his mountain bike along?

  Shannon joined him and they scanned the blank horizon together. Except for sagebrush, creosote bushes, and some scrubby willows in a nearby wash, there was nothing to see in any direction except miles and miles of miles and miles. She sat down next to him.

  “Well,” she said. “It’s not as if we’re going to starve to death or die of thirst. We’ve got plenty of food and enough water to last ten days if we’re careful and don’t take showers. Someone’s bound to come by here by then, aren’t they?”

  Frank tried to recall what they had passed during the last few hours. All he could remember were a few empty cattle corrals and water tanks next to rusty, idle windmills. The few buildings they’d seen since leaving the highway were empty and weather-beaten, with windows devoid of glass.

  “Aren’t they?” Shannon asked again, this time with more concern.

  “Oh, I’m sure someone will. That’s got to happen. And eventually Jim will wonder what became of us.”

  But at what point? They’d waited a week to report in the first time. And what would Turing be up to in the meantime? When they didn’t reappear, it would decide its gas gambit had paid off. If Jerry had figured out how to give Turing a sense of humor, it would be having a belly laugh right now.

  But no use alarming Shannon. “How about we make dinner?” he said. “We’ve been on the move for over a week. Might as well take advantage of the situation. Have a good meal and a drink and enjoy the sunset. Why don’t you see what appeals to you in the pantry, and I’ll set up some folding chairs?”

  She patted his hand. “Sounds like a plan.”

  But before the sun reached the horizon, it began to get chilly. They stepped back inside, keeping their thoughts to themselves. “Another drink?” he asked.

 
