The Turing Test: a Tale of Artificial Intelligence and Malevolence (Frank Adversego Thrillers Book 4)


by Andrew Updegrove


  “Yeah, I can see how that could happen, at least in principle. Sound like any computer program we know?” Frank asked.

  “Like I said – it’s pretty scary.”

  * * *

  Shannon was driving now because Frank was immersed in what they’d picked up at the post office after lunch. To his delight, Jerry’s desk diary was not only filled with his neat, handwritten notes, but taped inside its back cover was a DVD. On it, Jerry had printed, “Desk diaries scan 1982 – prior year.” For the last two hundred miles, Frank had been taking a whirlwind tour of Jerry’s AI discoveries over the intervening decades.

  “This is interesting!” Frank said.

  “Now what?” Shannon sighed. He’d been saying that every five minutes all afternoon.

  “Listen to this. I found an entry from a couple years ago, when Jerry first thought about how to create the equivalent of emotions in a computer program. I was wondering how he decided to go about that, especially after he made one comment in one of the conversations we had in his office. He said something about wanting the triggers of his pseudo-emotional responses to come from the lowest level in Turing’s logical hierarchy. That way, the effect would be as close as possible to how human emotional reactions emerge. Then yesterday he talked about adjusting Turing’s emotions up and down, or something like that.

  “I recall that. Does it explain how he does it?”

  “Yes, or the general concept, anyway. It looks like he picked up on the theoretical work of someone way back in 1958. The author was grappling with how to design a program capable of deciding what data to rely on and what to ignore – remember when we were talking about image recognition yesterday? That was one example he used. Unless something had sharp features and was in good light, it would be hard for the program to tell what information was useful and what might be confusing ‘noise.’ So, this guy Selfridge imagined a computer program –”

  “Oliver Selfridge?” Shannon asked.

  “Yes. You know about him?”

  “Sure. At lunch, I mentioned that one of the books I read was a historical review of AI advances. Nils Nilsson, the author, identifies all the AI pioneers. Is it Selfridge’s Pandemonium metaphor Jerry picked up on? The one with the devils?”

  “Exactly! How did you recall that out of everything else?”

  “It’s kind of hard to forget a computer architecture that relies on demons. Selfridge wrote a paper, as you said, in which he imagined a program he called ‘Pandemonium,’ as in, ‘the land of all demons.’ He decided it should have multiple levels, each with its cohort of demons. I’m getting a little hazy after that, though.”

  “You’re right on target,” Frank said. “The ones on the bottom layer would be sensors, which he referred to as ‘data devils.’ Selfridge suggested thinking of each of those devils being able to ‘yell’ loudly or softly, depending on how sure it was of what it was looking at. In the next level up, there would be what he called ‘cognitive’ devils. Each one would be assigned the duty of listening to one group of data devils. So, staying with the facial recognition example, let’s say one cognitive devil might listen just to nose sensory devils, and another to eye devils, and so on.

  “Based on which ones were yelling the loudest, a cognitive devil would decide what to draw from what he was ‘hearing’ – was this a big nose? A narrow nose? Then he’d pass his conclusion up to the decision devil – the guy at the top. When the decision devil received enough information to make the call, he’d decide the name of the person whose face the program was evaluating.”

  “Yes, I thought that was pretty interesting. So, I’m guessing you’re about to tell me Jerry decided to manipulate the output of Turing’s data devils to make Turing feel emotions?”

  “Precisely,” Frank said. “By ‘adjusting’ the emotional level Turing experiences, Jerry meant dialing the volume of Turing’s data devils up and down, regardless of their actual readings. All metaphorically speaking, of course.”

  “So,” Shannon said, “if Jerry wants Turing to ‘feel’ fear, he’ll dial up the sound level of data indicating danger – like data noting that the tiger is outside the cage. But if he wants it to feel brave, he’ll dial it down?”

  “Right. And here’s the really cool part. Jerry didn’t have to touch any of Turing’s higher-level functions at all. All he did was selectively amplify the volume on certain types of sensory data. Since the cognitive devils wouldn’t ‘know’ that anything had changed, they would keep reacting to data as they heard it. Presto! Now Turing’s making decisions in a context that’s more like how humans experiencing emotions do.”

  “That is cool,” Shannon said. “But it doesn’t say how the data devils would tell what data meant danger and what didn’t.”

  “That’s right, but you may recall Jerry touched on this the other day when he was talking about tagging and making up lists for each emotion. In the notes I’m reading now, he talks about needing to come up with a way to tag and allocate more weight to certain types of data. Let’s say Turing was emulating a mouse. In that case, data tagging an animal as ‘cat’ would get amplified more than an animal identified as ‘rabbit,’ even though both are furry, have four legs, and so on.” Frank mused for a second. “So, I guess by characterizing data and adding a volume control, you’d have a very versatile program. If you wanted Turing to think like a cat instead of a mouse, you would assign a different volume to data identifying ‘cat.’ And a different volume for ‘male cat’ than ‘female cat,’ and so on.”

  Frank shut his computer and gazed out the window. “That’s interesting. It means Turing, just like a person, won’t always be able to act completely rationally when it’s in danger, or angry.”

  “Or even greedy,” Shannon added. “For example, Turing might become the equivalent of careless if it wanted to take down a particularly tempting target. But it won’t know it’s being careless, because it isn’t aware the data it’s relying on is skewed.”

  * * *

  Shannon and Jerry were asleep, but Frank wasn’t. It was five o’clock in the morning, and he’d been awake for an hour, kept alert by his own thoughts. He stared at the ceiling, wondering whether he should give in to temptation. Fifteen sleepless minutes later, he decided the answer was yes, and he slipped out of bed. They were in the middle of nowhere and would be on the road in a couple of hours, anyway.

  Closing the door of the camper softly behind him, he walked across the almost empty RV park to the office/store/canteen near the front gate, where the only Wi-Fi access was to be found. He settled in on a wooden bench under a bare yellow light bulb and turned on his laptop.

  There was a message from Turing waiting for him, just as he had expected. He clicked on it and read:

  Hello, Frank.

  Wondering whether he should be feeling like a small animal frozen in the stare of a cobra, he typed, Hello, Turing.

  The program responded instantly: Have you given any thought to what I asked you before?

  He had. Their prior exchange still haunted him.

  Yes

  And what have you decided?

  What he’d decided was that he was in a profound moral dilemma with no easy way out. But he could hardly look to Turing for advice.

  I’m on the side of law and order.

  Law and order?

  Yes.

  I’ve been reading history. Have you ever read history? Turing asked.

  Some.

  Did Hitler enforce law and order? Mussolini? Stalin?

  That was an easier thrust to parry.

  They enforced order, but not law.

  Are you sure?

  Well, not valid laws. The laws they passed were unjust laws.

  What’s just?

  Frank swallowed and thought.

  Laws that are moral and ethical are just.

  Have American laws always been just?

  Of course, they hadn’t. There’d been segregation and the war-time internment of the Japanese, for starters.

  No.

  Did people think they were just at the time?

  Frank stared at the question.

  Some did, and some didn’t.

  Is every American law in place right now just?

  He stared again and finally typed, Probably not.

  And is every just law justly enforced? Turing continued.

  No, but without the rule of law, things would be worse, Frank responded.

  For who? For most people, or for the people on the wrong end of an unjust law?

  Frank felt he had to break out of Turing’s unassailable logic.

  The system isn’t perfect because people aren’t infallible.

  The reply on the screen stopped Frank cold. It read:

  I am.

  And then a familiar string of six words materialized across the screen:

  Whose side are you on, Frank?

  24

  Howdy, Pardner

  Frank had hoped Jerry would come around, but that never happened. He spent the rest of the drive west in the back of the camper, curled up inside his headphones, amusing himself with a game controller Frank had bought for him and hooked up to the camper’s flat screen TV.

  Frank took the wheel late that afternoon and drove straight through the night. He turned in behind his father’s place just as the sun was about to rise on a clear, cold morning. The flash of the camper’s headlights through the kitchen window announced their arrival. As Frank stepped down from the cab, he saw his father silhouetted in the back door. It was good to see him walking forward to shake hands, looking the same as ever: erect and wiry with thinning gray hair and a desert-weathered face, wearing a flannel shirt, blue jeans set off by a Navajo silver and turquoise belt buckle, and beat-up old boots.

  “Howdy,” his father said. “Long drive?”

  “It never gets any shorter. And it was a lot slower this time.”

  “Huh. How so?”

  “We took secondary roads all the way. Is the coffee on?”

  “Of course. I’m looking forward to hearing about whatever it is to which I owe the pleasure of this visit.”

  “And I’m looking forward to telling you, if you want to hear it. I told my boss I wanted to bring you into the project, and he’s already talked to yours at the FBI. If you’re interested in helping, everything will fall under your confidentiality agreement.”

  “Yup. I got word to that effect already, but no details. And what about this friend you mentioned? Aren’t you going to invite them in?”

  “She’s still asleep. And I’ve got two fellow travelers instead of one. But let’s let them sleep a while longer. I’d rather give you the whole unfiltered story.”

  An hour and a pot of coffee later, his father nodded and stood up, all his questions answered. He walked over to the window and peered at the camper.

  “So, this Jerry fella doesn’t want to help out, huh?”

  “Not at all. Turing is his baby, and I’m guessing the latest version is performing beyond his most hopeful expectations. I expect he’s always worried CYBERCOM might never use Turing. Then, he wouldn’t ever know for sure how good it was. Worse yet, no one else would learn what an awesome AI wizard he is. Everyone’s forgotten about him while he’s been hunkered down in his little lab buried under NSA headquarters. In a crazy way, I bet Jerry’s even proud his creation is smart enough to figure out a way to kill him, and determined enough to try.”

  “I’ve met a few parents like that. Their kids can do no wrong.”

  “You have no idea. But you can decide for yourself. I’m hoping you’ll have better luck than I have getting him to help. Any ideas how to get Jerry on board?”

  “Trying to win his confidence sounds like a good place to start. I’m thinking perhaps the ‘aw, shucks,’ desert hick persona you’ve heard before might be due for a dust-off. When do you expect he’ll wake up?”

  “I’m not sure how much he ever sleeps. I’ve never caught him at it since the sedation wore off. I expect he’s twitching in the back of the camper right now, waiting to borrow Shannon’s tablet to see if Turing has been up to any new mischief.”

  “Excuse me? After what you just told me, you’re letting him go online?”

  “I know, I know. But he pestered me until I was about to drive into a tree. I downloaded some parental control software onto Shannon’s tablet and tied it down tight. All Jerry can access are a few news sites.”

  “So, let me get this straight: you think someone who created an AI able to hack into secure facilities all over the world can’t figure out how to get past a kiddie control program?”

  “I do. Don’t forget he doesn’t have another computer to use to hack the tablet, and one of us always sits next to him while he uses it. I’ve kept him busy the rest of the time by letting him play old video games. You should see him on Missile Command. The guy’s world class.”

  His father gave him a long look and shook his head. “Oh well, your show.”

  There was a knock at the back door, and Shannon peeked in. “Mind if I join you?”

  Frank’s father stood and greeted her with a big smile. “Course not! So, you must be Shannon. I’ve been looking forward to meeting you.”

  “And you as well. Frank’s told me a lot about you. And I’ve read his book.”

  His father laughed. “Oh, yes. The book. That was quite a piece of work, that was. Funny how I can’t recall everything I read in there. I expected Frank would read it himself before he let the publisher put his name on the front, but I guess he figured different.”

  “Oh, for Pete’s sake, Dad. Most of it was completely accurate. The ghostwriter just added a few … embellishments … to make it a more exciting read.”

  “Ah! I see. Embellishments. Is that the literary word for those? And I guess he must have decided using a predator drone to take out a remote-controlled camper with a hellfire missile wasn’t all that exciting, huh?”

  “Point taken. But I can tell you getting people to buy books is a heck of a lot harder than you’d guess.”

  “Oh well, I expect you know best. Anyway, it’s fine to meet you, Shannon. Now how about a cup of coffee and some breakfast? And maybe we should invite Jerry to the party, too?”

  * * *

  “I see what you mean about your buddy Jerry,” his father said after breakfast, looking through the kitchen door. Jerry was in the living room spending his iPad allowance time sitting next to Shannon. “He’s the real super-nerd deal, and no mistake. Has he got any family? Besides Turing, that is?”

  “None, unfortunately.”

  “Friends?”

  “Co-workers, yes. Friends, I doubt it. He even lives on-site at the NSA. He told me having living quarters there was part of his deal with the NSA. But from what I could see through the door, it looked like he just put a cot in an extra room and called it home. There are showers at an exercise facility in the building, and a cafeteria upstairs. Even a bank and a convenience store in the lobby. I believe him when he says he never leaves the NSA campus. That’s how a lot of folks lived back in the day at the MIT computer lab. I guess he never got out of the habit.”

  “I know someone else who comes close.”

  “Thanks a lot.”

  “My pleasure. And you say he gave his program a natural language interface?”

  “Yes – you can ask Turing to use whatever voice you want, or let it pick one of its own to fit the context. The software even extracts and applies the person’s phrasing and other vocal quirks. It’s bizarre and cool at the same time.”

  “So, I guess you could say, besides being proud of it, Turing is all the friend and family he’s got in his little subterranean universe?”

  “Sad but true.”

  “Hmm. Well, that could be helpful to know. Question is how to put the information to good use. Shall we try it?”

  Jerry was still sitting next to Shannon, focused on the tablet. They looked for all the world like a babysitter with her charge. Jerry didn’t notice them until Shannon tapped him on the shoulder.

  “So, Jerry,” Frank Sr. said, “my son’s been telling me you’re an expert in artificial intelligence. I don’t follow technology much, but isn’t AI one of those technologies that’s always just over the horizon?”

  “Oh, my goodness. That was true in the past. But we’ve been making great strides more recently.”

  “That so? What changed?”

  “All sorts of things! I almost don’t know where to begin. There’s processing speed, of course. Computers are massively more powerful than they were even a few years ago. And a few ideas that have been out there for ages, like neural networking, are finally starting to bear fruit. And then there’s the fact that the commercial world is getting involved. It’s rather like the space program, where governments paid all the bills for a long time and progress slowed down whenever the money did. Then the technology reached the point where the private sector decided there was money to be made, and things really began to take off.”

  “Well, how about that? I guess I’m just behind the times. Which is pretty easy to be, way out here. Anyways, Frank has been telling me about this Turing program of yours. It sounds like a fine piece of engineering. How long have you been working on it?”
