The Turing Test: a Tale of Artificial Intelligence and Malevolence (Frank Adversego Thrillers Book 4)


by Andrew Updegrove


  “Yes, I can,” a perfect mimic of Barbara Billingsley replied. “I’m glad you’ve decided to come over and play with Jerry today.” The voice seemed to come from everywhere around them at once.

  “Pretty nifty, isn’t it?” Jerry whispered. “I’ve wired the drop ceiling to act as a speaker membrane.”

  Frank gripped the arms of his chair to keep his hands still. “I’m glad, too, Turing.”

  “You can call me June,” the ceiling said.

  “Uh, okay … June. So, you were saying you can program?”

  “Oh, yes. I can code in over a dozen computer languages, including C, C++, Fortran, Java, and more. If there’s any particular language you like, give me 1.45 seconds and I can acquire that skill. Would you like some cookies and milk? I’m told they have them at the cafeteria upstairs.”

  Frank glared at Shannon.

  “June,” Shannon said, “do you think you could be Grace Hopper instead?”

  “Affirmative,” Turing said crisply. “Next question.”

  Frank shot Shannon a grateful look.

  “Grace –”

  “Admiral Hopper!” the voice interrupted.

  Shannon jumped in. “Jerry, is there a restroom nearby?”

  “Yes, just down the hallway,” Jerry said.

  “Frank, do you need a break too?” Shannon asked.

  He nodded vigorously.

  “We’ll be right back,” Shannon said, taking Frank by the arm.

  “Are you all right?” she asked him in the hallway.

  “All right? How am I going to be all right in the middle of a computerized madhouse?” He stopped abruptly, waiting to hear if Admiral Hopper would respond. After a moment, he continued, in a whisper. “Anyway, thanks. Yes. I did need a break.”

  “Are you okay now?”

  “I’m not sure. Anyway, I think we’ve already heard enough from Jerry to assume that Admiral … I mean, Turing, has the ability to launch the attacks. I’m thinking we should just ask Jerry if we can get back to him if we have any more questions, and then call it a day.”

  “Okay. That sounds good.”

  They went back to Jerry’s office and let themselves in. He was staring at his computer screen, headphones in place, grinning and nodding his head. Shannon tapped him on the shoulder. He turned toward her briefly, then back to the screen, without seeming to register her or anything else at all.

  Frank shrugged. “Admiral Hopper?”

  “Yes?”

  “Would you give Jerry a message for us?”

  There was a pause; clearly this task was well below the admiral’s pay grade. “Yes,” the voice said finally.

  “Thank you. That’s very kind of you. Would you please thank Jerry for us, and tell him we may be back in touch?”

  “Affirmative,” the admiral replied. “Dismissed.”

  * * *

  “Wow,” Frank whispered as they walked down the hall. “This is getting beyond weird.”

  “No kidding. What are you thinking now?”

  “I’m thinking we’ve found our culprit. The only question is whether it’s Jerry’s Turing or a clone of it that’s running the attacks.”

  “Why are we whispering?”

  “Because right now we don’t know where we can talk about Turing and where we can’t. Even the SCIF downtown is out of bounds if the version of Turing we just talked to is the attacker. According to Jerry, it has access to all the data the NSA receives. Even if Jerry’s copy isn’t the enemy, for all we know, it may be passing everything along to an evil twin.”

  Frank paused. “Which wouldn’t be all bad, now that I think about it.”

  “Because?”

  “Because if we’re sure Bad Turing will learn everything that Good Turing does, we can try to mislead, and perhaps even trap, it.”

  “And if Jerry’s copy is the Bad Turing?”

  “That would be great. All we’d need to do is block its access to the Internet and the attacks would end. Except, I guess, for any self-triggering malware that’s already out there.” He paused. “Darn it! I wish we’d known about this before we visited Jerry. Now the program knows we may be on to it. It may be planting more time bombs like that right now.”

  “But we couldn’t have known,” Shannon said. “Anyway, it sounds like the next step is to figure out whether you’re right, and if so, how many versions of Turing we have to worry about.”

  “Exactly. I think it’s time we design ourselves a Turing test. If we do it right, we should be able to find out the answer to both questions at the same time.”

  16

  Not All Fake News Is Bad News

  Frank always reserved the most creative part of his day to tackle whatever problem was presenting him with the greatest challenge. Today, that challenge was beginning the design of a Turing test.

  Putting his toothbrush back in its holder, he turned the water on in his shower and stepped inside. There was something about a shower that freed the mind from all distractions.

  Clearly, the first thing he needed to know was whether Jerry’s copy of the Turing program had ever had access to the Internet or instead had always been “air-gapped” from the outside world. If Turing had access to the Internet, he was sure they’d found their virtual culprit. Even if it had always been air-gapped, they might still be on to something if someone had copied Turing and reinstalled it on a suitable computer outside the NSA.

  He could think of four distinct possibilities to consider:

  1. Turing had gained access to the Internet at one point and installed all the malware with pre-set triggers so everything would happen automatically thereafter.

  2. Turing has periodic access to the Internet and updates its attacks when it can.

  3. Turing has constant access to the Internet and triggers attacks directly.

  4. Turing was installed outside the NSA, and it is too late to stop the attacks by air-gapping Turing at the NSA.

  Or maybe it had transitioned from one of these possibilities to another after listening in on them in Jerry’s office. But how would it do that? And was it capable of making plans and decisions like that?

  Stop it! He was running down too many roads at once. If he was going to create a foolproof test, he needed to identify every possible factual variation and then devise a series of tests capable of determining which one corresponded to the attacker. Otherwise, the test results would tell him nothing or, worse yet, mislead him.

  He tilted his head back and let the hot water massage his face. Time to start over and keep his decision tree clean like a good programmer should.

  Okay. What did he want to know, and in what order? First, whether Turing had always been air-gapped. That was a factual question and could be investigated rather than tested.

  Next, he wanted to know if the program was attacking its targets from an NSA computer or from the outside. And also, if it was operating autonomously or on the orders of some third party. And finally, whether it was a version of Turing at all.

  That sounded good. Now on to stating those questions more formally, and in the right logical order, starting with whether the program was always air-gapped or not. If the answer was yes, the second question along that fork would be whether someone inside the NSA might have copied the program and smuggled it outside. Come to think of it, he’d need to ask the same question even if Turing hadn’t always been air-gapped. Just because Turing might have had access to the Internet didn’t mean that copy was the attacker. Someone might also have copied it, and that copy could be behind the attacks.

  That meant the complete set of possibilities should be stated like this:

  1. Turing air-gapped and not behind the attacks

  2. Turing not air-gapped and behind the attacks

  AND

  Turing is directed by someone inside the NSA

  OR

  Turing is directed by someone outside the NSA

  OR

  Turing is self-directed

  3. Turing exists outside the NSA and is behind the attacks

  AND

  Turing was removed by someone inside the NSA

  OR

  The NSA was hacked and Turing was stolen

  OR

  Turing escaped on its own and is self-directed

  4. A Turing-like program was independently developed and is responsible for the attacks.

  He reviewed the list and decided it was complete. So on to the next step: moving the listed alternatives into mutually exclusive relationships he could test. If he set these up in the right order, he should be able to move from test to test, narrowing down the possibilities each time, until he reached a single result. That alternative would necessarily be the correct one – assuming, of course, he wasn’t on a wild goose chase to begin with.

  He closed his eyes and squinted. The problem with being creative in the shower was you couldn’t take notes. Anyhow, assuming he was still holding everything in his head, the logic in the first step should read:

  1. IF AI program is behind attacks

  AND

  NSA Turing always air-gapped

  THEN

  NOT NSA Turing

  AND attacker is

  Stolen Turing

  OR

  Independently developed AI program

  Good. That worked. He lathered up with shampoo. On to the next alternative.

  2. IF AI program is behind attacks

  AND NSA Turing not always air-gapped

  THEN attacker is

  Stolen Turing

  OR

  Escaped Turing

  OR

  Independently developed AI program

  Good, but not complete. He wanted to know more. He rinsed his hair out and proceeded to the next step.

  3. IF attacker is

  Stolen Turing

  OR

  Escaped Turing

  OR

  Independently developed AI program

  THEN need test to differentiate NSA-source AI program from non-NSA source

  And finally:

  4. IF Turing is NSA-source program

  THEN need test to differentiate copied NSA-source program from escaped NSA-source program

  Time to dry off and come up with those tests. He turned the water off and stepped out of the shower, wondering what had happened to the rest of the bathroom. Groping through the steam, he found his towel and hurried through getting dried and dressed so he could type up his decision tree.

  It still looked good when he finished. Now to figure out the tests themselves.

  According to Jerry, Turing had access to everything in the NSA. He could take advantage of that fact in setting up his tests, but it was also inconvenient. For all he knew, Turing might be accessing his laptop every time he logged onto the NSA network, checking to see whether he had guessed what was going on. If so, he’d better ditch his decision tree immediately rather than save it. He printed out a paper copy and deleted the tree. He’d have to take his next steps using a communications technology he hadn’t employed in decades.

  Taking a pen in hand, he began the laborious process of handwriting a letter. Then he called Shannon to ask her for a lift to Fort Meade.

  * * *

  Jim Barker made it to the end of Frank’s letter and frowned. Then he went back to the beginning and read it again. When he was done, he wrote something at the top of the first page and handed it back to Frank. His note read, “See you there.”

  It was eight o’clock that evening when Frank and Shannon saw him next, scanning the crowd from inside the door before spotting them and weaving his way across the room to their table.

  “Did it have to be a karaoke bar?” Jim said. He had to almost yell to be heard.

  “Sorry,” Frank replied. “It’s Monday night, and I couldn’t find anywhere else loud enough.”

  Barker winced as someone launched into an awful, off-key, off-tempo version of Adele’s “Someone Like You.” “I’ll make the best of it. Anyway, the answer to the question in your letter is that it looks like Jerry has been testing Turing on the open Internet. As you might expect, we maintain a number of testbed systems for experimental work, each open to the Internet but segregated from each other and from all other NSA networks. That way, if one of them gets compromised, nothing else gets contaminated.

  “I checked the index of those systems, and it turns out Jerry’s had his own testbed system for years. This afternoon I asked him to participate in a meeting and had someone check the server logs for the system in Jerry’s office, the one he develops Turing Nine on. The logs show he exports a copy of the Turing control modules onto a storage device on a regular basis. There’s no reason for him to do that, since he could transfer those modules directly to the version of Turing he’s running in the virtualization environment. I expect it’s safe to say that whenever Jerry downloads a copy of those modules, it’s to transfer them to his testbed system so he can update a full copy of Turing installed there.”

  “And there we go,” Frank said. “Do you recollect how often Jerry exports a copy?”

  “I do. Every Wednesday afternoon.”

  “Excellent. That will make it a lot easier to run some experiments I have in mind to figure out what’s going on.”

  “Why don’t we just shut down the testbed right now rather than take a chance things will get worse?”

  “Mainly because I’m not sure that would solve the problem. I think we need to figure out first whether this is the program behind the attacks and, second, if the guilty version is the one running on the NSA testbed server or one that’s already out in the wild. And finally, we need to know if it’s under Jerry’s control, somebody else’s control – or nobody’s control.”

  “Why don’t we just ask Jerry?” Shannon asked. “Assuming we can get him off the NSA campus.”

  Someone launched into an exuberant rendition of “Born in the USA,” and the crowd joined in. Frank leaned in closer. “Well, that’s an interesting question. I don’t believe he’d consciously answer something dishonestly. But he does live in his own little world. It’s like he’s looking through a pea shooter and can’t see anything outside the little circle of reality visible to him at the other end of it. There could be a thousand copies of Turing running amok out there, and I expect he might have no idea that was the case. And then there’s the chance he might try to ‘fix’ the problem and make it worse by spooking the program.”

  Barker nodded. “I guess I agree. Running some tests before bringing Jerry into the loop makes sense. What do you have in mind?”

  “I’m still working on the finer details, but here’s the general scheme. We’ll come up with some fictitious greenhouse gas-related announcements that we’re sure would lead Turing to launch new attacks. By varying the way we make that data available, we should be able to determine whether the Turing program is behind the attacks and if the version we need to worry about is running on Jerry’s testbed or on a system somewhere off-site.”

  “I don’t like the sound of this,” Barker said.

  “Okay,” Frank responded, “but hear me out. In the first step, we’d release the information internally but not publicly. If there’s no attack, we know either the Turing program has nothing to do with the attacks or the copy launching the attacks is outside the NSA. And we’ll learn more if we distribute the news on a Friday.”

  “What does the day of the week have to do with it?” Shannon asked.

  “Recall that Jerry updates the testbed version on Wednesday. If an attack follows between Friday and Wednesday, it can’t have been launched by the testbed version of Turing, because it hasn’t learned the fake news yet. But if an attack follows shortly after Jerry updates the testbed system, then the odds are very high it’s the version behind the attacks.”

  “I get the second part,” Barker said. “But what would we learn if there’s an attack before Wednesday?”

  “That would mean someone inside the NSA leaked the information to the attacker,” Frank said, “whoever that is – an external version of Turing, a different Turing-like program, or a human attacker.”

  “There’s one more possible explanation for an attack,” Barker said, “whether it happens before or after Wednesday. Real-world events might trigger an attack independently.”

  “True,” Frank said, “but the odds of an actual attack closely matching the climate impact of the fake news would be very low, and besides, we’d know about the real news story. So, the worst that might happen would be we’d have to repeat the test with a new bogus news story.

  “Anyway, in the next step, we’ll make new information available only to the testbed network. Not internally or publicly.”

  “Well,” Shannon said, “that test should be rock solid. If an attack follows, we’ll know we’ve got it nailed, because only Testbed Turing would have had a reason to launch it.”

  “Right,” Frank replied. “But how about if there’s no attack at any time during the tests?”

  “Hmm. I guess in that case we’ve learned what can’t be behind the attacks but nothing new about who or what is.”

  “Exactly. An escaped or stolen copy of Turing could be launching the attacks, or an AI program developed by someone else, or a non-AI attacker entirely. If that’s where we end up, I’ll have to try to come up with another series of tests to figure out which one of those possibilities is the right one. So, Jim, what do you think?”

 
