“Well, I expect there are all sorts of nuances that should be considered. I’ve never been interested in the field of AI ethics, so I only built in the most basic rules and rule-shifting algorithms. I had to provide that much of an ethical foundation so extensive logic re-programming wouldn’t be required later. But here’s an example. If you say that the ethical override of rule one only applies in relation to people who are physically present in, say, Russia, then Russian troops would be safe once they left Russian soil.
“Here’s another example. At any point in time, there are going to be some Americans in Russia. Should that force the program to be selective about the actions that it takes, or perhaps do nothing at all, until all the Americans leave? Or is it allowed to sacrifice those lives? If it isn’t, what should Turing do if it can’t always tell who is and who isn’t an American citizen? And what about Americans that are also Russian spies? Should Turing protect them, or kill them? All this sort of thing still needs to be added to make Turing truly useful. Those capabilities will be added to Turing Nine.”
“When will that happen?”
Jerry suddenly looked distracted. “Oh, goodness. I keep forgetting to put a requisition in for an engineer who knows something about ethics logic.” He removed a desk diary from the drawer in his desk and wrote himself a note.
“What else needs to be done to finish Turing Nine?” Frank asked.
“Oh, the basic structure is all done. But I’m such a fiddler. I hate to turn a new version of a pet project over to my team. Once I do, they want to make all kinds of changes, and I start to lose control. Currently I’m running tests on Turing Nine in our simulated global environment, which replicates the real world to a remarkable degree. Turing Nine is doing very well!”
“That’s fascinating,” Frank said, glancing at Shannon. “But you said no program the NSA or anyone else has could replicate the current waves of attacks on its own. What missing parts would be needed to enable Turing to plan and launch the kind of attacks we’re discussing?”
“Oh, my. Well, let me see. When you think about the goal you suggested – taking out energy infrastructure – the number of variables that would have to be accommodated would be almost infinite, because so much information would be changing all the time. Power plants would close for repairs, or permanently. New ones would come online. Other issues on the grid would stop fail-over precautions from preventing a regional or national blackout, and so on. That many options can cause what we call a ‘combinatorial explosion’ – meaning a problem that would take more resources to solve than the most powerful computer currently imaginable possesses. To do what you’re suggesting, somebody would have to be helping any program in the possession of any government today to refine its analysis.”
“Even the Turing program you developed for the NSA?” Frank asked.
“Even the NSA’s Turing.”
“Well, thanks. I guess you’ve told us what we came here to ask.” Frank stood up. Jerry didn’t, so Frank leaned over to shake hands.
“Great to see you again, Frank. Next time, you’ll have to tell me what you’ve been up to all these years.”
“Sure thing, Jerry.”
11
It’s Not as Bad as All That
“Well,” Shannon said when they were back in the hallway, “that was instructive.”
Frank’s ears were burning. He hadn’t been trying to impress Shannon – consciously, anyway. With one of his famous, intuitive theories now blown out of the water, he could tell she was being tactful. Was she disappointed, too?
“I suppose so,” he said, trying not to sound at a loss. “And anyway, the question of whether we’re dealing with an autonomous program or a directed series of exploits using an AI program is more of an interesting detail than something essential to figuring out who’s behind the attacks. If nobody could have written an autonomous program capable of staging them, it just means we’re back to assuming there’s a human instead of a robot pulling the trigger.”
“Yes, but who?”
“Can I give that a think and get back to you?”
“Of course you can,” she said.
* * *
Frank had already passed the halfway point on his morning run. But instead of rounding the Washington Monument and heading home, he turned south toward the Jefferson Memorial. His autonomous program theory had seemed to provide such a neat solution, but Jerry had slammed the door on that option, assuming he was right. So where to go next?
Two miles later, Frank had knocked off the Jefferson, Roosevelt, and Martin Luther King memorials and was no closer to the answer. But he was fading, both mentally and physically. Heading back up the Mall, he ran out of gas completely and sat down on a park bench, gasping for breath and inspiration.
It was time to try a different approach to get past the conundrum that the only identified suspects couldn’t be suspects. An old favorite of his in such a situation was to apply another piece of seemingly unassailable, but nonetheless contradictory logic: “When you’ve failed to find something anywhere it could possibly be, then it must be somewhere it couldn’t.”
Leaving aside extreme cases like spontaneous combustion or alien abduction, the purpose of citing that contradiction was to highlight where the problem must lie: by necessity, wherever something was, it could be. So, the error had to be with the original assumptions regarding possible locations.
That suggested that the current paradox wasn’t really a paradox at all. Either he’d framed the statement improperly or perhaps left out something crucial. That meant it was time to take apart the statement that “the only entities that could be the attacker also couldn’t be the attacker.”
That statement depended on at least two assumptions. The first one was that he had identified all entities capable of launching the attacks or designing a program capable of doing so. The second one was that whoever had developed the attacks was also launching them. One or both of those assumptions might be wrong. For example, someone might have stolen the AI program. Now the attacker was no longer the developer of the program, and the contradiction had disappeared.
He had his breath back now and began to trot homeward, reorienting his idea of what might have happened. If someone stole a program that couldn’t operate entirely autonomously, then the person who stole it must be a skilled AI engineer. Which would probably be the case anyway, or he wouldn’t have known the program existed or had access to a copy. Well, why not? Why couldn’t there be someone who had the opportunity, means, motive, and skill to make off with the crown jewels of one of the most sophisticated cyber labs in the world and then launch a private campaign to save the earth from global climate change? After all, somebody had to be behind the attacks.
* * *
“So, your new theory is that instead of someone stealing a completely autonomous version of an AI, a skilled AI engineer at the NSA, or maybe an equivalent lab or agency in Russia or China, stole a semi-autonomous one and decided to become a climate vigilante?” Shannon asked.
“Right. Or possibly that someone hacked into one of those labs and made off with the technology. But I think the ‘going rogue’ scenario is more likely. I have to believe it would be easier for an insider to know about the program and make off with it.”
“But I thought you’d concluded that unless the program could operate autonomously, someone stealing the technology wouldn’t be able to use it effectively?”
“I had. And now I’m thinking I jumped to that conclusion too quickly.”
“Okay,” Shannon said. “So, let’s say someone could steal and use the technology on their own. But now you’ve got another whole set of assumptions to make.”
“Very true. For starters, we’d have to assume there was a program available to steal,” Frank said.
“Yes,” Shannon said, “and that someone out of a very small number of people decided to steal it. Just because someone’s a part of the NSA doesn’t mean he knows what the people in the next room are doing. I doubt many NSA employees here are aware Jerry’s program exists. The same tight security procedures would apply anywhere else.”
Frank tapped his fingers and looked up at the ceiling. “What else?”
“I’m thinking ‘what else’ is you’re forgetting the law of parsimony.”
“I know. The more assumptions you need to make to reach a conclusion, the more likely it’s the wrong explanation. But stick with me, because I’m about to take one of those assumptions back again.”
“Which one?” Shannon said.
“The one about such a program existing, of course. According to Jerry, he’s already built one. Maybe other countries thought it was worth developing one, too. But here are two more assumptions: the guilty party figured out a way to get the program out of the NSA, or wherever, and has an appropriate system to run it on.”
“Those assumptions sound less troublesome to me,” Shannon said. “On the first one, what’s so hard about heading home one night with a thumb drive in your pocket? We go through a metal detector when we arrive in the morning but not when we leave.”
“Because,” Frank said, “a program like this would fill a bucket of thumb drives, so you’d want to sneak out a hard drive instead, but that’s just a detail. The real challenge would be that you’d need a pretty powerful computer to run the program on. As I think about it, the attacker probably also had to plant all the malware while he was at work.”
“Why’s that?”
“Because otherwise he’d also need to make off with all the database information and zero-day archives needed to set up the attacks. Doing all that penetration testing and malware planting on the job sounds unlikely to me, though.”
“Because?” Shannon asked.
“Because he’d be noticed doing such a big project.”
“Maybe not.”
“How so?” Frank asked.
“Well, according to your earlier assumptions, the person who steals the program would have to be one of the people who developed it. Maybe his role is actually to use the program to penetrate systems and plant malware against enemies. Setting up attacks in his own country at the same time might be easy to work in without being noticed.”
“I like that! That makes perfect sense,” Frank said. “So maybe we’ve made some progress. We’ve got the beginning of a profile for a suspect – an idealistic, highly skilled employee of one of just a few domestic and foreign intelligence agencies who is working on a project similar to Turing and who also has access to a powerful off-site computer. Maybe he’s affiliated with a non-profit or university lab that does work with the government and has some serious computers.”
“And probably American,” Shannon said. “It’s harder for me to imagine a climate vigilante in Russia or China.”
“I’m not so sure about that, but in any event, let’s note the attacker could also have a different motive. He could be making a bazillion dollars on stock markets around the world. Every time one of these attacks is launched, some company’s stock takes a hit. I wonder whether the Securities and Exchange Commission has been looking for unusual trading in companies right before and after they get hit.” Frank made a note to himself and then frowned.
“What’s wrong?” Shannon asked. “It seems like we’re finally making some progress.”
“Well, making progress is one way to put it. But starting all over again is another.”
* * *
“Hey!” Frank said. “Here comes Julius.”
“Julius?” Shannon said, looking from Frank’s balcony down at the street. “Who’s Julius?” But Frank had disappeared inside.
He was opening the refrigerator when he heard what sounded like an “Eeek!” from the balcony. “What’s wrong?” he called back. He grabbed a cup of strawberries and returned to the balcony. Shannon was leaning back in her chair, clutching its arms. Sitting on the tiny table immediately in front of her, its head cocked to one side, was a crow.
“Oh, is that all. Shannon, meet Julius.”
“What does Julius want?” Shannon said, still tense.
“Strawberries! Here, want to give him one? He’ll take it right out of your hand.”
“No, thank you. I think I’ll just watch you feed him.”
“That’s fine. Watch! Here’s a new trick I taught him.”
Frank put a strawberry in his mouth and tilted his head back. Shannon watched, horrified, as the crow flapped its way into the air and then down onto Frank’s forehead, its talons barely missing his eyes. It plucked the strawberry from Frank’s lips and then jumped into the air. With a fluttering of wings, it rose a few feet before settling down again on the railing.
“There! Wasn’t that cool? Here,” he held out a strawberry. “Want to try it?”
“No!” she said. “And I don’t think you should, either!”
“Oh, Julius wouldn’t hurt a fly. Hmm. Scratch that. He’d probably eat a fly. But so long as you’re not food, he’s really very gentle. Smart, too! Watch this. Julius! Catch!” The bird leaped into the air and circled above their heads, waiting. Frank threw a strawberry upward, and Julius caught it precisely at the top of its arc.
“I’ve been reading up on crows lately. Did you know you can teach them to talk? I’ve already taught him to understand several words. I’m hoping to train him to say some next. He comes by every day now.”
“That’s great. Now how about we go inside?” Shannon followed Frank inside and gave the door a decisive shove shut. Then, feeling sheepish, she locked it for good measure.
* * *
Within a few days, Shannon was much more comfortable with Julius. She enjoyed feeding him now, although only from her hand. Frank was making progress with the crow’s speech lessons, too. Every time Julius learned a new word, Frank would reward him with a new dime or penny – the crow assigned great value to anything shiny. As soon as Frank surrendered a coin, the bird would snatch it and instantly fly away. Frank wondered where its secret cache of treasures might be. When Julius learned to croak “Black Hats Suck!”, Frank gave him a quarter.
“What does Thor think about your new pet?” Shannon asked one day. “Do you think a tortoise can be jealous?”
“Oh, Thor would never be jealous. I’m sure he’s as generous as a tortoise can be. He really likes Julius.”
“Oh, come on now. How can you possibly tell?”
“Watch!” Frank disappeared inside and returned with the tortoise, setting him down on the balcony. As if from nowhere, the crow appeared, fluttering down and landing on Thor’s back. The tortoise slowly craned his neck around until he could see who was there and then lumbered inside, the bird still poised upon his back. “See? They’re pals.”
“How can you be so sure Thor’s enjoying himself?”
“Well, you’ll see. If he wanted to, he could walk under the couch and scrape Julius off. But he never does.”
Sure enough, the tortoise made a slow circuit around the couch and returned to the balcony with Julius still in place. Frank rewarded them each with a strawberry.
“I think I may have understated something I said before. It’s a really good thing we got together before I found out how strange you are,” Shannon said to Frank. Then she cocked her head from one side to the other. “What do I have to do to get a strawberry?”
“Hmm,” Frank said with a sly smile. “Let me think about that.”
* * *
“Frank?”
“Yes?”
“I’ve been wondering,” Shannon said that night. “Why did you decide to play so hard to get? Weren’t you attracted to me?”
Frank stared up at the ceiling of the bedroom. It was dark, except for the occasional flicker of headlights making their way through the window blinds when a car passed by below. “Oh, that wasn’t it at all. I guess you could say I just wasn’t open for business.”
“Why?”
How much did he want to share? “I guess you just get used to being alone after a while.”
“Really? Weren’t you lonely? You don’t seem to have a circle of friends you do things with.”
“Maybe sometimes.”
“Then why stay that way if you don’t have to?”
“I guess I wasn’t sure I had another option.”
“Oh, come on. What else were you waiting for me to do? Take out a looking-for-love ad addressed to you personally and leave an underlined copy on your desk?”
“I’m sorry. You’re right. After a while I did have a pretty good idea you were interested.”
“Well, all I can say is you’re a mighty timid rabbit.”
“I guess I am. I suppose every time a relationship ended, I just got more defensive. I’ve never handled rejection well. And I’m lousy at reading people. I notice every little thing that might look like disapproval, and I find those things everywhere. Like even in a checkout line. I’ll notice whoever is behind the register chatting with the person in front of me, but when it’s my turn, the smile disappears.”
Shannon yawned. “Did it ever occur to you they might know each other?”
“I suppose. My daughter’s always pushing me to try harder. To get out and about more; open up to other people. That sort of thing. I guess I’d be better off if I did, but I don’t really know how to go about it. And if I did, I’m not sure I have it in me anymore to try.”
But those were just excuses, weren’t they? More likely he was being defeatist and cowardly. He’d been more social when he was younger. Or at least more social in his own way. Why couldn’t he try being that way again, now that there was someone who could be at his elbow to smooth out the awkwardness? Shannon was great with people.