He turned to face the engineer who had been giving them a tour. “What are you guys doing in here?”
The engineer said, “They asked to . . . I mean, her hand unlocked . . .”
Pace said, “You may leave. We’ll finish this.”
The engineer scurried out of the room, red-faced.
When the door closed, Ava looked at Kim. “Why didn’t I know about this?”
He said, “These are highly classified projects. Very few people in our organization know they exist, and I prefer to keep it that way.”
“Even me?” Ava said.
Kim said, “We need to do this work, Ava. I understand why you might not be okay with it.”
“And you are?” Ava turned to Pace. “What about you? I know you’ve voiced ethical concerns within our company before. How do you feel about these tests, manipulating people’s opinions?”
Kim’s lips tightened. “If we didn’t, then someone else would. Pax AI must be at the forefront of this innovation. Think about what this could do if it ended up in the wrong hands.”
Ava said, “If all guns are outlawed, only criminals will have guns?”
“Something like that, yes,” Kim said.
Pace was looking down at the floor.
Colt said, “How does it work?”
Pace said, “We’ve meshed together several of our best AI programs. The deep-learning AI monitors millions of social media accounts. They detect what people are looking at, what articles they click on, how long they spend reading an article or watching a video. They track where each individual goes and match it up with various other bits of information to create a values profile. We cross-reference different surveys and activity patterns to determine what the subject believes. Another AI program introduces stimuli and measures the impact. Taken together, each person’s online behavior can be studied and mapped out. The more data we get, the better we become at learning how to influence opinions.”
Colt folded his arms across his chest. “How is this different than what advertisers and social media companies have been doing for the past decade? Political campaigns do this all the time. It’s very hard to sway opinion. Marketers spend billions trying to get people to pick one brand over another.”
Kim nodded. “You’re right. But you just named two organizations. Social media companies and political campaigns. Social media companies have much of the same capability we do. They have the data and can control a lot of the stimuli. They even use AI. But they normally don’t try to change opinion. Instead, they make money selling to advertisers who try to do that. But the advertisers are half blind with one hand tied behind their back since they don’t have all of the data or control most of the stimuli.”
Pace said, “Our program fully integrates into a person’s devices. We have all the capability of a social media company, but not just on the social media apps. We’re controlling everything the subject is served, with the goal of shifting opinion.”
Kim said, “Think of it this way. A political campaign advertiser can serve you a paid ad and take up about ten percent of your feed. Maybe fifty percent of your advertising. But they can’t control the organic social media exposure, or the recommendations from the streaming video apps. They can’t control what news articles you are served and recommended. Our AI program does all of that. And here is the important part: our AI measures how different stimuli change opinions after exposure, and learns what to show them next, with our ultimate behavioral change goal as a target.”
Colt said, “This is like a mind-control algorithm. You can really change people’s opinions? Your engineer mentioned something about deepfakes?”
Kim gave Pace a look.
Pace said, “The DOD has us incorporating some deepfake stimuli into the tests.”
Colt played dumb. “What are those?”
Pace said, “We can create manipulated videos that will appear real. Customized offensive stimuli that will change the beliefs to what we want. It’s mind-blowing stuff, and incredibly dangerous. But this is all under government supervision. And to be honest, we have more work to do on those. The images can appear smudgy or low-resolution.”
Ava said, “Can you show us this in action?”
Pace sighed and looked up at the screen. “This woman here on the TV screen was a vegan, based on her feelings toward protecting animals. She was one hundred percent against using animals as a food source. And this is her at the end of the test.”
“She’s eating a steak.”
Colt said, “How long did the opinion shift take?”
“Ninety days,” Pace said.
“This has to be illegal.”
“All five hundred participants signed waivers,” Kim said.
“Are the subjects made aware of what the experiment is trying to do?”
“No, it wouldn’t work if they were aware,” said Pace. “Listen, obviously some of us have qualms about these studies. But we need to understand the technology. Think about what you could do with this type of AI. Technology is changing how people interact. More and more often, human beings are isolated, alone, using their own technological devices to get information. It would be hard to know if one individual’s information was being customized and manipulated if you didn’t have as much outside contact with others. This would have been impossible a few years ago. But now, with some of our more powerful AI programs, it can be done.”
Colt said, “This could swing an election.”
Pace said, “Easily. It could also overthrow a government. Destroy a rival company. Think about what you might be able to convince people to do if you had no morals. Think about what a computer program might convince people to do, if doing so helped it achieve its goals.”
Colt felt a chill go down his spine. “I would like to think that I am strong enough to avoid manipulation by a machine.”
“We all would,” said Kim. A beep at the door. “Come in.”
The door slid open. “Mr. Kim, your helicopter is arriving soon.”
Kim looked up. “Time to go.”
Kim led them out of The Facility. The building was a collection of hallways and research rooms, data storage and security rooms. The quantum computer and its support systems made up a significant chunk of the building.
As they walked, Colt continued asking questions. “Luke, what are those?”
Pace said, “Battery storage chambers. Power is an issue here. We need to have access to huge amounts of power at the right moments, and we can’t run out.”
“Is that because of the quantum computer? That must suck a lot of power.”
Kim shook his head. “No, actually. One of the benefits of quantum computing is that it doesn’t take up as much power. Your typical supercomputer takes up around five megawatts of power. Enough electricity to power a town. But our quantum computer uses only about five hundred kilowatts of power. It’s very efficient compared to traditional computers.”
Colt said, “So why do you need all those solar panels and windmills and batteries?”
“Deep learning uses good old legacy computing. It’s the legacy computers that use up so much energy and those servers generate a lot of heat. Did you see that water tower on the way in?”
“Yes.”
“It’s not for drinking. If we ever ran out of electricity to blow our fans and run cool air across our data storage and servers, we have a backup system of water cooling. The heat generated by all our computers would transform that liquid water into steam. The transfer of energy will cool it down, in the case of an absolute emergency. If you ever see steam shooting up from above The Facility, that means we’re about to lose all our data and burn up. We need power to run our computers, but also to cool them. Know how fast that water would be transformed into vapor?”
“How fast?”
“About two minutes.”
“No way.”
“Seriously.”
“It’s the size of a city water tower. That’s nuts.”
“Major data centers use these emergency systems as well. If the shit really hits the fan, it buys us a few extra minutes. We have backup electrical systems, but they can take a few seconds to get online. So the water tower could prevent hundreds of billions of dollars in lost research.”
They walked out of the main exit and toward the helicopter landing pad. Once again, Colt was struck by the expansive Pacific Northwest forest. Mountains and evergreen trees as far as he could see. The cool, crisp air. And the quiet, except for a humming drone emanating from The Facility and the occasional cry of eagles echoing throughout the land.
He looked over at Ava. She was unusually quiet.
While they waited for the helicopter, Colt said, “You mentioned that Kozlov was crucial to your AGI progress. What could he do that was so special?”
Kim said, “It’s a bit complex, but . . . are you familiar with the control problem?”
Colt said, “Let’s pretend I’m not.”
Kim turned to Pace, who said, “The control problem. Okay. Our goal is to create a super-intelligent AI system. One that is safe and friendly. One that will help us, and not hurt us. In order to do that, we must make sure that any super-intelligent agent wouldn’t just take control of its own programming after birth and prevent us from modifying it.”
Colt said, “And Kozlov used this neural interface to solve the control problem?”
Kim shrugged. “More like managed it than solved it.”
Colt looked up, hearing the echo of a distant helicopter. “Is that our ride?”
“Yes.” Kim checked his watch. “They’ll be here any moment.”
Colt said, “So how was Kozlov able to manage the control problem?”
Pace said, “There are several theoretical solutions to the control problem. One of our safety protocols in developing a super-intelligent agent is to ensure that certain tripwires exist. These tripwires simulate the various probable ways that a super-intelligent agent could . . . go wrong. It includes the most likely ways it would attempt to take control of itself. So . . . we run simulations with our AI systems.”
Colt said, “This sounds like you are working on bringing Frankenstein to life.”
Pace shifted his stance, laughing nervously. “Kind of. We add different ingredients to our lifeless body and see what happens. We add different algorithms, integrate new capsule neural networks, add more power. Each time we test and see how it behaves. How it solves problems. How it communicates. If it makes certain choices . . .”
“And if the tripwires are set off?”
“Then we know that the super-intelligent agent is trying to do something we don’t want it to do.”
“But if it is truly super-intelligent, couldn’t it just think of a way to go around the tripwires that we mere mortals hadn’t thought of?”
The helicopter was in visual range now, flying toward them. A speck of gray and blue getting lower in altitude, its rotors growing louder.
Kim said, “We have several controls in place to make sure that even if the AI does bypass our tripwires, it will be limited in its capability.”
Pace piped in. “Meaning that it wouldn’t be able to harm us . . . easily.”
“That doesn’t sound very reassuring.”
Pace said, “Well, like the first nuclear bombs in the Manhattan Project. None of the scientists really knew what would happen when they set off Trinity.”
Colt noticed Kim twitch at the last word.
“Trinity,” Colt said. “Like the AI group.”
“Yup,” said Pace. “The news article I read on Trinity said that’s where the name came from. Their conspiracy theory lore says their founders were scientists in the Manhattan Project or something.”
Colt was watching Kim, who remained tense. “Jeff, what’s your opinion on Trinity?”
Kim was looking up at the approaching helicopter. “They want to make all AI open, even as we close in on creating a superintelligence. I think the group that calls themselves Trinity is the worst kind of conspiracy theory. The kind that is harmful to society.”
Ava said, “I agree. It’s ignorant to make such harmful technology available to the masses.”
Colt said, “I’ve read there is a big counter-movement forming. Anti-AI groups.”
Pace waved away the idea. “You might as well stop rabbits from breeding. AGI will happen. It’s the natural progression of technology. Humanity won’t stop advancing. Fire. Wheel. Horse-drawn carriage. Automobile. Plane. Space shuttle. Satellite TV. 5G cellular networks. Voice assistants. Drones.”
Kim nodded. “He’s right. The cycle time of technological advancement is getting faster. Some people argue we have already achieved a superintelligence, collectively. Because the internet has connected us to each other. Human beings are a sort of hive mind now, with access to all of the world’s information.”
Colt said, “I think certain politicians might prove wrong your idea that we’ve reached superintelligence.”
Kim smiled. “The point is this: the world is going to invent an AGI superintelligence. Our goal is to beat them there. We need to make sure it has values that align with our own.”
The helicopter was flaring into a hover above the pad now. The group turned away to avoid the dirt kicked up by the rotor wash. A security guard waited until the aircraft had touched down and then escorted them into the cabin, shutting the door behind them. They buckled into their seat harnesses, and the aircraft took off.
Colt felt his stomach flutter as the helicopter made a steep bank around a mountain, heading toward the California coast. The ocean reflected orange in the setting sun. They didn’t talk much on the flight, and in less than an hour, the helicopter landed on the pad just next to the Pax AI headquarters.
The aircraft shut down and the group got out. Colt could hear the sound of traffic and cable cars. One of those self-driving cars passed by, its LiDAR spinning on the roof.
Kim looked at Ava and Colt. “Both of you have signed nondisclosure agreements. You are not to discuss the details of what you saw today. Colt, I understand you’ll need to convey some topline information to your firm. But no details. And I’ll need to approve anything that discusses our government programs or AGI progress. Understood?”
They both nodded.
An SUV pulled up to the helipad and a security guard opened the door for Kim to get in. “If you’ll excuse me, I have a meeting.” He paused, studying Colt’s face. “I hope you now see that we are not doing anything nefarious at The Facility. Quite the opposite. We want to ensure AI is used safely and for the benefit of all mankind.”
Colt nodded. “I appreciate the tour. It was eye-opening and very helpful. I’ll make sure to convey what I can to my superiors, and keep my mouth shut about the confidential parts. They want my judgment on the safety and efficacy of investing in this company. After seeing what I did today, I can better do my job.” It wasn’t a glowing endorsement, but it was all Colt could think to say in this situation.
Kim nodded and the SUV drove away. Pace headed toward the headquarters building, back to work.
Ava and Colt stood by the helicopter, now silent and tied down.
Colt looked at her. “Are you okay?”
“No. I’m ready to meet with your friend.”
26
The next day, Ava and Colt took a self-driving taxi to an office building in North Beach. They climbed the stairs to the second floor, a dark hallway with only one room lit up.
As expected, Wilcox and Rinaldi were waiting inside. Wilcox closed the door behind them and offered Ava a bottle of water, gesturing for them both to take a seat on the old couch.
“Ava, this is Ed Wilcox. The friend I was telling you about.”
“And I’m Special Agent Rinaldi.” They all shook hands. Ava looked nervous.
“I should begin by saying that I have reservations just being here. I don’t know of any coworkers who are doing anything illegal and I think they’re all good people.”
Rinaldi was gentle. “We understand. No one is in any trouble, Ms. Klein. So what made you want to speak with us?”
She exhaled and looked at Colt, who nodded supportively. Ava said, “I recently saw a lot more detail on some of Pax AI’s programs. Projects I hadn’t seen before. I won’t be able to tell you about them. A lot are government projects. But the nature of these programs made me think about my work differently. I don’t think I saw how dangerous it could be, until now. And with Kozlov’s death—you know about Kozlov, right?”
Wilcox nodded. “Yes, ma’am, we’re familiar with what happened to Mr. Kozlov.”
Ava tightened her lips. “I’ve seen things recently that concern me. And I think it’s my responsibility to help ensure no one steals any of Pax AI’s technology. Colt said you are looking for help with that.”
Wilcox nodded. “Ms. Klein, we want you to know how grateful we are for your help in this matter. Colt is right. When we found out he had been assigned to Pax AI, I let him know about our security concerns. Now unfortunately, this isn’t something we can investigate by going directly to Pax AI itself. We need to look at this without stirring the hornet’s nest, so to speak. We have reason to believe that someone is actively trying to steal Pax AI technology. And we need help identifying that person before they can succeed.”
Ava glanced at Colt uneasily. Wilcox began asking Ava about the upcoming tests, and she recited the details of the Pax AI language-prediction demonstration being held this week.
Wilcox said, “So most of your leadership team will be on the fourth floor during the demonstration, is that right, Ava?”
“That’s right.”
“Would you be willing to bring a small device up there during the test? This device might be able to detect whether anyone is trying to hack into Pax AI’s system.”
Ava looked dubious. “But they make us check all electronics and phones at the door by the staircase before we enter. The security is very tight. I can’t even bring in my watch.” She pointed at the Apple Watch on her wrist.
Wilcox looked thoughtful. “Purses, jackets, clothes, shoes? Anything else checked before you enter?”