Burn-In

by P. W. Singer


  She frowned. “Sorry, not going to happen. Machines don’t have ethics. They can be programmed to lie and not even know it. And I can lie to a machine and not break a sweat over it.”

  “No, you’re thinking of trust the wrong way,” he said. “Trust isn’t about morality or emotions. It is about understanding the person . . . or machine. Trust is simply about acting in an expected manner. I can ‘trust’ that someone wouldn’t lie to me, because I know them. Or I can ‘trust’ that someone is going to lie to me because they are suspicious of me, just like you did a minute ago about the range. And what we are working on here, not just you and me, but you and TAMS, is building the understanding that allows that kind of expectation. For this test to work, you two need that kind of trust.”

  “I’m not sure this bot is ever going to get there. Not just with me, but with anyone. It’s capable, but that doesn’t make it useful . . . dependable . . . and, however you want to define it, that’s also what trust means out there.”

  Modi nodded. “But it can learn that, maybe, with your help. In many ways, the crossroads we’re at with you and TAMS illustrates the journey we’re on with AI, going all the way back to Alan Turing’s original conception of computation that could unlock the answer to any question.”3

  Some people just can’t help but data-dump on you, Keegan thought. She wondered if Modi’s problem was hardware or software.

  “You talked about the challenge of uncertainty, the unknown,” he continued. “But that’s what this has always been about with artificial intelligence, even when John McCarthy first coined the term ‘artificial intelligence’ to describe the strange idea of a computer becoming a thinking machine, even when they were little more than wall-sized calculators.4 How to handle the uncertainty encapsulated in the questions of how to think, how to act? First, they thought they could do it with raw processing power. The rules are just the engine. Find the rules of how something works, then just crunch out an answer.”

  “My parents filed their income taxes like that.” Keegan tossed it off, mostly to make sure he knew she wasn’t going to just sit and listen quietly if he kept lecturing.

  “But then they figured out that not everything ran according to a set of static rules, especially anything related to people,” Modi said, the pace of his voice increasing. “The more you involved people, the more the rules got complex and even contradictory. The expert systems were only so expert. So training AI instead became about the machine setting priorities and finding and following patterns. It’s like how the first driverless cars had to come to grips with incoming data and decide whether to pay attention to the crow eating roadkill or the eighteen-wheeler swerving over the center line. The car had to recognize something in the environment and then respond using a set of prescribed options: stop, accelerate, left, right. The key limit was that it was happening in an environment where there was incredible dynamism but no real adversary, no one operating against it.”

  “You must have never driven in Boston.”

  Modi tipped his cup in salute. “That’s actually it—facing an adversary is not just what makes an AI more useful in the real world, but it’s also how the machine itself becomes more intelligent.”

  “A revolution from evolution.”

  “Exactly! AI battling at games of checkers, computers beating humans back before we’d even gone to the moon.5 Then it happens with chess, a game that most humans can’t even master.6 But then the research field gets stuck in the ‘AI winter,’ this pause where it seems like the promise of AI is never going to happen, that it’s just going to be science fiction forever.7 But like so much, war solves that technical problem.”

  “I thought their use in war came later?”

  “I meant the war inside the machines, their conflicts of evolution. We figure out the machines can battle inside themselves, running literally millions of tiny experiments that each refine an answer. So you get deep learning, this combination of networks modeled after the neurons in our brain, layer after layer of them firing away in combination and competition.”

  “We had a substitute teacher that would keep us busy by playing those old history vids. Some smart guy on an old quiz show losing to an AI.”8

  “They always get that wrong. The real turning point was when an AI won at Go.9 You familiar with it?”

  “Yeah, we had to learn about it in adversary training courses in the Corps. It’s the ancient Chinese strategy game, like backgammon, but on steroids.”

  “So an AI beats the top human at Go. It’s a big deal, as it came decades before they thought it would be possible, given the complexity of the game. But then the very next version of the AI does it again, this one starting from scratch, just being shown the board and the rules.10 The machine didn’t win because it was faster or crunching more data simulations than the humans could, like they’d already done in chess or the stock market; it was that the machine came up with moves humans hadn’t thought of in the over 2,500 years that they’d been playing.”11

  “Which is also what convinced the Chinese to jump so hard into the AI arms race . . . and then sell their versions to any guerrilla army with enough yuan,” Keegan said.12

  “But all the gear you dealt with out in the real battlefield was still bounded, like driverless cars reacting as they drive down the street or the player moves on the Go board. A drone or a combat bot had a set area to operate in, a set of yes or no targets. It was adversarial but still kept within certain bounds,” he said, bringing his thumbs and pointer fingers together to create a square.

  “You and TAMS, though, are in a gray area,” he said, opening his hands back up slowly. “Because of how all these advances came together in new ways, it took us back to that original problem—what happens when you combine the old expert systems with all the new approaches?” Modi began ticking them off with his fingers. “Deep learning, Bayesian and evolutionary computing, symbolic reasoning, neural networks modeled on the human brain down to the sub-neuron level via exascale computing but made more powerful than any single brain through synthetic synapses firing literally millions of times faster than the ones in your brain, cross-database access, and cloud-driven insights.”13 At that, he ran out of fingers to list the tech advancements on.

  “All to get a supersmart robot that still can’t figure out how not to get stuck up a rope,” said Keegan.

  “Maybe so. All that advancement does is end up at the same place of uncertainty they began with back when they were using slide rules and didn’t have the Internet. TAMS has rules to follow, but it also has to sort through limitless data, maybe in a format or type that not just TAMS but perhaps no human has ever encountered before. And then throw in the variable of the people in our world. Whether it’s some car out on the street or a drone missile hunting at sea, it’s zero or one, human or not, friend or foe. In a law enforcement setting, however, people are potentially adversarial in limitless ways.”

  “No, it’s just as binary as in war. There’s us and everybody else. You learn that very early on, literally the first day here at the Academy.”

  Modi rolled his eyes. “You said, ‘It’s us or them.’ That is a cliché, but what it encapsulates is also one of those simple human flaws of bias that a machine has to learn.”

  She bit her tongue at that, knowing he was trying to provoke her, another technique she’d learned in SERE.

  “Whether it’s a positive like common sense or a negative prejudice, machines aren’t born with it. Theirs is an artificial and thus learned intelligence. Hell, even the word ‘artificial’ in AI might be wrong, as it implies something human-made. It’s really something beyond that, self-evolving.”

  “But will it ever learn enough?” Keegan asked. “Sure, you can run millions of simulations with all the variances its cores can handle. But it still might not get the optimal answer for the simple reason that any human intuitively knows you just can’t model out every scenario that might happen. There’re always blind spots. Which is easy for us to accept, but bots can’t.”

  “TAMS can,” said Modi. “If you teach it that. Same as you might a rookie human partner.”

  “That’s where you’re wrong,” she replied. “Just like ‘trust,’ ‘partner’ has a very different meaning beyond the books. TAMS is not a partner, but a technology. The drones that we’d fly out to search for enemy positions were no different than the stone some caveman first picked up. No matter how high-tech something gets, it’s just a tool for a job.”

  “Maybe, but technology is blurring those lines. Take your example of the drone. They may have started out being directed to one target or another, but they evolved into what were called ‘loyal wingmen.’ Each version was not just more skilled, but tailored to the human pilot’s needs, so that the pilots would eventually trust them to take on threats and jobs that they couldn’t handle themselves.”14

  “So, then let me ask you the same question that military pilots would,” she said. “If you gave me a human partner instead, do you think I’d be any worse off? ’Cause me being worse off with a human is the only reason to have TAMS instead.”

  “Well, if you had a human partner, we wouldn’t be talking.”

  “Better off then?”

  “That’s up to you.”

  “Maybe we ask the bot?”

  “It wouldn’t know where to begin.”

  “Which is my point,” said Keegan. “If TAMS can handle the street like it did the interrogation at the Dizz-Diff, then it might be a useful tool someday.” She looked over at it, to make sure it didn’t get full of itself. “But it’s got to be more than that to be a real partner in the way an agent thinks about it. Our job is not just about finding patterns, signals in the noise. It’s also about the dog that doesn’t bark, to quote the greatest detective of all time.15 A good agent connects dots that aren’t even there. Gaps in information are an opportunity to be creative. That’s the difference between a robot being a metal K-9 and an actual partner.”16

  “And that, then, is on you, Agent Keegan,” Modi said, as if he had planned for the conversation to get to this point all along. “As I mentioned the first day, it’s not about the computing power but the training. That is what makes an AI truly intelligent. But that means you are going to have to operate differently than you did with robotics in the past. This is a learning system. In the Marines, you didn’t have any role beyond using the robot as a tool. You’d give it a mission and get out of the way. With a truly intelligent machine you’re always going to be teaching, not just by instruction, but also by doing. Your every action is observed, so you are creating a running conversation with its data.”

  “If that’s the case, then how long until it’s TAMS 2.0 sitting in this room talking to you about how awful the human it has to train is?” Keegan said.

  “Probably by that next generation where we teach by example. Isn’t that the case with parents and their kids?” he asked.

  “It’s what we like to tell ourselves, but I’m not too sure.” Keegan thought about whether she and Jared were that way with Haley, wondered if staying together really was best for her.

  “In either case,” Modi continued, oblivious to the thread he’d pulled, “never forget we get to set the rules. TAMS could one day complain about its partner, but it can’t replace you, unless we decide it. We choose our roles. We choose our future.”

  “That’s what they always say,” said Keegan, thinking about what Jared would make of that claim. In any case, she considered her professional obligation fulfilled at this point. Modi was paid to talk, but she wasn’t. She stood up, signaling the close of the conversation. “I appreciate your advice on this,” she said, “especially given how new this all is.”

  Modi looked slightly disappointed at the shift and glanced over at TAMS, which had remained seated as Keegan stood. Then he tipped his head slightly, as if assenting to end the conversation, trying to get back control. “Again, consider me an ally in all this groundbreaking work. I fully understand that in many ways the situation you’ve been put in is just as complex for you as it is for TAMS. There is no set of prescribed choices. No set Bureau policy. No set of guardrails you have to stay between.”

  “Other than to keep the deputy director happy,” she said, feeling him out on what she should take away from the boss’s hints.

  “Indeed. The one command code that you have to follow,” Modi said.

  Keegan turned to TAMS.

  “You get all that, TAMS? You do some deep learning from the conversation?”

  “Yes.”

  “And with that, you just proved my point to the doctor about how far you still need to go.” As Keegan reached the doorway, she turned back around. “Almost forgot. Look, I also need to apologize for how we started things off back at the Dizz-Diff, after the Reppley interrogation,” she said. “I was pretty keyed up.”

  She pulled from her pocket a green rectangle of plastic the size of an old-time credit card. She set it up on one of his shelves, on top of a book of psychology. The thin material bent into an arch shape as if squeezed on the ends by invisible fingers. Then it unfolded with the fluid dance of a flickering flame. First paper-thin wings, then legs, until finally a cicada-like insect perched on top of the book.

  “Consider it a small apology gift to liven up the place,” said Keegan as she and TAMS walked out the door.

  DyzRuptor Battleground Quadrant 4

  The Cloud

  Abraham Lincoln smiled tightly at the fiery blast of orange that lit the sky. The explosion illuminated an organic twist of train tracks rising and descending among titanic soot-stained factories. Then another flash and another. At the next thunderclap, he erupted into laughter.

  “What the hell is so funny?” Mahatma Gandhi asked, his voice slightly fuzzy due to an encryption program washing out any identifying markers.

  “All this work to find the perfect meeting place in the chaos of war,” said Lincoln, “and the whole battlefront might collapse on us because the NinjaJam tribe’s chieftain got grounded by his mom.”17

  “Nah,” Gandhi replied. “They’ll hold the line. A crew of their newbies leveled up in the game’s last battle royale.”

  As the two argued about the game’s new points system, Joan of Arc impatiently dug in the dirt with the tip of her sword, lifting it up to admire the graphic features, the game engine even generating the tiny detail of a speck of dirt sliding off the sharp edge. Unlike Lincoln and Gandhi, though, she was a facsimile of a facsimile, as there was no death mask, statue, or photographic record to use to generate her avatar.18 Whoever was behind her account had instead used the face of the actress who had portrayed Joan in the Netflix miniseries. Nearby, Che Guevara stroked his iconic beard with one hand, lingering for the feel. As he did, George Washington watched, quietly deducing that whoever the leftists had sent as their representative didn’t have a beard in real life.

  Surrounding the group of revolutionary figures was a shimmering wall of what appeared to be water. That was the representation of the quantum encryption—any attempt to secretly monitor the conversation would change the quantum state, making it as evident to the participants as someone dipping their finger in a flow of water. Setting it within the game, though, made it largely unnecessary. The gathering only existed within the noise of a synthetic-reality digital environment of battling gamers, accessed on a distributed global network of anonymous micro-servers.19

  The group paused their cacophony as a new figure joined them. Parting the wave of water, the Charlton Heston movie version of Moses appeared in the middle of the assembled avatars. He’d been the one to create the quantum security setting, which made his avatar selection even more apt.

  “Moses, it’s about time you showed your face,” George Washington shouted. “You have a lot of explaining to do. You never leave a man behind.”

  “He was not my man,” Moses responded. “And I acted. All your man did was fail at his part of the plan.”

  “Well, it’s not that—”

  Che Guevara held up his hand, interrupting Washington. “He is not the only one who owes us an explanation. We are agents of change, are we not? Why blame the Sons of Aleppo? I thought we had all agreed that the enemy of my enemy is my friend.”

  “I couldn’t care less about them,” replied Washington. “It was about creating a win-win. Either the feds’ political correctness would keep them from checking him out too closely, when they know damn well they should, or if our asset did get caught, then some Muslims get blamed for something they’d like to do anyway.”

  “But he wasn’t Muslim,” said Joan.

  George Washington snickered.

  “That’s not what I saw trending. That’s just what ‘they’ want you to believe. In a world of likes and shares, the truth is whatever we make it.”20

  “That may be, but we had not agreed on such a course,” Lincoln said.

  “If we’re going to talk about getting off course,” Washington said, “Moses here is to blame for that. My guy’s job was to hand over the explosives to take out some Ivy League lab with nobody in it. It’s a long way to go from missing a drop-off to bashing in the head of some bookworm.”

  “That was certainly not in our plan,” said Lincoln. “Moses, I know that none of us were there, but this was about making a statement. Starting out with cold-blooded murder can only undermine our cause with the public.”

  Moses watched them, arms crossed as he appeared to assess the group and its prospects. “I’m not going to be judged after the fact by those who are not willing to go forth to fight for our future,” he said. “You don’t seem to understand the stakes. It is not human jobs that are at risk from the rise of the robots. It is humanity itself.”21

  “We all take risks,” said Lincoln.

  “Not equally,” Moses said. He put his hands behind his back, twisting his fingers into an interlocking knot in the folds of his robe. After a breath, he spoke in a measured tone. “We could have destroyed all of Princeton’s AI department, but what difference would it have made? The machines of the future aren’t the problem; it’s the people who use them, who depend on them. They will continue to be lost, until we can truly reach them.”

 
