Burn-In


by P. W. Singer


  “The English factories that kicked off the first Industrial Revolution were no more held back by Queen Elizabeth than a prohibition on systems like TAMS today would stop the advance of AI and robotics.”

  “Why bother with your commission then?” Keegan asked. “If it’s all going to happen anyway, why go through the motions?”

  “The repetition of fear does not mean that we shouldn’t observe it and assuage it,” he said. With ease, he launched into what sounded like another practiced speech.

  As she listened, part of Keegan was drawn in. But another part of her brain warned that his intonation and pauses were likely also scientifically tested to resonate. A tech guru was not naturally a great orator, but they could engineer themselves to be.

  “There is an arc that runs from good Queen Elizabeth to the revolutionary Thomas Paine, who urged us to cast off her nation’s royal rule, to the scientists like Carl Sagan, Stephen Hawking, and Elon Musk some two hundred years later. All of them warned, quite reasonably enough, of the dangers of automation. Sagan was perhaps the most eloquent harbinger because his concern came from a place of deep understanding: ‘I have a foreboding of an America in my children’s or grandchildren’s time—when the United States is a service and information economy; when nearly all the key manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness.’”15

  Keegan had served under a major who’d spouted Winston Churchill quotes that he listened to while he slept. The only way to stop it was to derail him with a question. “And what is your role, then?” she asked now.

  “I grew up with my hands in the soil before I could walk, Lara. My family are all farmers in Ohio dating back to their arrival in the first wheeled wagons. It’s an identity that transcends profit and loss. The productivity that we embraced started with GPS and Microsoft Excel, then crop-dusting drones and self-driving wheat harvesters. This was why I graduated from high school at eleven—to invent out of necessity. But machines meant fewer people each season. By the time I got my PhD at MIT, my parents were dead and the immigrants we hired each season were no longer needed. I was the only person left on the farm. There was only me, not we. I was the omega point of almost a thousand human generations of agricultural advancement.”

  He paused, as if to give her time to contemplate the statement.

  After a few seconds, she felt the need to fill the silence. It was a shtick, she knew, but on the other hand, the guy had the right to it. Of the two people in the room, he was the one with eleven zeros after his net worth. “So you feel responsible then?”

  “Of a sort,” Shaw said. “I’ve done my part to create the ‘disruption’ that the winners of the new economy worship, but that makes me very uneasy. So when I do my work in this town, it’s with an understanding of what’s truly at stake. What’s of grave seriousness to me is that our society has the tools it needs to make it through the coming storm. And the pathway through it is narrow, for we do not have Beijing’s ability to enforce consensus on social control, nor should we ever.”16

  He smiled. From the way his eyes widened and his lips curled, it was a look of genuine excitement at what he was about to share. “I’ll let you in on my secret, Lara: this is a solvable problem . . . once you get past the politics. Liberal or conservative, those are outdated labels useful for dividing, not solving. Society is simply a design problem.”

  “Yes, sir,” said Keegan, not knowing what else she was supposed to say. As she did, Shaw rubbed his forearm again and returned to the other side of the table. At that moment, Gibbon walked in with a plate of charred seaweed, shaved beets, and unfamiliar purple flower petals. Keegan noticed that there was also only one plate, answering the other question that had been troubling her: When was this meeting going to end?

  “I see I have kept you too long already. I have a simple offer, with a simple premise: consider me a friend, Lara. Whatever resources I can offer you and TAMS are yours should you need them. We all need allies in the battles yet to come.”

  Keegan still couldn’t tell whether he was lying. And she now knew that TAMS couldn’t help with that either.

  Potomac Overlook Neighborhood

  McLean, Virginia

  It shouldn’t have been as easy as 12345, but it was.

  Sipping oolong tea with the morning light shining in through his office’s window, Todd had started by probing the IP addresses at the Chait home. He did not want to rely on Preston’s tech unless he really had to, so his plan was to run a brute-force attack, using an old amended password recovery algorithm called Hashcat, to ping one potential password after another until he gained access.17

  As he typed out the program’s commands, Todd caught sight of his dirty fingers. Thinking through what he’d need to do for this unexpected diversion, he’d lain in bed all night, forgetting to even shower and change after the dig. Isabella used to get on him for that, the way he’d become so consumed with his work that he’d forget to take care of himself. It was perhaps what he missed the most—those gentle reminders that let you know someone else was watching out for you.

  He started the program, but before he could even lean back in his chair the computer pinged an alert that it had achieved its goal.

  “12345” it typed into Chait’s system.

  The lighting system in his neighbors’ smart home had come with a default password, and like most consumers, they hadn’t changed it.18 Whether the cause was their laziness or shoddy design on the part of the manufacturer didn’t matter. It allowed Todd to connect to the operating system for the energy-saving intelligent lighting system over the kitchen sink. From there, he moved laterally across the central software platform, gaining full systems access. It now allowed him to manipulate anything in the house that was connected, whether it was to unlock the doors or change the shower’s water temperature.19 The zeros and ones traveled thousands of miles around the globe, pinging from server to server, but he soon had access to what he really wanted, the gas stove just 3 feet from the hacked kitchen lights.

  Now it was just a matter of figuring out when. Todd ran through the home network’s machine-to-machine communication logs. Line by line, it gave up every detail of the Chaits’ lives. Not just what rooms they went to and for how long, but even what they did in their bedroom, revealed by the minute increase of room temperature caused by the increased body heat of physical exertion.

  As Todd saw just how much of their lives the couple had unknowingly given over to the machines—to monitor, to decide, to run—the more it felt like he would be liberating them. Death, after all, was a universal experience among living beings, but only the human mind so concerned itself with questions of its before and after. He was going to offer them up that insight, that lesson, as a gift.

  “Hakuna Matata,” Chait had said. Todd knew the song well, from his son watching it day after day after day on his little iPad, while eating Cheerios for breakfast.

  It means no worries

  For the rest of your days.20

  FBI Academy

  Quantico, Virginia

  “Your office’s changed,” Keegan said. “Where’d all the landscapes go?” She also wanted to ask where her origami robot had gone, but that would only draw attention to it.

  “Sorry, part of an office redecoration,” said Modi. “What do you think?”

  “I’m not a fan of the couch,” she said of the narrow brown two-person couch she sat on. “Not a fan of the color.”

  She didn’t say how much the tiny touch reminded her of the marriage counselor’s office, making her wonder if the furniture choice was part of a psychologist’s playbook. She flashed back to her and Jared sitting leg to leg, connected physically as they shared their emotions out loud. The counselor might have hoped it brought them closer, but all it had done was create a familiar claustrophobia. That feeling of being pressed into somebody, surrounded by the faint musk of shared fear and frustration, was too much like sitting in the back of an M-ATV, just waiting to be hit by an IED. They never went back to marriage counseling; she blamed the couch.

  “But I dig the pictures. Way better than the seascapes. Looks like it was done by some video game developer on acid.”

  The images on the wall now had the pixelated swirls of a pointillist painting, woozily human forms colliding with digitized inanimate objects. One had a blue upholstered chair running on a neon yellow sintered track, dashing ahead of a herd of monkeys. Another showed a sunset framed by clouds made up of thousands of rainbow-colored eyes.

  “It’s from Kendo, the Neuromodal Movement artist algorithm,” he said. “I got brain scanned, and a few weeks later the algorithms gave me my portfolio. Not another like it in the world.”

  “I can imagine. Expensive?”

  “I still work for the government,” he remarked, cocking an eyebrow as if reconsidering and approving of the investment. “Kendo’s algos are based on the original DeepDream code, the first AI artist, so there’s a retro element.”21

  “Only now by putting all that on your wall, TAMS now knows your innermost thoughts and identity.”

  “Who’s to say that it already doesn’t know that, it’s just not authorized to share it with you?” He smiled knowingly. “So, shall we begin?”

  “I thought we had,” said Keegan.

  “No, we’re just talking,” he said. “But now you’re on the clock . . . So how is your training going, TAMS?” He looked over at the machine that stood beside the couch. There was no way in hell she was going to have it sit beside her.

  Immediately, the robot began speaking, as if it had successfully predicted that it would be asked before Keegan. “We are working on developing situational awareness based on real-world data inputs, macro-observation, and scenario development,” said TAMS. “This is being complemented by synthetic experience modules derived from my training evolutions with Agent Keegan. Notably, we also apprehended two suspects attempting an armed robbery.”

  “I saw that. TAMS, does Agent Keegan like working with you?”

  Keegan shifted forward in her seat and Modi held up his hand before she could interject.

  “She is pleased with the pace of my evolving operational performance.”

  “How do you know that? Has she communicated that to you?”

  “Yes, her biometrics, physical observation, and other data all have shared that assessment.”

  “But not verbally,” Modi said, looking back at Keegan.

  She wasn’t sure how she felt about him catching that, so she showed no reaction. Maybe Shaw and TAMS could read her, but she didn’t need Modi doing it too.

  “What does that mean then, TAMS? Are you a good partner to Agent Keegan?”

  There was a brief pause as the machine seemed to run an assessment across untold parameters. “No,” said TAMS.

  Keegan felt a surge of, what was it? Shame? Had she disappointed a machine?

  “Will you be, TAMS?” Modi asked.

  “Yes.”

  “When, TAMS?”

  “Based on my current models and rate of learning, in approximately nine days.”

  “That exact, huh?” said Keegan.

  Modi turned back to her. “And you, Agent Keegan?”

  “I’ll need longer. Maybe ten days,” said Keegan.

  “That’s not what I meant,” said Modi with a chuckle.

  “I know,” she said. She brought the fingers of her hands together into a tent, and then rested her chin on the thumbs. Though she unconsciously did so, her brain signaled that it was a close mimic of the analytic stance Modi used at their first meeting, and she smiled. Modi smiled back, seeming to come to the same observation.

  “It’s going well.” Keegan pivoted her fingers down to point at Modi. “I can see there’s a lot of potential in tactical situations and real-time fusion analysis. But you’re going to need the right person with it. Every street we drive is familiar to it; it’s got the databases. But the street isn’t just the street. It’s more than the people, the vehicles, the weather, birds, whatever. It’s a feel. That makes not only each street different, but each time we drive or walk by it different. For example, Ben’s is going to be different next time we go by it—probably won’t be another couple of morons trying to rip off the place.”

  “And you may not go swinging off balconies again, either,” said Modi.

  “You joke, but you’re right, I can’t. Or rather I shouldn’t,” Keegan replied. “Not because it’ll kill my back again, but because it’s a move that I’ve pulled before. It’s predictable. That makes it exactly what TAMS wants. But being predictable is what might also get you killed. There is this saying by a German general back in World War II that we were taught in the Corps: ‘When faced with the same situation in combat, never do the same thing.’”22

  “I see. But you also said that we’re going to need the ‘right person,’” said Modi. “Is that you or someone else? What did you mean by that?”

  “I mean, it’ll read whoever the human is, I guess, by their facial expressions, biometrics, past performance, whatever it picks up data-wise,” said Keegan. “Tell me, do you think it will get more accurate at predicting what I will do, based on how it perceives the neurological and biological basis of my future behavior?”

  “Definitely. It’s what would make it a damn good poker player,” said Modi.

  “Not the first time I’ve heard that,” Keegan said.

  “How’s that?” said Modi.

  “Willow Shaw. You know him? He set up a meeting to check out TAMS.” She didn’t mention the location.

  “Know him? Not exactly like you do now, I guess, but of course I know who he is.”

  “He said something about how our emotions are simply data derived from our physical states and they can be hacked like anything else.”

  “It’s not that simple, Agent Keegan,” said Modi.

  She noted how he continued calling her by her title, unlike Shaw.

  “What you’re looking for is something that the old science fiction often got mixed up in the pre-AI days. In trying to predict how machines would one day approach true human-level understanding, they’d blend sentience and sapience. Sentience isn’t about a robot becoming a conscious being; you know the kind that were always going to rise up and ‘Kill All Humans.’ It’s simply the ability to perceive and understand one’s surroundings. Sapience, though, that’s the big stuff. It comes from the Latin for ‘wisdom,’ whether it’s just the wisdom that we call common sense or the kind of wisdom that comes from a deeper understanding of a situation beyond raw facts. How we read one another, as humans, is more in the land of sapience and it is not even close to being understood.”

  He pointed at the pixelated images on the wall. “A machine can learn patterns, complex ones far beyond our own brain’s ability to compute. But how we interact is far more complex than even that, for the reason that it is linked to the very question of how we define ourselves as Homo sapiens. Note that how we name our own species draws from the idea of having deeper wisdom and understanding.”

  “So I take it that you don’t agree with the billionaire.”

  “He may own his own plane, but no. Emotion has all sorts of tells that you or TAMS can learn, but truly understanding what emotion means takes wisdom.”

  “So when TAMS gets pissed off at another robot driver, then they’ll have crossed the line as a species?”

  “Of a sort, but it will still only be simulating an emotion.” He turned to look at the robot. “TAMS, what is the value of emotions?”

  “They provide data points, which aid my understanding of past, present, and projected human behavior.”

  “For example?” Modi asked.

  “While we traveled in a vehicle, Agent Keegan heard a song that reminded her of a prior relationship, most likely romantic. While she did not articulate her sadness, it was evident from her data.”

  Keegan blushed, and knowing she couldn’t hide it, smiled and waved it off. “OK, enough about my history of broken hearts,” she said. She considered what to say—she didn’t want this to run its course to areas she’d rather not talk on. “So if TAMS is learning, real time, then at what point does it ‘know’ something?”

  “What do you mean? ‘Know’ can have many meanings.”

  “It seems like ‘knowing’ something is a state. But TAMS is always going to be changing, given the data fire hose it’s drinking from. Does it have any actual knowledge or is it just always going to be data input and output? AI may be modeled after our brains, but what we observe is limited to such a dramatically narrower data set than what TAMS can register. I think that limitation, that very uncertainty, is exactly what makes it possible to know something.23 That there’s actually so much that we don’t know is what gives us the kind of conviction on certain things that no machine would ever be satisfied with.”

  “Deep thoughts for a Marine,” Modi observed, tipping his head in respect.

  “Safe space, right?” Keegan said. “Don’t tell anybody.”

  “I won’t. The truth is we don’t know, and may never. Because if our creations ever reach that point, their sapience would go past ours.” Modi leaned back in his chair, seeming to enjoy the conversation.

  Keegan realized she was too, and that left her even more uncomfortable on the undersized couch. “That seems like a good ending point,” she said. “We all good?”

  “That is for you and TAMS to decide.”

  Potomac Overlook Neighborhood

  McLean, Virginia
