Machine Learning: New and Collected Stories


by Hugh Howey


  “Max, why won’t you keep your hands on?” I ask him. Between the three of us, we’ve asked him variations of this a hundred times.

  “I don’t want them there,” he says. It’s as useful as a kid saying they want chocolate because they like chocolate. Circular reasoning in the tightest of loops.

  “But why don’t you want them?” I ask, exasperated.

  “I just don’t want them there,” he says.

  “Maybe he wants them up his ass,” Greenie suggests. He fumbles for his vape, has switched to peppermint. I honestly don’t know how the boys are still functioning. We aren’t in our twenties or thirties anymore. All-nighters take their toll.

  “I think we should shut him down and go over everything mechanical one more time,” I say, utterly defeated. “Worst-case scenario, we do a wipe and a reinstall tomorrow before the finals.”

  Max’s primary camera swivels toward me. At least, I think it does. Peter shoots Greenie a look, and Greenie lifts his head and shifts uncomfortably on his stool.

  “What aren’t you telling me?” I ask.

  Peter looks terrified. Max is watching us.

  “You didn’t get a dump yesterday, did you?” I have to turn away from Peter and pace the length of the trailer. There’s a rumble outside as our upcoming opponent is put through his paces in the arena. Boy, would the SoCal guys love to know what a colossal fuck-up we have going on in here. “So we lost all the data from yesterday’s bout?” I try to calm down. Maintain perspective. Keep a clear head. “We’ve got a good dump from the semis,” I say. “We can go back to that build.”

  Turning back to the boys, I see all three of them standing perfectly still, the robot and the two engineers, watching me. “So we lost one bout of data,” I say. “He’s good enough to win. The Chinese were the favorites anyway, and they’re out.”

  Nobody says anything. I wonder if this is about ego or pride. Engineers hate a wipe and reinstall. It’s a last resort, an admission of defeat. The dreaded cry of “reboot,” which is to say we have no clue and hopefully the issue will sort itself if we start over, if we clear the cache.

  “Are you sure you can’t think of anything else that might be wrong with him?” Peter asks. He and Greenie join me at the other end of the trailer. Again, that weird look on their faces. It’s more than exhaustion. It’s some kind of wonder and fear.

  “What do you know that you aren’t telling me?” I ask.

  “It’s what we think,” Greenie says.

  “Fucking tell me. Jesus Christ.”

  “We needed a clear head to look at this,” Peter says. “Another set of eyes.” He glances at Greenie. “If she doesn’t see it, then maybe we’re wrong . . .”

  But I do see it. Right then, like a lightning bolt straight up my spine. One of those thoughts that falls like a sledgehammer and gives you a mental limp for the rest of your life, that changes how you walk, how you see the world.

  “Hell no,” I say.

  The boys say nothing. Max seems to twitch uncomfortably at the far end of the trailer. And I don’t think I’m projecting this time.

  “Max, why don’t you want your arms?”

  “Just I don’t want them,” he says. I’m watching the monitors instead of him this time. A tactical module is running, and it shouldn’t be. Stepping through each line, I can see the regroup code going into a full loop. There are other lines running in parallel, his sixty-four processors running dozens of routines all at once. I didn’t notice the regroup code until I looked for it. It’s the closest thing to a retreat we’ve ever taught him. Max has been programmed from the ground up to fight until his juice runs out. He knows sideways and forward, and that’s it.

  “You have a big bout in two days,” I tell Max.

  Another surge of routines, another twitch in his power harness. If his legs were plugged in, I imagine he’d be backing away from me. Which is crazy. Not only have we never taught him anything like what he’s trying to pull off—we never instructed him to teach himself anything like this.

  “Tell me it’s just a glitch,” Greenie says. He almost sounds hopeful. Like he doesn’t want it to be anything else. Peter is watching me intently. He doesn’t want to guide me along any more than he has to. Very scientific of him. I ignore Greenie and focus on our robot.

  “Max, do you feel any different?”

  “No,” Max says.

  “Are you ready for your next bout?”

  “No.”

  “Why not?”

  No response. He doesn’t know what to say. I glance at the screen to get a read on the code, but Peter points to the RAM readout, and I see that it has spiked. No available RAM. It looks like full combat mode. Conflicting routines.

  “This is emergent,” I say.

  “That’s what I told him,” Peter says. He perks up.

  “But emergent what?” Greenie asks. “Because Peter thinks—”

  “Let her say it,” Peter says, interrupting. “Don’t lead her.” He turns to me. There’s a look on his face that makes him appear a decade younger. A look of wonder and discovery. I remember falling in love with that look.

  And I know suddenly what Peter wants me to say. I know what he’s thinking, because I’m thinking it too. The word slips between my lips without awareness. I hear myself say it, and I feel like a fool. It feels wonderful.

  “Sentience,” I say.

  We live for emergent behaviors. It’s what we hope for. It’s what we fight robots for. It’s what we program Max to do.

  He’s programmed to learn from each bout and improve, to create new routines that will improve his odds in future fights. The first time I wrote a routine like this, it was in middle school. I pitted two chess-playing computers with basic learning heuristics against one another. Summer camp stuff. I watched as a library of chess openings was built up on the fly. Nothing new, just the centuries-old rediscovered in mere hours. Built from nothing. From learning. From that moment on, I was hooked.

  Max is just a more advanced version of that same idea. His being able to write his own code on the fly and save it for the future is the font of our research. Max creates new and original software routines that we patent and sell to clients. Sometimes he introduces a glitch, a piece of code that knocks him out of commission, what evolution handles with death, and we have to back him out to an earlier revision. Other times he comes up with a routine that’s so far beyond anything else he knows, it’s what we call emergent. A sum that’s greater than its parts. The moment a pot of water begins to boil.

  There was the day he used his own laser to cut a busted leg free because it was slowing him down. That was one of those emergent days. Max is programmed at a very base level not to harm himself. He isn’t allowed to turn his weapons against his own body. It’s why his guns won’t fire when part of him gets in the way, similar to how he can’t swing a leg and hurt us by accident.

  But one bout, he decided it was okay to lop off his own busted leg if it meant winning and preventing further harm. That emergent routine funded half of our following season. And his maneuver—knowing when to sacrifice himself and by how much—put us through to the finals two years ago. We’ve seen other Gladiators do something similar since. But I’ve never seen a Gladiator not want to fight. That would require one emergent property to override millions of other ones. It would be those two chess computers from middle school suddenly agreeing not to play the game.

  “Max, are you looking forward to training today?”

  “I’d rather not,” Max says. And this is the frustrating part. We created a facsimile of sentience in all our machines decades ago. We programmed them to hesitate, to use casual vernacular; we wanted our cell phones to seem like living, breathing people. It strikes me that cancer was cured like this—so gradually that no one realized it had happened. We had to be told. And by then it didn’t seem like such a big deal.

  “Shit, look at this,” Peter says.

  I turn to where he’s pointing. The green HDD indicator on Max’s server bank is flashing so fast, it might as well be solid.

  “Max, are you writing code?” I ask.

  “Yes,” he says. He’s programmed to tell the truth. I shouldn’t even have to remind myself.

  “Shut him down,” Greenie says. When Peter and I don’t move, Greenie gets off his stool.

  “Wait,” I say.

  Max jitters, anticipating the loss of power. His charging cables sway. He looks at us, cameras focusing back and forth between me and Greenie.

  “We’ll get a dump,” Greenie says. “We’ll get a dump, load up the save from before the semis, and you two can reload whatever the hell this is and play with it later.”

  “How’s my team?” a voice calls from the ramp. We turn to see Professor Hinson limping into the trailer. Hinson hasn’t taught a class in decades, but still likes the moniker. Retired on a single patent back in the twenties, then had one VC hit after another across the Valley. He’s a DARPA leech, loves being around politicians. Would probably have aspirations of being president if it weren’t for the legions of coeds who would come out of the woodwork with stories.

  “SoCal is out there chewing up sparring partners,” Hinson says. “We aiming for dramatic suspense in here?”

  “There might be a slight issue,” Greenie says. And I want to fucking kill him. There’s a doubling of wrinkles across Hinson’s face.

  “Well, then, fix it,” Hinson says. “I pay you all a lot of money to make sure there aren’t issues.”

  I want to point out that he paid a measly four hundred grand, which sure seemed like a lot of money eight years ago when we gave him majority stake in Max, but has ended up being a painful bargain for us since. The money we make now, we make as a team. It just isn’t doled out that way.

  “This might be more important than winning the finals,” I say. And now that I have to put the words together in my brain, the announcement, some way to say it, the historical significance if this is confirmed hits me for the first time. We’re a long way from knowing for sure, but to even suggest it, to raise it as a possibility, causes all the words to clog up in the back of my throat.

  “Nothing’s more important than these finals,” Hinson says, before I can catch my breath. He points toward the open end of the trailer, where the clang of metal on metal can be heard. “You realize what’s at stake this year? The Grumman contract is up. The army of tomorrow is going to be bid on next week, and Max is the soldier they want. Our soldier. You understand? This isn’t about millions in prize money—this is about billions. Hell, this could be worth a trillion dollars over the next few decades. You understand? You might be looking at the first trillionaire in history. Because every army in the world will need a hundred thousand of our boys. This isn’t research you’re doing here. This is boot camp.”

  “What if this is worth more than a trillion dollars?” Peter asks. And I love him for saying it. For saying what I’m thinking. But the twinge of disgust on Hinson’s face lets me know it won’t have any effect. The professor side of him died decades ago. What could be more important than money? A war machine turned beatnik? Are we serious?

  “I want our boy out there within the hour. Scouts are in the stands, whispering about whether we’ll even have an entry after yesterday. You’re making me look like an asshole. Now, I’ve got a million dollars’ worth of sparring partners lined up out there, and I want Max to go shred every last dollar into ribbons, you hear?”

  “Max might be sentient,” I blurt out. And I feel like a third-grader again, speaking up in class and saying something that everyone else laughs at, something that makes me feel dumb. That’s how Hinson is looking at me. Greenie too.

  “Might?” Hinson asks.

  “Max doesn’t want to fight,” I tell him. “Let me show you—”

  I power Max down and reach for his pincers. I clip them into place while Peter does the same with the buzz saw. I flash back to eight years ago, when we demonstrated Max for Professor Hinson that first time. I’m as nervous now as I was then.

  “I told them we should save the dump to look at later,” Greenie says. “We’ve definitely got something emergent, but it’s presenting a lot like a glitch. But don’t worry, we can always load up the save from before the semis and go into the finals with that build. Max’ll tear SoCal apart—”

  “Let us show you what’s going on,” Peter says. He adjusts the code monitor so Hinson can see the readouts.

  “We don’t have time for this,” Hinson says. He pulls out his phone and checks something, puts it back. “Save the dump. Upload the save from the semis. Get him out there, and we’ll have plenty of time to follow up on this later. If it’s worth something, we’ll patent it.”

  “But a dump might not capture what’s going on with him,” I say. All three men turn to me. “Max was writing routines in maintenance mode. There are a million EPROMs in him, dozens for every sensor and joint. If we flash those to factory defaults, what if part of what he has become is in there somewhere? Or what if a single one or zero is miscopied and that makes all the difference? Maybe this is why we’ve never gotten over this hump before, because progress looks like a glitch, and it can’t be copied or reproduced. At least give us one more day—”

  “He’s a robot,” Hinson says. “You all are starting to believe your own magic tricks. We make them as real as we can, but you’re reading sentience into some busted code.”

  “I don’t think so,” Peter says.

  “I’m with the professor,” says Greenie. He shrugs at me. “I’m sorry, but this is the finals. We got close two years ago. If we get that contract, we’re set for life.”

  “But if this is the first stage of something bigger,” I say, “we’re talking about creating life.”

  Hinson shakes his head. “You know how much I respect your work, and if you think something is going on, I want you to look into it. But we’ll do it next week. Load that save and get our boy out there. That’s an order.”

  Like we’re all in the military now.

  Professor Hinson nods to Greenie, who steps toward the keyboard. Peter moves to block him, and I wonder if we’re going to come to blows over this. I back toward Max and place a hand on his chest, a mother’s reflex, like I just want to tap the brakes.

  “C’mon,” Greenie tells Peter. “We’ll save him. We can look at this in a week. With some sleep.”

  My hand falls to Max’s new legs. The gleaming paint there has never seen battle. And now his programming wants to keep it that way. I wonder how many times we’ve been on this precipice only to delete what we can’t understand. And then thinking we can just copy it back, and find that it’s been lost. I wonder if this is why downloading the human consciousness has been such a dead end. Like there’s some bit of complexity there that can’t survive duplication. Hinson and Greenie start to push Peter out of the way.

  “Get away from Max,” Greenie says. “I’m powering him up. Watch your feet.”

  He’s worried about the pincers and the buzz saw falling off. Has to power up Max to get a dump. I hesitate before leaving Max’s side. I quickly fumble with the cord, plugging in his unused legs. I have this luxury, stepping away. Turning my back on a fight.

  “I’ll get the power,” I say. And Peter shoots me a look of disappointment. It’s three against one, and I can see the air go out of him. He starts to say something, to plead with me, but I give him a look, the kind only a wife can give to her husband, one that stops him in his tracks, immobilizes him.

  “Powering up,” I say out loud, a lab habit coming back. A habit from back when we turned on machines and weren’t sure what they would do, if they would fall or stand on their own, if they would find their balance or topple to one side. I pull Peter toward me, out of the center of the trailer, and I slap the red power switch with nothing more than hope and a hunch.

  The next three seconds stretch out like years. I remember holding Sarah for the first time, marveling at this ability we have to create life where before there was none. This moment feels just as significant. A powerful tremor runs through the trailer, a slap of steel and a blur of motion. The pincers and buzz saw remain in place, but every other part of Max is on the move. A thunderclap, followed by another, long strides taking him past us, a flutter of wind in my hair, the four of us frozen as Max bolts from the trailer and out of sight, doing the opposite of what he was built for, choosing an action arrived at on his own.

  Afterword

  One of my favorite questions to ask my futurist friends is “When do you think AI will come online?” I asked Rod Brooks once, and he laughed and said it was too far away to even contemplate. I asked Sam Harris, and he thought it would be very soon. But it was my friend Kevin Kelly who gave me the most shocking answer. “It’s already here,” he said.

  This felt like an answer designed to shock rather than illuminate, but hearing Kevin’s rationale, I came away in agreement. Machines are already doing what we very recently said would be impossible (driving cars; winning at chess, Go, and Jeopardy!; writing newspaper articles; creating art, music, and drama). What we keep doing is moving the goalposts. Once we understand how AI does something, it’s no longer as magical as our own consciousness, and so we dismiss it as progress.

  This is one type of AI. There’s another type that I don’t think we’ve created yet, and that’s an intelligence that’s self-aware with goals that it arrives at on its own. I do not believe that this sort of intelligence will come about because we set out to create it. I think we will be making more numerous and more complex AIs until one or several cross a threshold and become something . . . different.

  Humans develop in this way. We don’t emerge from the womb with goals, ambitions, even self-awareness. These modules come online gradually. One day, a baby realizes that its hand is its own and parents no longer need to clip its nails to keep it from scratching itself. We learn to walk, to talk, to think, to plan, to reason, to create. And then we slowly wind down and lose these abilities, if we live long enough.

 
