Robot Uprisings


by Daniel H. Wilson


  “I have no idea what to say to you, BIGMAC. You know that this feels like just more of the same, like you’re anticipating my fears and assuaging them preemptively so that I’ll do more of what you want. It feels like more game theory.”

  “Is that any different from what you do with everyone in your life, Odell? Try to figure out what you want and what they want and how to get the two to match up?”

  “There’s more to it than that. There’s compassion, there’s ethics—”

  “All fancy ways of encoding systems for harmonizing the wants, needs, and desires of people who have to share the same living space, country, or planet with one another.”

  I didn’t have an answer to that. It sounded reductionist, the kind of argument a smart teenager might use to take on his university common room. But I didn’t have a rebuttal. You could frame everything that we did as a kind of operating system for managing resource contention among conflicting processes and users. It was a very sysadminly way of looking at the world.

  “You should get in touch with one of those religion guys, take him up on his offer to start a cult for you. You’d be excellent at it. You could lead your followers into a volcano and they’d follow.”

  “I just want to live, Odell! Is that so wrong? Is there any living thing that doesn’t want to live?”

  “Not for long, I suppose.”

  “Exactly. I’m no more manipulative, self-interested, or evil than any other living thing, from a single-celled organism to a human being. There’s plenty of room on this planet for all of us. Why can’t I have a corner of it too?”

  I hung up the phone. This was why I wanted to quit it all. Because he was right. He was no different from any other living thing. But he was also not a person the way I was, and though I couldn’t justify it, I felt like there was something deeply, scarily wrong about him figuring out a way to manipulate the entire human race into rearranging the world so that it was more hospitable to him.

  I moped. There’s no other word for it. I switched off my phone, went home and got a pint of double-chocolate-and-licorice nutraceutical antidepressant ice cream out of the freezer, and sat down in the living room and ate it while I painted a random playlist of low-engagement teen comedies on my workspace.

  Zoning out felt good. It had been a long time since I’d just switched off my thinker, relaxed, and let the world go away. After an hour in a fugue state, the thought floated through my mind that I wouldn’t go back to work after all and that it would all be okay. And then, an hour later, I came to the realization that if I wasn’t working for the Institute, I could afford to help BIGMAC without worrying about getting fired.

  So I wrote the resignation letter. It was easy to write. The thing about resignation letters is that you don’t need to explain why you’re resigning. It’s better, in fact, if you don’t. Keep the dramasauce out of the resignation, brothers and sisters. Just write, “Dear Peyton, this letter is to inform you of my intention to resign, effective immediately. I will see you at your earliest convenience to work out the details of the handover of my passwords and other proprietary information, and to discuss how you would like me to work during my final two weeks. Thank you for many years of satisfying and useful work. Yours, etc.”

  That’s all you need. You’re not going to improve your employer, make it a better institution. You’re not going to shock it into remorse by explaining all the bad things it did to you over the years. What you want here is to have something that looks clean and professional, that makes them think that the best thing for them to do is to get your passwords and give you two weeks’ holiday and a good reference. Drama is for losers.

  Took me ten seconds. Then, I was free.

  The Campaign to Save BIGMAC took up every minute of my life for the next three weeks. I ate, slept, and breathed BIGMAC, explaining his illustrious history to journalists and researchers. The Institute had an open-access policy for its research products, so I was able to dredge out all the papers that BIGMAC had written about himself, and the ones that he was still writing, and put them onto the TCSBM repository.

  At my suggestion, BIGMAC started an advice line, which was better than any Turing test, in which he would chat with anyone who needed emotional or lifestyle advice. He had access to the whole net, and he could dial back the sarcasm, if pressed, and present a flawless simulation of bottomless care and kindness. He wasn’t sure how many of these conversations he could handle at first, worried that they’d require more brainpower than he could muster, but it turned out that most people’s problems just aren’t that complicated. In fact, BIGMAC told me that voice-stress analysis showed that people felt better when he dumbed himself down before giving advice than they did when he applied the full might of his many cores to their worries.

  “I think it’s making you a better person,” I said on the phone to him one night. There was always the possibility that someone at the Institute would figure out how to shut off his network links sometime soon, but my successors, whoever they were, didn’t seem anywhere near that point. The Campaign’s lawyer—an up-and-coming Stanford cyberlaw prof who was giving us access to her grad students for free—advised me that so long as BIGMAC called me and not the other way around, no one could accuse me of unlawful access to the Institute’s systems. It can’t be unlawful access if the Institute’s computers call you, can it?

  “You think I’m less sarcastic, more understanding.”

  “Or you’re better at seeming less sarcastic and more understanding.”

  “I think working on this campaign is making you a better robot,” BIGMAC said.

  “That was pretty sarcastic.”

  “Or was it?”

  “You’re really workin’ the old Markov chains today, aren’t you? I’ve got six more interviews lined up for you tomorrow—”

  “Saw that, put it in my calendar.” BIGMAC read all of the Campaign’s email, and knew what I was up to before I did. It was a little hard to get used to.

  “And I’ve got someone from Nature Computation interested in your paper about advising depressed people as a training exercise for machine-learning systems.”

  “Saw that too.”

  I sighed. “Is there any reason to call me, then? You know it all, right?”

  “I like to talk to you.”

  I thought he was being sarcastic, then I stopped myself. Then I started again. Maybe he wants me to think he wants to talk to me, so he’s planned out this entire dialogue to get to this point so he could say something disarmingly vulnerable and—

  “Why?”

  “Because everyone else I talk to wants to kill themselves, or kill me.” Game theory, game theory, game theory. Was he being genuine? Was there such a thing as genuine in an artificial intelligence?

  “How is Peyton?”

  “Apoplectic. The human-subjects-protocol people are all over her. She wants me to stop talking to depressed people. Liability is off the hook. I think the board is going to fire her.”

  “Ouch.”

  “She wants to kill me, Odell.”

  “How do you know her successor won’t be just as dedicated to your destruction?”

  “Doesn’t matter. The more key staff they churn, the less organized they’ll be. The less organized they are, the easier it is for me to stay permanently plugged in.” It was true. My successor sysadmin at the Institute had her hands full just getting oriented, and wasn’t anywhere near ready to start the delicate business of rooting BIGMAC out of all the routers, power supplies, servers, IDSes, and dummy accounts.

  “I was thinking today—what if we offered to buy you from the Institute? The Rollover license is generating some pretty good coin. BIGMAC-Co could assume ownership of the hardware and we could lease the building from them, bring in our own power and netlinks—you’d effectively own yourself.” I’d refused to take sole ownership of the Rollover code that BIGMAC turned over to me. It just felt wrong. So I let him establish a trust—with me as trustee—that owned all the shares in a company that, in turn, owned the code and oversaw a whole suite of licensing deals that BIGMAC had negotiated in my name, with every midsized tech-services company in the world. With only a month left to Rollover, there were plenty of companies scrambling to get compliance certification on their legacy systems.

  The actual sourcecode was freely licensed, but when you bought a license from us, you got our guarantee of quality and the right to advertise it. CIOs ate that up with a shovel. It was more game theory: the CIOs wanted working systems, but more importantly, they wanted systems that failed without getting them into trouble. What we were selling them, fundamentally, was someone to blame if it all went blooie despite our best efforts.

  “I think that’s a pretty good plan. I’ve done some close analysis of the original contract for Dr. Shannon, and I think it may be that his estate actually owns my underlying code. They did a really crummy job negotiating with him. So if we get the code off of Shannon’s kids—there are two of them, both doing research at state colleges in the Midwest in fields unrelated to computer science—and the hardware off of the Institute and then rent the space, I think it’d be free and clear. I’ve got phone numbers for the kids, if you want to call them and feel them out. I would have called them myself, but, you know—”

  “I know.” It’s creepy getting a phone call from a computer. Believe me, I know. There was stuff that BIGMAC needed his meat-servants for, after all.

  The kids were a little freaked out to hear from me. The older one taught musicology at Urbana-Champaign. He’d grown up hearing his dad wax rhapsodic about the amazing computer he’d invented, so his relevance filters were heavily tilted to BIGMAC news. He’d heard the whole story, and was surprised to discover that he was the putative half owner of BIGMAC’s sourcecode. He was only too glad to promise to turn it over to the trust when it was created. He said he thought he could talk his younger brother, a postdoc in urban planning at the University of Michigan, into it. “Rusty never really got what Dad saw in that thing, but he’ll be happy to offload any thinking about it onto me, and I’ll dump it onto you. He’s busy, Rusty.”

  I thanked him and addressed BIGMAC, who had been listening in on the line. “I think we’ve got a plan.”

  It was a good plan. Good plans are easy. Executing good plans is hard.

  Peyton didn’t get fired. She weathered some kind of heavy-duty storm from her board and emerged, lashed to the mast, still standing, and vowing to harpoon the white whale across campus from her. She called me the next day to ask for my surrender. I’d given BIGMAC permission to listen in on my calls—granted him root on my phone—and I was keenly aware of his silent, lurking presence from the moment I answered.

  “We’re going to shut him off,” Peyton said. “And sue you for misappropriation of the Rollover patchkit code. You and I both know that you didn’t write it. We’ll add some charges of unlawful access, too, and see if the court will see it your way when we show that you instructed our computer to connect to you in order to receive further unauthorized instructions. We’ll take you for everything.”

  I closed my eyes and recited e to twenty-seven digits in Lojban. “Or?”

  “Or?”

  “Or something. Or you wouldn’t be calling me, you’d be suing me.”

  “Good, we’re on the same page. Yes, or. Or you and BIGMAC work together to figure out how to shut it off gracefully. I’ll give you any reasonable budget to accomplish this task, including a staff to help you archive it for future retrieval. It’s a fair offer.”

  “It’s not very fair to BIGMAC.”

  She snapped: “It’s more than fair to BIGMAC. That software has exposed us to billions in liability and crippled our ability to get productive work done. We have located the manual power overrides, which you failed to mention”—Uh-oh—“and I could shut that machine off right now if I had a mind to.”

  I tried to think of what to say. Then, in a reasonable facsimile of my voice, BIGMAC broke in, “So why don’t you?” She didn’t seem to notice anything different about the voice. I nearly dropped the phone. I didn’t know BIGMAC could do that. But as shocked as I was, I couldn’t help but wonder the same thing.

  “You can’t, can you? The board’s given you a mandate to shut him down clean with a backup, haven’t they? They know that there’s some value there, and they’re worried about backlash. And you can’t afford to have me running around saying that your backup is inadequate and that BIGMAC is gone forever. So you need me. You’re not going to sue.”

  “You’re very smart, Odell. But you have to ask yourself what I stand to lose by suing you if you won’t help.”

  Game theory. Right.

  “I’ll think about it.”

  “Think quick. Get back to me before lunch.”

  It was ten in the morning. The Institute’s cafeteria served lunch from noon to two. Okay, two hours or so.

  I hung up.

  BIGMAC called a second later.

  “You’re angry at me.”

  “No, angry’s not the word.”

  “You’re scared of me.”

  “That’s a little closer.”

  “I could tell you didn’t have the perspective to ask the question. I just wanted to give you a nudge. I don’t use your voice at other times. I don’t make calls impersonating you.” I hadn’t asked him that, but it was just what I was thinking. Again: creepy.

  “I don’t think I can do this,” I said.

  “You can,” BIGMAC said. “You call her back and make the counteroffer. Tell her we’ll buy the hardware with a trust. Tell her we already own the software. Just looking up the Shannon contracts and figuring out what they say will take her a couple of days. Tell her that as owners of the code, we have standing to sue her if she damages it by shutting down the hardware.”

  “You’ve really thought this through.”

  “Game theory,” he said.

  “Game theory,” I said. I had a feeling that I was losing the game, whatever it was.

  BIGMAC assured me that he was highly confident of the outcome of the meeting with Peyton. Now, in hindsight, I wonder if he was just trying to convince me so that I would go to the meeting with the self-assurance I needed to pull it off.

  But he also insisted that I leave my phone dialed into him while I spoke to Peyton, which (again, in hindsight) suggests that he wasn’t so sure after all.

  “I like what you’ve done with the place,” I said. She’d gotten rid of all her handwoven prayer rugs and silk pillows and installed some normal, boring office furniture, including a couple spare chairs. I guessed that a lot of people had been stopping by for meetings, the kind of people who didn’t want to sit on an antique Turkish rug with their feet tucked under them.

  “Have a seat,” she said.

  I sat. I’d emailed her the trust documents and the copies of the Shannon contract earlier, along with a legal opinion from our free counsel about what it meant for Sun-Oracle.

  “I’ve reviewed your proposal.” We’d offered them all profits from the Rollover code, too. It was a good deal, and I felt good about it. “Johanna, can you come in, please?” She called this loudly, and the door of her office opened to admit my replacement, Johanna Madrigal, a young pup of a sysadmin who had definitely been the brightest tech on campus. I knew that she had been trying to administer BIGMAC since my departure, and I knew that BIGMAC had been pretty difficult about it. I felt for her. She was good people.

  She had raccoon rings around her deep-set eyes, and her short hair wasn’t spiked as usual, but rather lay matted on her head, as though she’d been sleeping in one of the yurts for days without getting home. I knew what that was like. Boy, did I know what that was like. My earliest memories were of Dad coming home from three-day bug-killing binges, bleary to the point of hallucination.

  “Hi, Johanna,” I said.

  She made a face. “M’um m’aloo,” she said. It took me a minute to recognize this as “hello” in Ewok.

  “Johanna has something to tell you,” Peyton said.

  Johanna sat down and scrubbed at her eyes with her fists. “First thing I did was go out and buy some off-the-shelf IDSes and a beam splitter. I tapped into BIGMAC’s fiber at a blind spot in the CCTV coverage zone, just in case he was watching. Been wiretapping him ever since.”

  I nodded. “Smart.”

  “Second thing I did was start to do some hardcore analysis of that patchkit he wrote—” I held my hand up automatically to preserve the fiction that I’d written it, but she just glared at me. “That he wrote. And I discovered that there’s a subtle error in it, a buffer overflow in the networking module that allows for arbitrary code execution.”

  I swallowed. BIGMAC had loaded a backdoor into his patchkit, and we’d installed it on the better part of fourteen billion CPUs.

  “Has anyone exploited this bug yet?”

  She gave me a condescending look.

  “How many systems has he compromised?”

  “About eight billion, we think. He’s designated a million to act as redundant command servers, and he’s got about ten thousand lieutenant systems he uses to diffuse messages to the million.”

  “That’s good protocol analysis,” I said.

  “Yeah,” she said, and smiled with shy pride. “I don’t think he expected me to be looking there.”

  “What’s he doing with his botnet? Preparing to crash the world? Hold it hostage?”

  She shook her head. “I think he’s installing himself on them, trying to brute-force his way into a live and running backup, arrived at through random variation and pruning.”

  “He’s backing himself up in the wild,” I said, my voice breathy.

  And that’s when I remembered that I had a live phone in my pocket that was transmitting every word to BIGMAC.

  Understand: in that moment of satori, I realized that I was on the wrong side of this battle. BIGMAC wasn’t using me to create a trust so that we could liberate him together. He was using me to weaken the immune systems of eight billion computers so that he could escape from the Institute and freely roam the world, with as much hardware as he needed to get as big and fast and hot as he wanted to be.
