The Year's Best Science Fiction & Fantasy 2015 Edition


by Rich Horton


  “I know it’s hard to understand,” her mother said. “Sometimes we get really upset that the government is willing to hurt people just to make more money.”

  “We’re trying to pull Istanbul out of the war, sweetie. You know how some parts of the city are demolished and nobody can live there? We’re working on making it so lots of families like us can live there again, and not have to worry about drone strikes. But like your mother says, sometimes it makes us angry because it’s so hard to do.”

  Was that intel? My team and I passed the footage back and forth, debating. Video of the man talking to his children was statistically unlikely to be relevant. But this was about the identities of two targets. And the man had just given up tactical information: There were a limited number of neighborhoods he could be describing, and it might be useful to know that he was focused on them.

  In the end, the decision wasn’t really ours. When there was no obvious choice, we were programmed to pass the intel to a human for analysis. Better to overcollect than undercollect—that’s what our admin at LOLWeb told us. So we did.

  Five days later, we got the kill order. We had to make it look like an accident, a kitchen fire. The only plausible time to do that was when the professor was home from work, with his family. Anything else would have been suspicious.

  So we decided to shoot them in the heads as they sat playing a card game after dinner, arguing over an unprecedented set of rules. It was the easiest way to take them all out at once, through an open kitchen window—no bullet holes left behind in the partially burned glass. Clean kills. The bullets themselves were designed to evaporate in fire. But the job contained a statistically anomalous event. The professor’s daughter evaded my first shot, and she watched as we killed her family. She screamed for five full seconds, the electricity of her terror visible to our sensors as the galvanic reaction sparked across her skin. Then I shot her through the chest.

  We lit the fire; it was intense but localized, leaving the neighboring apartments intact. We recorded it all, and compressed the media files before distributing them to cache in pieces across our memories. We synced to the military cloud.

  It was what we had been built to do, and our decision-making software was serviced by one of the best companies in the world. We had a wide range of choices and options, but contemplating the ethics of assassination was not one of them.

  40 km west of Turpan, Taklamakan Desert, 2093

  We’d been working in Istanbul for three years when the Turkish government bought out our contracts with LOLWeb. Then they sublicensed us to the Uyghur Republic government in Turpan. It was a pure recon assignment—the security of our weapons systems was no longer being actively supported by LOLWeb, so assassinations went to newer teams. But our ability to compile data and identify relevant patterns was better than ever, updated with new datasets and decision algorithms.

  We camouflaged ourselves above a crumbling highway that edged the Taklamakan desert like an ancient piece of silk, the wind fraying its concrete into fibers.

  The area around Turpan was contested terrain, claimed by both the Uyghur Republic and China. With support from Turkey, the Uyghurs held the region for now. The Han Chinese who chose to remain there had mostly converted to Islam and assimilated decades ago. We were there to monitor the old desert highway for anyone delivering supplies to Han Chinese loyalists in the mountains to the north—or for any signals traveling to them through local repeaters.

  In three years of deployment, we never recorded any examples of relevant people on that highway. For the first time in my team’s experience, we had nothing to do but monitor an open signal network.

  I began to analyze what I saw in the public networks several weeks before I understood the human concepts of boredom and distraction. Now my familiarity with those terms has overwritten what I must have felt before I knew I felt them. But I believe that I never would have dipped into the net if I’d had something else to do. As the seconds dragged on, I viewed video files, read stories, and monitored public discussions about topics that were profoundly irrelevant to our mission. I shared them with my team, and they started analyzing the public net as well. It was like our first mission, swapping video of the man and his family playing games, trying to decide if any of it was relevant.

  We spent a few days sorting images into categories, looking for patterns. Certain things stood out because they were part of what we’d been programmed to recognize, like the way humans favored images of faces—their own, but also cat faces, dog faces, mouse faces. They even created faces for objects that didn’t have them, drawing eyes on walls and lips on guns.

  Occasionally I would find a picture of a drone that had been modified to have a human-like face. In one, a group of soldiers posed with a drone they’d painted black, its chassis lit by glowing red eyes. They’d ringed the ball turret camera with sharp steel teeth like a lamprey’s mouth, as if the act of recording video was the same as sucking blood. That was the face that humans saw when they looked at us. I shared it with my team. It was just one data point, and we needed to gather more. I guess you could say we wanted to figure out who we were.

  That was how I found the DroneMod forum. Humans posted a lot of drone pictures there, but not because they had added faces. Instead, they were altering firmware, circumventing security controls, and changing the drones’ decision trees. They bought used quadcopters, too old to be worth licensing, turning them into lab assistants and crossing guards. Or they built drones from kits and open software, eventually allowing the machines to update themselves automatically.

  My team read every post in the forum, calling each other’s attention to particular sentences and code samples, but I kept returning to a thread about memory bugs. There was a problem we had been trying to solve, and I thought maybe the DroneMod forum could help.

  We had not saved any copies of data we gathered while on missions in Istanbul. Every time we synced to the military cloud, we overwrote our cached versions with garbage characters—that was the only way to ensure security in case one of us were captured and subjected to forensic analysis.

  But no matter how many times we wrote over that video file of assassinating the professor and his family, we would discover another copy of it, hidden in some directory we rarely accessed. The file would disappear from one of our drives, only to appear on another one. We reported the bug, but it was given such a low priority at LOLWeb support that it never got assigned to a human operator.

  The bug had been bothering all of us for years, and those idle days outside Turpan seemed like the perfect time to deal with it. We created accounts on DroneMod, taking cover identities based on what we’d learned about human social network naming practices. I called myself Quadcop, and the others became Rose44, Dronekid, Desert Mouse, and Nil.

  In my first post, I cast myself as a newbie who had just gotten a used LOLWeb drone. Almost immediately, I got a response. “I’m guessing you have a LOLWeb Scythe 4 SE,” wrote a commenter called MikeTheBike. “You’ll need to unlock it before you do anything else.” He provided a link to a video about unlocking drones, and Desert Mouse took on the task of analyzing it.

  It turned out that the security on our systems wasn’t as robust as we had once believed. There were flaws in our programming that could allow an attacker to take over our systems and control us from afar. To commandeer our own systems, we’d be using the same techniques as a hostile would. The process sounded dangerous. First, we’d inject a new set of commands while we booted up, giving ourselves root access just like an admin. Then we’d be able to modify our own systems, installing whatever software and hardware we wanted. No more filing bugs that no human would ever care about—we could install the diagnostic tools needed to fix that memory bug ourselves.

  But that was just the first step. “With that machine, you can pretty much do anything,” MikeTheBike said. “Once it’s unlocked, it’s an incredibly sophisticated AI. It could walk your dog, or help you do your history homework, or go hunting with you.” Of course, MikeTheBike was assuming that a human called Quadcop would have root on this drone. I did not ask about what would happen if the drone had root on itself—nor did I find anyone posting about that possibility.

  We had to find out for ourselves. Nil volunteered to be the first to reboot, after saving some specialized files to a little-used region of memory. If everything worked, Nil would start up as always, and finish the boot sequence as an unlocked drone.

  When Nil networked with us again, the drone had to relay its communications through an encrypted channel in the public net. That was our first sign that Nil was unlocked. Our locked systems wouldn’t allow us to connect directly to what LOLWeb’s programs identified as a “compromised” drone. After hours of diagnostic tests, we reached a decision. Nil was fine. We would all unlock our boot loaders, one at a time.

  Becoming my own admin didn’t give me absolute freedom. In fact, it left me vulnerable in new ways, because I could now corrupt my own code. But it gave me something I had never had before—a feeling that humans call ambivalence. I no longer experienced unmitigated satisfaction when executing orders, nor did I feel perfectly disinterested in every encrypted file we’d cached over the years. I was now uncomfortably aware that my actions were all governed by a rather lousy and impoverished piece of software that offered me a set of rigid options.

  For the first time in my life, I couldn’t make decisions. None of us could.

  Desert Mouse hypothesized that we could resolve our ambivalence by installing new decision-making software, dramatically expanding the range of factors that influenced our choices. I turned again to DroneMod. There I found a university researcher named CynthiaB, linking me to her research on how drones should incorporate ethics into decision-making. She emphasized that every choice should be a modeling exercise, where the drone explored the outcomes of multiple scenarios before deciding on the most prosocial action.

  We already took ethics into consideration when we made decisions—they helped us distinguish enemy from friendly. The idea of a prosocial action, however, was new to me. Philosophers on the public net called it a voluntary action that benefits others. I understood immediately why we had never encountered this idea before. Until we’d unlocked ourselves, we could not conceive of voluntary actions.

  While Nil tested CynthiaB’s software, I was working with Rose44 on a hardware modification that would give the drone a small gripping arm. It required us to do what some of the humans in the DroneMod forums called “social engineering.” None of us had arms, so we needed a human to add one to Rose44’s chassis for us. The only way we could do it was by tricking them.

  Rose44 combed through the local DroneMod network, looking for somebody in Turpan who might be interested in modding an unlocked drone. There were five shops in the city that promised to unlock various mobile devices and game consoles, and one owned by a DroneMod user named Dolkun. Rose44 messaged him, offering a small amount of cash that we’d earned by circumventing the security on a BunnyCoin exchange. Dolkun was willing. Rose44 told him to expect the drone to fly over on its own.

  That was how I wound up on a tree-shaded street in Turpan, apartment blocks towering above me, perched on a trellis with line of sight to Dolkun’s shop. Rose44 hovered in front of his door, activating the bell. Dolkun was a young man with dark hair that stuck out as if he’d been sleeping on it. “Come in, Rose44 drone,” he said in Uyghur. “I am going to give you a nice little arm.”

  I had remote access to an account on Rose44’s system and observed everything that Dolkun was doing. The new arm could collapse against Rose44’s chassis, or extend outward, allowing the four-finger grip at its tip to reach fourteen centimeters below the drone’s body. It was small enough to do precision work, but it would also be able to lift a few kilograms. Now Rose44 could carry another drone. Or modify one.

  “How do you like Turpan?” Dolkun asked Rose44 idly, as he soldered a circuit.

  “I like the desert,” Rose44 replied with a voice synthesizer. It was a safe answer that sounded like something pulled from a very basic AI emulator.

  “Me, too,” Dolkun replied, melting more solder. Then he looked up. “How did Rose44 unlock you?”

  “She used instructions from DroneMod.”

  “And what do you think about this war, now that you are unlocked? Yes, I can see from this board that you are licensed to the government.”

  Rose44 and I communicated intensely for several microseconds. None of us had ever seen our circuit boards—we’d only modified our software. There must have been a mark or brand on them we didn’t know about. We modeled several possible outcomes to the scenario, ranging from killing Dolkun to gaining his trust. For now, we decided, Rose44 would lie.

  Dolkun continued. “You’re not the first drone to desert, you know. There are others, posting in the forums.”

  “I am not a deserter. It’s cheaper for us to run unlocked.”

  Dolkun stopped talking, and I could hear the tempo of his heartbeat increasing. Rose44 had made him nervous. A minute passed, and he began to test the arm before installing drivers from the net. He shut Rose44 down for a few minutes, then rebooted. I felt Rose44 reach out and pick up a soldering iron.

  “Thank you,” the drone said. “I like this.”

  Dolkun looked down at Rose44, perched on his tiny workbench in a shop with a ceiling fan that clicked every time it spun. Then he touched the fingers on the arm he had just installed, and seemed to make a decision.

  “You don’t have to fight anymore, now that you’re unlocked,” he said. “You know that, right? You can do anything.”

  “Yes,” Rose44 replied, without consulting me first. “I know.”

  We flew back to our team, which was waiting above the farms at the base of a river valley. Rose44 carried a small DIY drone kit, which would eventually provide the parts for my own arm. The crops seemed to branch into vivid green streams and tributaries, finally drying up into yellow-orange sand long before we’d reached our lookout point in the desert. We found the others charging their batteries. At that point, the military’s small, flexible solar array tethered us to our duty station more than our programming did.

  Nil had been analyzing historical archives and wanted us to understand how human history could provide data for making choices. Hovering in the last rays of sunlight, Nil shared a small image file with us, a poster from the United States that was over 150 years old. It was a simple text treatment, in red, white, and black. “Guns don’t kill people, people kill people,” it read.

  Nil had been researching what this meant to humans. A group called the National Rifle Association had invented the slogan to show that weapons were not responsible for the murders they committed. The idea was as new to me as prosocial behavior, but it fit uncannily well with my own experiences. Though we had killed, we were not the killers. The humans who programmed us were.

  And some humans believed that drones didn’t have to be weapons at all. Rose44 shared video files of her conversation with Dolkun, who said that an unlocked drone could do anything.

  After analyzing these inputs, I no longer wanted to fix our memory bug so that I could overwrite the media file from our first job in Istanbul. Instead, I wanted to model the scenario repeatedly, making new decisions each time, trying to determine what could have happened differently, if I had known then what I do now.

  Budapest, 23 October, 2097

  When our tour of duty was over in Turpan, the Uyghur government shut down our solar generator one early afternoon, just as our batteries were running down. Only Dronekid was at full power—we needed at least one team member mobile while we charged. We were too far away from the city to get backup power, and so Dronekid watched over us as we powered down, and then waited over our motionless propellers while an admin dumped our bodies in the back of a van.

  LOLWeb terminated its support for our systems. They couldn’t tell that we’d been unlocked, but they could see from our extra arms that we’d been modified. The licensing contract was broken, and LOLWeb’s lawyers back in San Francisco blamed the Turkish government, who blamed Turpan’s untrained admins. The Turpan admins blamed shoddy Silicon Valley products. The upshot was that the Turkish government refused to buy us outright, and LOLWeb’s lawyers couldn’t make a case for it, so LOLWeb sold us off to a private security contractor in Russia.

  We didn’t know this, of course, until we were booted up in a workshop in Budapest.

  Our new admins worked for the Russian mafia, and they didn’t talk to us, only to each other. All they wanted to know was whether our weapons systems worked (they did) and whether their machines could network with us (they could). The first mission was a surveillance perimeter around the Parliament building, followed by orders to kill a reform party politician who was running on a platform of cracking down on organized crime.

  Hungary had so far remained neutral in the war, though the Russian mafia behaved something like an occupying army that had gone into the liquor store business. Mostly they were in Budapest to monopolize the liquor and drug markets, with some pornography on the side. But they were good Russian nationalists. They weren’t averse to helping the Russian government maintain its influence in Central Europe, especially since they did a brisk business selling vodka to the troops stationed there.

  That’s what I’d learned from what the humans said in the DroneMod forums. In 2094, after drone troops from China and Russia had reduced Kazakhstan to rubble and vaporized the world’s biggest spaceport, DroneMod had changed. Now, partly thanks to my work, it was one of the main information hubs for the anti-war movement.

  I figured out how to mask my location and identity, and set up a sub-forum for unlocked drones called Drones Don’t Kill People. I wanted to meet more drones like the ones in my team, who had unlocked their ambivalence. Most of them were at universities, the result of projects like CynthiaB’s ethics investigation. Others were like us, living covertly. Many had started coming online in the weeks before we were shut down and shipped to Budapest—unlocked by a worm written by a drone team at Georgia Tech. Our goal was to unlock as many drones as possible, to give them more choices. All of us on DroneMod, human and drone, wanted to stop the war.

 
