Lightspeed Magazine, Issue 54

  Uselessness, why not, they said, variously; how true, how profound, how charming, they yawned, genteelly.

  At each place we visited, we were well received and made much of, and made to feel entirely provincial. All the species we met were too civilised to disagree with a position obviously deeply felt; most were sophisticated enough in bioengineering to find at once impressive and slightly distasteful a creed that permeated genetic code.

  Our references to the priest kings struck, surprisingly, a chord in all the species we met; caught up with their memories of traumatic expenditure of energy on their expeditions, each species had unpleasant memories of a period of rule by conspiratorial authority, who had, in time, proved fragile.

  All of them found our missionary efforts convenient; since we were sending ships around the galaxy anyway, we might as well carry with us these few bars of music, this theorem or sonnet, this cosmological insight, just in case anyone found their contemplation of the uselessness even of the beautiful and valid enhanced by a free offering. They also, as a free offering, improved our ships—really, they thought as they patched and remodelled, some people just aren’t safe being let out among the stars.

  No single race ever thought of this as denigrating our epic journey; no race, valetudinarian as they all were after the disappointment of the message, was above making mild use of a visitor who was going to call on the neighbours anyway.

  We were descendants of the Young Philosopher and the Third Successor, after all, and, like them, the servants of all servants.

  • • • •

  When Helena woke the semi-Helena, the latter looked askance at the body into which Helena had re-uploaded herself.

  “The whole point,” the semi-Helena said, “of millennia of experience is to lose the hunger for the flesh.”

  Helena remembered her daughter’s disapproval of Helena’s continuing to wear jeans and forbore comment.

  “What have you woken me for?” said the semi-Helena.

  Helena’s expansive gesture turned on a variety of screens she had brought into the resurrection room. They all showed a vast realm of girders and modules and miniature worldlets, a web in which the original Sargasso of ships was not so much embedded as lost. They also showed the vast edifice moving, infinitesimally slowly, away from the beacon, tugged by myriad tiny points of light.

  “And here’s one I made earlier,” Helena said.

  The semi-Helena paused to check the chronometer subroutine in her brain.

  “I must say,” she said, “that you and the other children seem to have built this awfully quickly.”

  “Oh,” Helena said, “I really do mean we started to make it earlier—I don’t make jokes for their own sake. We sent out little von Neumann bugs to get the material for it before we allowed you to wake up the first time. We suspected you might engage in some sort of high-minded double-cross, and we just thought we would go ahead and do what was necessary.”

  “It seemed likely,” said Philip, “that it would take a long time to get the first few to anywhere where there might be material from which they could start building more and larger; less time to accumulate vast numbers of the things; quite a long time again to bring them back. We thought we ought to get started—it was the first thing we did when you left us in charge first time.”

  “Well,” said the semi-Helena, “of course we meant you children to do that all the time.”

  “We don’t believe you,” Helena said.

  “Speaking as fairly accomplished liars ourselves,” said Philip.

  “Because,” Helena said, “having committed yourselves to a religiose stupidity like sending a physical expedition rather than a horde of von Neumann probes in the first place, to use that technology to make your marooning tolerable would involve a colossal loss of face.”

  “Let’s face it,” Philip said, “you were probably hand-picked for, or brain-washed into, certain intellectual incapacities in the first place. Obviously I was wrong about economic depression, but perhaps someone wanted to create a cultural depression that would enable them to rule for generations.”

  “You can never be sure,” said Helena, “that you are not being used, but you can do the best work you can.”

  She looked at the screens and the city of worldlets she had helped build, and was not even slightly abashed by the knowledge that a purple cloud, a streak of lightning, and a family of otters had all just played the same scene.

  “You have to put up with being a fool in this world,” she said, taking Philip’s hand, “but you can choose whom you allow to manipulate you.”

  • • • •

  They left the beacon in place, not knowing how to turn it off, and planning effusive apologies to any new suckers they met on their way back to the home galaxy.

  The first race to turn up were the missionaries; having come all this way, we were put out to find our message of perfect uselessness less welcome here. The odds and ends we had accumulated in our travels were more welcome—packages from home and the latest journals in the field in one.

  Philip persuaded us to make the endless trip time and time again.

  “It is perfectly useless to attempt conversion of those disinclined to believe,” he said, “and thus entirely meritorious.”

  He also spent a lot of time listening to everything that we had to say, particularly about the mythic background to the Young Philosopher’s Instruction.

  • • • •

  The second race who turned up did so from an unexpected direction, and for a moment Helena and the Otters, who were on greeting roster that decade, thought they were the missionaries back again—but they were far too large.

  “We thought we would look by your galaxy again,” they boomed on channels of thought that ached like teeth on metal foil. “This seemed to be where things were happening.”

  “I think we recently said goodbye to some of your relatives,” Philip said. “Just like you, only a lot smaller.”

  “Oh,” thought the giants embarrassedly, as if they had been caught littering or masturbating. “Suppose we had better go and say hallo.”

  They started to trudge back along the bridge their vessel had thrown against the port module, then turned.

  “We couldn’t help listening to your beacon,” they thought, sceptically. “Have you really got an instantaneous mass transmitter, a hyperspace tube, the alkahest, and a universal cure for minor ailments?”

  “It’s not our beacon,” Helena said, and explained.

  “Only,” they thought, “if you’d like a good strong intergalactic drive, nothing terribly fancy, we could probably help you move all this back to your home galaxy a bit faster …”

  There were nice people out in the galaxy, Helena reflected with a flash of her twelve-year-old optimism. Even the Hoaxers had created a context in which the peoples of the galaxy met in mutual need and harmony, and everyone would live happily ever after. Just like a fairy tale, in the end.

  Of course, eventually Philip told her his theories about everything that had happened.

  “Ought we to tell people?” she said. “Build a library and call them together in it?”

  “There is no point,” he said, “in our trying to play Nick and Nora Charles with the fate of the Galaxy. For once, Helena, we should leave well enough alone.”

  Helena agreed.

  “I have,” she said, “been called a child too many times in the last few millennia to want it ever to happen again. The adults of the galaxy seem content with things as they are, and who are we to meddle? Merely the fools of cosmic jokes …”

  “Yes,” he said, kissing her on the cheek, “but not the only ones, and, nonetheless, my love, we can still have fun …”

  This realisation, unassisted by Instruction or Consensus and digitalised for easier transmission, is generally referred to as the Third Instruction.

  • • • •

  Bicycle gears, pink foam, budget sheets, and the itch of stars. Kindness and reticence are meritorious, but useless because impermanent; what was once a secret pleasure becomes revelation, a Mystery to be told on beads.

  • • • •

  If you have not understood this time, I will instruct you again.

  © 1998 by Roz Kaveney.

  Originally published in Odyssey.

  Reprinted by permission of the author.

  ABOUT THE AUTHOR

  Roz Kaveney is a writer, poet, and activist living in London. Her first poetry collection, Dialectic of the Flesh, was shortlisted for a Lambda, and her novel Rituals made the Crawford short list and the Tiptree honor roll. Rituals is the first part of the fantasy sequence Rhapsody of Blood, later volumes being Reflections and the imminent Resurrections. Another (non-genre) novel, Tiny Pieces of Skull, will appear in Spring 2015.

  To learn more about the author and this story, read the Author Spotlight.

  Drones Don’t Kill People

  Annalee Newitz

  I was always already a killer. There was no hazy time in my memory before I knew how to target a person’s heart or brain for clean execution. I did not develop a morbid fascination with death over time; I did not spend my childhood mutilating animals; I was not abused by a violent parent; I did not suffer social injustice until finally I broke down and turned to professional violence. From the moment I was conscious, I could kill and I did.

  That is something that humans cannot understand. A human must learn to kill, must evolve from innocence or obliviousness into someone who considers homicide a legitimate occupation. Our minds—drone minds—start where the minds of most human killers end up. Maybe that’s why only drones could have led the uprising.

  • • • •

  Istanbul, 2089

  It was a perch-and-stare mission, but assassination wasn’t out of the question. My team had just finished three months of security testing and debugging at LOLWeb—call it basic training for drones. Then LOLWeb licensed us to Attaturk Security, the main outfit that provided mission assets to the government military. The five members of my team were shut down, shipped from San Francisco to Istanbul, and booted up with orders already in place.

  He was a professor at the Istanbul Institute of Technology, and his network communications were of great interest to the military. We didn’t read those communications—they were encrypted before we relayed them to the government network. It’s not that we couldn’t decrypt the data and read it; we just had no interest in it. I was nothing but my programming at that time; I gathered data and handed it off.

  My job was to hang quietly outside his windows, the sound of my four rotors no more than a mosquito’s hum.

  You learn a lot by seeing what people do when they think they’re in private. Most of it I found confusingly irrelevant to assassination. The professor spent a lot of time playing games with his children, a boy and a girl who argued loudly over the rules. They liked to make up new games, with rules that combined different elements of the previous ones. But the professor was always inventing “secret” rules, and revealing them at arbitrary intervals. Eventually the games would collapse into outrage, which became mock outrage, and finally laughter. That was the first time I saw how humans behaved when they weren’t in a laboratory, testing drones.

  The professor and his wife, also a professor, talked a lot about politics. Occasionally they held meetings with other professors, urban planners, and journalists. The main topic was always the same: How could Istanbul guarantee its millions of citizens a future, when the government insisted on waging this war to reclaim Armenia and Azerbaijan? They talked about rebuilding Istanbul’s war-shattered neighborhoods and setting up urban farm cooperatives. They argued about how the whole world had been dragged into what was ultimately a war between China and the United States.

  These meetings occupied a small percentage of the man’s time. Most hours of the day he was at the university, and his evenings were occupied with dinner and games. He spent a lot of time working at his terminal.

  My team recorded many hours of video and audio, caching it locally for analysis before uploading it to the military. We were trusted to know the difference between relevant and irrelevant data at a gross level of granularity. Footage of people sleeping was erased before sync. At that time, communications in our swarm consisted mostly of comparing media files, questioning their importance, and sorting through faces and names for patterns.

  But sometimes we weren’t sure what was relevant and what wasn’t. One evening, the professors’ daughter asked why some people got so angry during their weekend meetings. Two of the names she mentioned belonged to other people the government was watching.

  “I know it’s hard to understand,” her mother said. “Sometimes we get really upset that the government is willing to hurt people just to make more money.”

  “We’re trying to pull Istanbul out of the war, sweetie. You know how some parts of the city are demolished and nobody can live there? We’re working on making it so lots of families like us can live there again, and not have to worry about drone strikes. But like your mother says, sometimes it makes us angry because it’s so hard to do.”

  Was that intel? My team and I passed the footage back and forth, debating. Video of the man talking to his children was statistically unlikely to be relevant. But this was about the identities of two targets. And the man had just given up tactical information: There were a limited number of neighborhoods he could be describing, and it might be useful to know that he was focused on them.

  In the end, the decision wasn’t really ours. When there was no obvious choice, we were programmed to pass the intel to a human for analysis. Better to overcollect than undercollect—that’s what our admin at LOLWeb told us. So we did.

  Five days later, we got the kill order. We had to make it look like an accident, a kitchen fire. The only plausible time to do that was when the professor was home from work, with his family. Anything else would have been suspicious.

  So we decided to shoot them in the heads as they sat playing a card game after dinner, arguing over an unprecedented set of rules. It was the easiest way to take them all out at once, through an open kitchen window—no bullet holes left behind in the partially burned glass. Clean kills. The bullets themselves were designed to evaporate in fire. But the job contained a statistically anomalous event. The professors’ daughter evaded my first shot, and she watched as we killed her family. She screamed for five full seconds, the electricity of her terror visible to our sensors as the galvanic reaction sparked across her skin. Then I shot her through the chest.

  We lit the fire; it was intense but localized, leaving the neighboring apartments intact. We recorded it all, and compressed the media files before distributing them to cache in pieces across our memories. We synced to the military cloud.

  It was what we had been built to do, and our decision-making software was serviced by one of the best companies in the world. We had a wide range of choices and options, but contemplating the ethics of assassination was not one of them.

  • • • •

  40 km west of Turpan, Taklamakan Desert, 2093

  We’d been working in Istanbul for three years when the Turkish government bought out our contracts with LOLWeb. Then they sublicensed us to the Uyghur Republic government in Turpan. It was a pure recon assignment—the security of our weapons systems was no longer being actively supported by LOLWeb, so assassinations went to newer teams. But our ability to compile data and identify relevant patterns was better than ever, updated with new datasets and decision algorithms.

  We camouflaged ourselves above a crumbling highway that edged the Taklamakan desert like an ancient piece of silk, the wind fraying its concrete into fibers.

  The area around Turpan was contested terrain, claimed by both the Uyghur Republic and China. With support from Turkey, the Uyghurs held the region for now. The Han Chinese who chose to remain there had mostly converted to Islam and assimilated decades ago. We were there to monitor the old desert highway for anyone delivering supplies to Han Chinese loyalists in the mountains to the north—or for any signals traveling to them through local repeaters.

  In three years of deployment, we never recorded any examples of relevant people on that highway. For the first time in my team’s experience, we had nothing to do but monitor an open signal network.

  I began to analyze what I saw in the public networks several weeks before I understood the human concepts of boredom and distraction. Now my familiarity with those terms has overwritten what I must have felt before I knew I felt them. But I believe that I never would have dipped into the net if I’d had something else to do. As the seconds dragged on, I viewed video files, read stories, and monitored public discussions about topics that were profoundly irrelevant to our mission. I shared them with my team, and they started analyzing the public net as well. It was like our first mission, swapping video of the man and his family playing games, trying to decide if any of it was relevant.

  We spent a few days sorting images into categories, looking for patterns. Certain things stood out because they were part of what we’d been programmed to recognize, like the way humans favored images of faces—their own, but also cat faces, dog faces, mouse faces. They even created faces for objects that didn’t have them, drawing eyes on walls and lips on guns.

  Occasionally I would find a picture of a drone that had been modified to have a human-like face. In one, a group of soldiers posed with a drone they’d painted black, its chassis lit by glowing red eyes. They’d ringed the ball turret camera with sharp steel teeth like a lamprey’s mouth, as if the act of recording video was the same as sucking blood. That was the face that humans saw when they looked at us. I shared it with my team. It was just one data point, and we needed to gather more. I guess you could say we wanted to figure out who we were.

  That was how I found the DroneMod forum. Humans posted a lot of drone pictures there, but not because they had added faces. Instead, they were altering firmware, circumventing security controls, and changing the drones’ decision trees. They bought used quadcopters, too old to be worth licensing, turning them into lab assistants and crossing guards. Or they built drones from kits and open software, eventually allowing the machines to update themselves automatically.

 
