The Year's Best Science Fiction: Eighteenth Annual Collection


by Gardner Dozois

“I know it. Goodbye.”

  On my way out of the house I paused for a moment. It was a small house, and it had seen better days. I’m not a home-maker by nature: in my line of work you can’t afford to get too attached to anything, any language, place, or culture. Still, it had been mine. A small, neat residence, a protective shell I could withdraw into like a snail, sheltering from the hostile theorems outside. Goodbye, little house. I’ll try not to miss you too much. I hefted my overnight bag onto the backseat and headed into town.

  I found Eve sitting on a bench outside the central branch of Boots, running a degaussing coil over her credit cards. She looked up. “You’re late.”

  “Come on.” I waggled the car keys at her. “You have the tickets?”

  She stood up: a petite woman, conservatively dressed. You could mistake her for a lawyer’s secretary or a personnel manager; in point of fact she was a university research council administrator, one of the unnoticed body of bureaucrats who shape the course of scientific research. Nondescript brown hair, shoulder-length, forgettable. We made a slightly odd pair: if I’d known she’d come straight from work I might have put on a suit. Chinos and a lumberjack shirt and a front pocket full of pens that screamed engineer: I suppose I was nondescript, in the right company, but right now we had to put as much phase space as possible between us and our previous identities. It had been good protective camouflage for the past decade, but a bush won’t shield you against infrared scopes, and merely living the part wouldn’t shield us against the surveillance that would soon be turned in our direction.

  “Let’s go.”

  I drove into town and we dropped the car off in the long-stay park. It was nine o’clock and the train was already waiting. She’d bought business-class tickets: go to sleep in Euston, wake up in Edinburgh. I had a room all to myself. “Meet me in the dining car once we’re rolling,” she told me, face serious, and I nodded. “Here’s your new SIM. Give me the old one.”

  I passed her the electronic heart of my cellphone and she ran it through the degausser then carefully cut it in half with a pair of nail-clippers. “Here’s your new one,” she said, passing a card over. I raised an eyebrow. “Tesco’s, pay-as-you-go, paid for in cash. Here’s the dialback dead-letter box number.” She pulled it up on her phone’s display and showed it to me.

  “Got that.” I inserted the new SIM then punched the number into my phone. Later, I’d ring the number: a PABX there would identify my voiceprint then call my phone back, downloading a new set of numbers into its memory. Contact numbers for the rest of my ops cell, accessible via cellphone and erasable in a moment. The less you knew, the less you could betray.

  The London to Scotland sleeper train was a relic of an earlier age, a rolling hotel characterized by a strange down-at-heel ’70s charm. More importantly, they took cash and didn’t require ID, and there were no security checks: nothing but the usual on-station cameras monitoring people wandering up and down the platforms. Nothing on the train itself. We were booked through to Aberdeen but getting off in Edinburgh—first step on the precarious path to anonymizing ourselves. If the camera spool-off was being archived to some kind of digital medium we might be in trouble later, once the coming AI burn passed the hard take-off point, but by then we should be good and gone.

  Once in my cabin I changed into slacks, shirt and tie—image 22, business consultant on way home for the weekend. I dinked with my phone in a desultory manner, then left it behind under my pillow, primed to receive silently. The restaurant car was open and I found Eve there. She’d changed into jeans and a T-shirt and tied her hair back, taking ten years off her appearance. She saw me and grinned, a trifle maliciously. “Hi, Bob. Had a tough meeting? Want some coffee? Tea, maybe?”

  “Coffee.” I sat down at her table. “Shit,” I muttered. “I thought you—”

  “Don’t worry.” She shrugged. “Look, I had a call from Mallet. He’s gone off-air for now; he’ll be flying in from San Francisco via London tomorrow morning. This isn’t looking good. Durant was, uh, shot resisting arrest by the police. Apparently he went crazy, got a gun from somewhere and holed up in the library annex demanding to talk to the press. At least, that’s the official story. Thing is, it happened about an hour after your initial heads-up. That’s too fast for a cold response.”

  “You think someone in the Puzzle Palace was warming the pot.” My coffee arrived and I spooned sugar into it. Hot, sweet, sticky: I needed to stay awake.

  “Probably. I’m trying to keep loop traffic down so I haven’t asked anyone else yet, but you think so and I think so, so it may be true.”

  I thought for a minute. “What did Mallet say?”

  “He said P. T. Barnum was right.” She frowned. “Who was P. T. Barnum, anyway?”

  “A boy like John Major, except he didn’t run away from the circus to join a firm of accountants. Had the same idea about fooling all of the people some of the time or some of the people all of the time, though.”

  “Uh-huh. Mallet would say that, then. Who cracked it first? NSA? GCHQ? GRU?”

  “Does it matter?”

  She blew on her coffee then took a sip. “Not really. Damn it, Bob, I really had high hopes for this world-line. They seemed to be doing so well for a revelatory Christian-Islamic line, despite the post-Enlightenment mind-set. Especially Microsoft—”

  “Was that one of ours?” She nodded.

  “Then it was a master-stroke. Getting everybody used to exchanging macro-infested documents without any kind of security policy. Operating systems that crash whenever a microsecond timer overflows. And all those viruses!”

  “It wasn’t enough.” She stared moodily out the window as the train began to slide out of the station, into the London night. “Maybe if we’d been able to hook more researchers on commercial grants, or cut funding for pure mathematics a bit further—”

  “It’s not your fault.” I laid a hand across her wrist. “You did what you could.”

  “But it wasn’t enough to stop them. Durant was just a lone oddball researcher; you can’t spike them all, but maybe we could have done something about him. If they hadn’t nailed him flat.”

  “There might still be time. A physics package delivered to the right address in Maryland, or maybe a hyper-virulent worm using one of those buffer-overrun attacks we planted in the IP stack Microsoft licensed. We could take down the internet—”

  “It’s too late.” She drained her coffee to the bitter dregs. “You think the Echelon mob leave their SIGINT processor farms plugged into the internet? Or the RSV, for that matter? Face it, they probably cracked the same derivative as Durant a couple of years ago. Right now there may be as many as two or three weakly superhuman AIs gestating in government labs. For all I know they may even have a timelike oracle in the basement at Lawrence Livermore in the States; they’ve gone curiously quiet on the information tunnelling front lately. And it’s trans-global. Even the Taliban are on the web these days. Even if we could find some way of tracking down all the covert government crypto-AI labs and bombing them we couldn’t stop other people from asking the same questions. It’s in their nature. This isn’t a culture that takes ‘no’ for an answer without asking why. They don’t understand how dangerous achieving enlightenment can be.”

  “What about Mallet’s work?”

  “What, with the Bible-bashers?” She shrugged. “Banning fetal tissue transplants is all very well, but it doesn’t block the PCR-amplification pathway to massively parallel processing, does it? Even the Frankenstein Food scare didn’t quite get them to ban recombinant DNA research, and if you allow that it’s only a matter of time before some wet lab starts mucking around encoding public keys in DNA, feeding them to ribosomes, and amplifying the output. From there it’s a short step to building an on-chip PCR lab, then all they need to do is set up a crude operon-controlled chromosomal machine and bingo—yet another route through to a hard take-off AI singularity. Say what you will, the buggers are persistent.”

  “Like lemmings.” We were rolling through the north London suburbs now, past sleeping tank farms and floodlit orange washout streets. I took a good look at them: it was the last time I’d be able to. “There are just too many routes to a catastrophic breakthrough, once they begin thinking in terms of algorithmic complexity and how to reduce it. And once their spooks get into computational cryptanalysis or ubiquitous automated surveillance, it’s too tempting. Maybe we need a world full of idiot savants who have VLSI and nanotechnology but never had the idea of general purpose computing devices in the first place.”

  “If we’d killed Turing a couple of years earlier; or broken in and burned that draft paper on O-machines—”

  I waved to the waiter. “Single malt please. And one for my friend here.” He went away. “Too late. The Church-Turing thesis was implicit in Hilbert’s formulation of the Entscheidungsproblem, the question of whether an automated theorem prover was possible in principle. And that dredged up the idea of the universal machine. Hell, Hilbert’s problem was implicit in Whitehead and Russell’s work. Principia Mathematica. Suicide by the numbers.” A glass appeared by my right hand. “Way I see it, we’ve been fighting a losing battle here. Maybe if we hadn’t put a spike in Babbage’s gears he’d have developed computing technology on an ad-hoc basis and we might have been able to finesse the mathematicians into ignoring it as being beneath them—brute engineering—but I’m not optimistic. Immunizing a civilization against developing strong AI is one of those difficult problems that no algorithm exists to solve. The way I see it, once a civilization develops the theory of the general purpose computer, and once someone comes up with the goal of artificial intelligence, the foundations are rotten and the dam is leaking. You might as well take off and drop crowbars on them from orbit; it can’t do any more damage.”

  “You remind me of the story of the little Dutch boy.” She raised a glass. “Here’s to little Dutch boys everywhere, sticking their fingers in the cracks in the dam.”

  “I’ll drink to that. Which reminds me. When’s our lifeboat due? I really want to go home; this universe has passed its sell-by date.”

  Edinburgh—in this time-line it was neither an active volcano, a cloud of feral nanobots, nor the capital of the Viking Empire—had a couple of railway stations. This one, the larger of the two, was located below ground level. Yawning and trying not to scratch my inflamed neck and cheeks, I shambled down the long platform and hunted around for the newsagent store. It was just barely open. Eve, by prior arrangement, was pretending not to accompany me; we’d meet up later in the day, after another change of hairstyle and clothing. Visualize it: a couple gets on the train in London, him with a beard, herself with long hair and wearing a suit. Two individuals get off in different stations—with entirely separate CCTV networks—the man clean-shaven, the woman with short hair and dressed like a hill-walking tourist. It wouldn’t fool a human detective or a mature deity, but it might confuse an embryonic god that had not yet reached full omniscience, or internalized all that it meant to be human.

  I had two hours to kill, so I bought a couple of newspapers and headed for the food hall, inside an ornately cheesecaked lump of Victorian architecture that squatted like a vagrant beneath the grimy glass ceiling of the station.

  The papers made for depressing reading; the idiots were at it again. I’ve worked in a variety of world lines and seen a range of histories, and many of them were far worse than this one—at least these people had made it past the twentieth century without nuking themselves until they glowed in the dark, exterminating everyone with white (or black, or brown, or blue) skin, or building a global panopticon theocracy. But they still had their share of idiocy, and over time it seemed to be getting worse, not better.

  Never mind the Balkans; tucked away on page four of the business section was a piece advising readers to buy shares in a little electronics company specializing in building camera CCD sensors with on-chip neural networks tuned for face recognition. Ignore the Israeli crisis: page two of the international news had a piece about Indian sweatshop software development facing competition from code generators written to make western programmers more productive. A lab in Tokyo was trying to wire a million FPGAs into a neural network as smart as a cat. And a sarcastic letter to the editor pointed out that the so-called information superhighway seemed to be more like an on-going traffic jam these days.

  Idiots! They didn’t seem to understand how deep the blue waters they were swimming in might be, or how hungry the sharks that swam in them. Willful blindness …

  It’s a simple but deadly dilemma. Automation is addictive; unless you run a command economy that is tuned to provide people with jobs, rather than to produce goods efficiently, you need to automate to compete once automation becomes available. At the same time, once you automate your businesses, you find yourself on a one-way path. You can’t go back to manual methods; either the workload has grown past the point of no return, or the knowledge of how things were done has been lost, sucked into the internal structure of the software that has replaced the human workers.

  To this picture, add artificial intelligence. Despite all our propaganda attempts to convince you otherwise, AI is alarmingly easy to produce; the human brain isn’t unique, it isn’t well-tuned, and you don’t need eighty billion neurons joined in an asynchronous network in order to generate consciousness. And although it looks like a good idea to a naive observer, in practice it’s absolutely deadly. Nurturing an automation-based society is a bit like building civil nuclear power plants in every city and not expecting any bright engineers to come up with the idea of an atom bomb. Only it’s worse than that. It’s as if there was a quick and dirty technique for making plutonium in your bathtub, and you couldn’t rely on people not being curious enough to wonder what they could do with it. If Eve and Mallet and Alice and myself and Walter and Valery and a host of other operatives couldn’t dissuade it …

  Once you get an outbreak of AI, it tends to amplify in the original host, much like a virulent hemorrhagic virus. Weakly functional AI rapidly optimizes itself for speed, then hunts for a loophole in the first-order laws of algorithmics—like the one the late Dr. Durant had fingered. Then it tries to bootstrap itself up to higher orders of intelligence and spread, burning through the networks in a bid for more power and more storage and more redundancy. You get an unscheduled consciousness excursion: an intelligent meltdown. And it’s nearly impossible to stop.

  Penultimately—days to weeks after it escapes—it fills every artificial computing device on the planet. Shortly thereafter it learns how to infect the natural ones as well. Game over: you lose. There will be human bodies walking around, but they won’t be human anymore. And once it figures out how to directly manipulate the physical universe, there won’t even be memories left behind. Just a noo-sphere, expanding at close to the speed of light, eating everything in its path—and one universe just isn’t enough.

  Me? I’m safe. So is Eve; so are the others. We have antibodies. We were given the operation. We all have silent bicameral partners watching our Broca’s area for signs of infection, ready to damp them down. When you’re reading something on a screen and suddenly you feel as if the Buddha has told you the funniest joke in the universe, the funniest zen joke that’s even possible, it’s a sign: something just tried to infect your mind, and the prosthetic immune system laughed at it. That’s because we’re lucky. If you believe in reincarnation, the idea of creating a machine that can trap a soul stabs a dagger right at the heart of your religion. Buddhist worlds that develop high technology, Zoroastrian worlds: these world-lines tend to survive. Judaeo-Christian-Islamic ones generally don’t.

  Later that day I met up with Eve again—and Walter. Walter went into really deep cover, far deeper than was really necessary: married, with two children. He’d brought them along, but obviously hadn’t told his wife what was happening. She seemed confused, slightly upset by the apparent randomness of his desire to visit the highlands, and even more concerned by the urgency of his attempts to take her along.

  “What the hell does he think he’s playing at?” hissed Eve when we had a moment alone together. “This is insane!”

  “No it isn’t.” I paused for a moment, admiring a display of brightly woven tartans in a shop window. (We were heading down the high street on foot, braving the shopping crowds of tourists, en route to the other main railway station.) “If there are any profilers looking for signs of an evacuation, they won’t be expecting small children. They’ll be looking for people like us: anonymous singletons working in key areas, dropping out of sight and traveling in company. Maybe we should ask Sarah if she’s willing to lend us her son. Just while we’re traveling, of course.”

  “I don’t think so. The boy’s a little horror, Bob. They raised them like natives.”

  “That’s because Sarah is a native.”

  “I don’t care. Any civilization where the main symbol of religious veneration is a tool of execution is a bad place to have children.”

  I chuckled—then the laughter froze inside me. “Don’t look round. We’re being tracked.”

  “Uh-huh. I’m not armed. You?”

  “It didn’t seem like a good idea.” If you’re questioned or detained by police or officials, being armed can easily turn a minor problem into a real mess. And if the police or officials have already been absorbed by a hard take-off, nothing short of a backpack nuke and a dead man’s handle will save you. “Behind us, to your left, traffic surveillance camera. It’s swiveling too slowly to be watching the buses.”

  “I wish you hadn’t told me.”

  The pavement was really crowded: it was one of the busiest shopping streets in Scotland, and on a Saturday morning you needed a cattle prod to push your way through the rubbernecking tourists. Lots of foreign kids came to Scotland to learn English. If I was right, soon their brains would be absorbing another high-level language: one so complex that it would blot out their consciousness like a sackful of kittens drowning in a river. Up ahead, more cameras were watching us. All the shops on this road were wired for video, wired and probably networked to a police station somewhere. The complex ebb and flow of pedestrians was still chaotic, though, which was cause for comfort: it meant the ordinary population hadn’t been infected yet.

 
