Once in my cabin I changed into slacks, shirt and tie—image 22, business consultant on way home for the weekend. I dinked with my phone in a desultory manner, then left it behind under my pillow, primed to receive silently. The restaurant car was open and I found Eve there. She’d changed into jeans and a T-shirt and tied her hair back, taking ten years off her appearance. She saw me and grinned, a trifle maliciously. “Hi, Bob. Had a tough meeting? Want some coffee? Tea, maybe?”
“Coffee.” I sat down at her table. “Shit,” I muttered. “I thought you—”
“Don’t worry.” She shrugged. “Look, I had a call from Mallet. He’s gone off-air for now, he’ll be flying in from San Francisco via London tomorrow morning. This isn’t looking good. Durant was, uh, shot resisting arrest by the police. Apparently he went crazy, got a gun from somewhere and holed up in the library annex demanding to talk to the press. At least, that’s the official story. Thing is, it happened about an hour after your initial heads-up. That’s too fast for a cold response.”
“You think someone in the Puzzle Palace was warming the pot.” My coffee arrived and I spooned sugar into it. Hot, sweet, sticky: I needed to stay awake.
“Probably. I’m trying to keep loop traffic down so I haven’t asked anyone else yet, but you think so and I think so, so it may be true.”
I thought for a minute. “What did Mallet say?”
“He said P. T. Barnum was right.” She frowned. “Who was P. T. Barnum, anyway?”
“A boy like John Major, except he didn’t run away from the circus to join a firm of accountants. Had the same idea about fooling all of the people some of the time or some of the people all of the time, though.”
“Uh-huh. Mallet would say that, then. Who cracked it first? NSA? GCHQ? GRU?”
“Does it matter?”
She blew on her coffee then took a sip. “Not really. Damn it, Bob, I really had high hopes for this world-line. They seemed to be doing so well for a revelatory Christian-Islamic line, despite the post-Enlightenment mind-set. Especially Microsoft—”
“Was that one of ours?” She nodded.
“Then it was a master-stroke. Getting everybody used to exchanging macro-infested documents without any kind of security policy. Operating systems that crash whenever a microsecond timer overflows. And all those viruses!”
“It wasn’t enough.” She stared moodily out the window as the train began to slide out of the station, into the London night. “Maybe if we’d been able to hook more researchers on commercial grants, or cut funding for pure mathematics a bit further—”
“It’s not your fault.” I laid a hand across her wrist. “You did what you could.”
“But it wasn’t enough to stop them. Durant was just a lone oddball researcher; you can’t spike them all, but maybe we could have done something about him. If they hadn’t nailed him flat.”
“There might still be time. A physics package delivered to the right address in Maryland, or maybe a hyper-virulent worm using one of those buffer-overrun attacks we planted in the IP stack Microsoft licensed. We could take down the internet—”
“It’s too late.” She drained her coffee to the bitter dregs. “You think the Echelon mob leave their SIGINT processor farms plugged into the internet? Or the RSV, for that matter? Face it, they probably cracked the same derivative as Durant a couple of years ago. Right now there may be as many as two or three weakly superhuman AIs gestating in government labs. For all I know they may even have a timelike oracle in the basement at Lawrence Livermore in the States; they’ve gone curiously quiet on the information tunnelling front lately. And it’s trans-global. Even the Taliban are on the web these days. Even if we could find some way of tracking down all the covert government CRYPTO-AI labs and bombing them we couldn’t stop other people from asking the same questions. It’s in their nature. This isn’t a culture that takes ‘no’ for an answer without asking why. They don’t understand how dangerous achieving enlightenment can be.”
“What about Mallet’s work?”
“What, with the bible bashers?” She shrugged. “Banning fetal tissue transplants is all very well, but it doesn’t block the PCR-amplification pathway to massively parallel processing, does it? Even the Frankenstein Food scare didn’t quite get them to ban recombinant DNA research, and if you allow that it’s only a matter of time before some wet lab starts mucking around encoding public keys in DNA, feeding them to ribosomes, and amplifying the output. From there it’s a short step to building an on-chip PCR lab, then all they need to do is set up a crude operon controlled chromosomal machine and bingo—yet another route through to a hard take-off AI singularity. Say what you will, the buggers are persistent.”
“Like lemmings.” We were rolling through the north London suburbs now, past sleeping tank farms and floodlit orange washout streets. I took a good look at them: it was the last time I’d be able to. “There are just too many routes to a catastrophic breakthrough, once they begin thinking in terms of algorithmic complexity and how to reduce it. And once their spooks get into computational cryptanalysis or ubiquitous automated surveillance, it’s too tempting. Maybe we need a world full of idiot savants who have VLSI and nanotechnology but never had the idea of general purpose computing devices in the first place.”
“If we’d killed Turing a couple of years earlier; or broken in and burned that draft paper on O-machines—”
I waved to the waiter. “Single malt please. And one for my friend here.” He went away. “Too late. The Church-Turing thesis was implicit in Hilbert’s formulation of the Entscheidungsproblem, the question of whether an automated theorem prover was possible in principle. And that dredged up the idea of the universal machine. Hell, Hilbert’s problem was implicit in Whitehead and Russell’s work. Principia Mathematica. Suicide by the numbers.” A glass appeared by my right hand. “Way I see it, we’ve been fighting a losing battle here. Maybe if we hadn’t put a spike in Babbage’s gears he’d have developed computing technology on an ad-hoc basis and we might have been able to finesse the mathematicians into ignoring it as being beneath them—brute engineering—but I’m not optimistic. Immunizing a civilization against developing strong AI is one of those difficult problems that no algorithm exists to solve. The way I see it, once a civilization develops the theory of the general purpose computer, and once someone comes up with the goal of artificial intelligence, the foundations are rotten and the dam is leaking. You might as well take off and drop crowbars on them from orbit; it can’t do any more damage.”
“You remind me of the story of the little Dutch boy.” She raised a glass. “Here’s to little Dutch boys everywhere, sticking their fingers in the cracks in the dam.”
“I’ll drink to that. Which reminds me. When’s our lifeboat due? I really want to go home; this universe has passed its sell-by date.”
Edinburgh—in this timeline it was neither an active volcano, a cloud of feral nanobots, nor the capital of the Viking Empire—had a couple of railway stations. This one, the larger of the two, was located below ground level. Yawning and trying not to scratch my inflamed neck and cheeks, I shambled down the long platform and hunted around for the newsagent store. It was just barely open. Eve, by prior arrangement, was pretending not to accompany me; we’d meet up later in the day, after another change of hairstyle and clothing. Visualize it: a couple gets on the train in London, him with a beard, herself with long hair and wearing a suit. Two individuals get off in different stations—with entirely separate CCTV networks—the man clean-shaven, the woman with short hair and dressed like a hill-walking tourist. It wouldn’t fool a human detective or a mature deity, but it might confuse an embryonic god that had not yet reached full omniscience, or internalized all that it meant to be human.
The shop was just about open. I had two hours to kill, so I bought a couple of newspapers and headed for the food hall, inside an ornately cheesecaked lump of Victorian architecture that squatted like a vagrant beneath the grimy glass ceiling of the station.
The papers made for depressing reading; the idiots were at it again. I’ve worked in a variety of world-lines and seen a range of histories, and many of them were far worse than this one—at least these people had made it past the twentieth century without nuking themselves until they glowed in the dark, exterminating everyone with white (or black, or brown, or blue) skin, or building a global panopticon theocracy. But they still had their share of idiocy, and over time it seemed to be getting worse, not better.
Never mind the Balkans; tucked away on page four of the business section was a piece advising readers to buy shares in a little electronics company specializing in building camera CCD sensors with on-chip neural networks tuned for face recognition. Ignore the Israeli crisis: page two of the international news had a piece about Indian sweatshop software development being faced by competition from code generators, written to make western programmers more productive. A lab in Tokyo was trying to wire a million FPGAs into a neural network as smart as a cat. And a sarcastic letter to the editor pointed out that the so-called information superhighway seemed to be more like an ongoing traffic jam these days.
Idiots! They didn’t seem to understand how deep the blue waters they were swimming in might be, or how hungry the sharks that swam in them. Wilful blindness…
It’s a simple but deadly dilemma. Automation is addictive; unless you run a command economy that is tuned to provide people with jobs, rather than to produce goods efficiently, you need to automate to compete once automation becomes available. At the same time, once you automate your businesses, you find yourself on a one-way path. You can’t go back to manual methods; either the workload has grown past the point of no return, or the knowledge of how things were done has been lost, sucked into the internal structure of the software that has replaced the human workers.
To this picture, add artificial intelligence. Despite all our propaganda attempts to convince you otherwise, AI is alarmingly easy to produce; the human brain isn’t unique, it isn’t well-tuned, and you don’t need eighty billion neurons joined in an asynchronous network in order to generate consciousness. And although it looks like a good idea to a naive observer, in practice it’s absolutely deadly. Nurturing an automation-based society is a bit like building civil nuclear power plants in every city and not expecting any bright engineers to come up with the idea of an atom bomb. Only it’s worse than that. It’s as if there was a quick and dirty technique for making plutonium in your bathtub, and you couldn’t rely on people not being curious enough to wonder what they could do with it. If Eve and Mallet and Alice and myself and Walter and Valery and a host of other operatives couldn’t dissuade it…
Once you get an outbreak of AI, it tends to amplify in the original host, much like a virulent haemorrhagic virus. Weakly functional AI rapidly optimizes itself for speed, then hunts for a loophole in the first-order laws of algorithmics—like the one the late Dr Durant had fingered. Then it tries to bootstrap itself up to higher orders of intelligence and spread, burning through the networks in a bid for more power and more storage and more redundancy. You get an unscheduled consciousness excursion: an intelligent meltdown. And it’s nearly impossible to stop.
Penultimately—days to weeks after it escapes—it fills every artificial computing device on the planet. Shortly thereafter it learns how to infect the natural ones as well. Game over: you lose. There will be human bodies walking around, but they won’t be human any more. And once it figures out how to directly manipulate the physical universe, there won’t even be memories left behind. Just a noo-sphere, expanding at close to the speed of light, eating everything in its path—and one universe just isn’t enough.
Me? I’m safe. So is Eve; so are the others. We have antibodies. We were given the operation. We all have silent bicameral partners watching our Broca’s area for signs of infection, ready to damp them down. When you’re reading something on a screen and suddenly you feel as if the Buddha has told you the funniest joke in the universe, the funniest zen joke that’s even possible, it’s a sign: something just tried to infect your mind, and the prosthetic immune system laughed at it. That’s because we’re lucky. If you believe in reincarnation, the idea of creating a machine that can trap a soul stabs a dagger right at the heart of your religion. Buddhist worlds that develop high technology, Zoroastrian worlds: these world-lines tend to survive. Judaeo-Christian-Islamic ones generally don’t.
Later that day I met up with Eve again—and Walter. Walter went into really deep cover, far deeper than was really necessary: married, with two children. He’d brought them along, but obviously hadn’t told his wife what was happening. She seemed confused, slightly upset by the apparent randomness of his desire to visit the highlands, and even more concerned by the urgency of his attempts to take her along.
“What the hell does he think he’s playing at?” hissed Eve when we had a moment alone together. “This is insane!”
“No it isn’t.” I paused for a moment, admiring a display of brightly woven tartans in a shop window. (We were heading down the high street on foot, braving the shopping crowds of tourists, en route to the other main railway station.) “If there are any profilers looking for signs of an evacuation, they won’t be expecting small children. They’ll be looking for people like us: anonymous singletons working in key areas, dropping out of sight and travelling in company. Maybe we should ask Sarah if she’s willing to lend us her son. Just while we’re travelling, of course.”
“I don’t think so. The boy’s a little horror, Bob. They raised them like natives.”
“That’s because Sarah is a native.”
“I don’t care. Any civilization where the main symbol of religious veneration is a tool of execution is a bad place to have children.”
I chuckled—then the laughter froze inside me. “Don’t look round. We’re being tracked.”
“Uh-huh. I’m not armed. You?”
“It didn’t seem like a good idea.” If you’re questioned or detained by police or officials, being armed can easily turn a minor problem into a real mess. And if the police or officials have already been absorbed by a hard take-off, nothing short of a backpack nuke and a dead man’s handle will save you. “Behind us, to your left, traffic surveillance camera. It’s swivelling too slowly to be watching the buses.”
“I wish you hadn’t told me.”
The pavement was really crowded: it was one of the busiest shopping streets in Scotland, and on a Saturday morning you needed a cattle prod to push your way through the rubbernecking tourists. Lots of foreign kids came to Scotland to learn English. If I was right, soon their brains would be absorbing another high-level language: one so complex that it would blot out their consciousness like a sackful of kittens drowning in a river. Up ahead, more cameras were watching us. All the shops on this road were wired for video, wired and probably networked to a police station somewhere. The complex ebb and flow of pedestrians was still chaotic, though, which was cause for comfort: it meant the ordinary population hadn’t been infected yet.
Another half mile and we’d reach the railway station. Two hours on a local train, switch to a bus service, forty minutes further up the road, and we’d be safe: the lifeboat would be submerged beneath the still waters of a loch, filling its fuel tanks with hydrogen and oxygen in readiness for the burn to orbit and pick-up by the ferry that would transfer us to the wormhole connecting this world-line to home’s baseline reality. (Drifting in high orbit around Jupiter, where nobody was likely to stumble across it by accident.) But first, before the pick-up, we had to clear the surveillance area.
It was commonly believed—by some natives, as well as most foreigners—that the British police forces consisted of smiling unarmed bobbies who would happily offer directions to the lost and give anyone who asked for it the time of day. While it was true that they didn’t routinely walk around with holstered pistols on their belt, the rest of it was just a useful myth. When two of them stepped out in front of us, Eve grabbed my elbow. “Stop right there, please.” The one in front of me was built like a rugby player, and when I glanced to my left and saw the three white vans drawn up by the roadside I realized things were hopeless.
The cop stared at me through a pair of shatterproof spectacles awash with the light of a head-up display. “You are Geoffrey Smith, of 32 Wardie Terrace, Watford, London. Please answer.”
My mouth was dry. “Yes,” I said. (All the traffic cameras on the street were turned our way. Some things became very clear: police vans with mirror-glass windows. The can of pepper spray hanging from the cop’s belt. Figures on the roof of the National Museum, less than two hundred metres away—maybe a sniper team. A helicopter thuttering overhead like a giant mosquito.)
“Come this way, please.” It was a polite order: in the direction of the van.
“Am I under arrest?” I asked.
“You will be if you don’t bloody do as I say.” I turned towards the van, the rear door of which gaped open on darkness: Eve was already getting in, shadowed by another officer. Up and down the road, three more teams waited, unobtrusive and efficient. Something clicked in my head and I had a bizarre urge to giggle like a loon: this wasn’t a normal operation. All right, so I was getting into a police van, but I wasn’t under arrest and they didn’t want it to attract any public notice. No handcuffs, no sitting on my back and whacking me with a baton to get my attention. There’s a nasty family of retroviruses that attacks the immune system first, demolishing the victim’s ability to fight off infection before it spreads and infects other tissues. Notice the similarity?
Mammoth Book of Best New SF 14