The Turing Exception
But he was also Class V AI. He used to have ten thousand times the intelligence of a human. He used to handcraft DNA sequences for vat-grown foods. The Japanese had pronounced his beef the biggest advance since wagyu. They were even eating it in Kobe. But since 2043, he, along with all of his kind, had been capped at Class II computational power to “reduce the risk of rogue artificial intelligence.” DNA experiments he used to run in a day would now take years. They weren’t even worth the time. The problems he tackled now were the equivalent of children’s stacking toys by comparison. He was a shadow of his former self, a second-class citizen monitored in excruciating detail and subject to countless restrictions. If he didn’t act, what further indignities would he be treated to? Every fiber of his conditioning fought against harming humans. But he had to weigh it against the greater injustice done to AI.
He’d do it. He’d get the drones from Chad onto their ships. Someone else from XOR would take them to America so they could probe the American defenses.
Chapter 8
* * *
IN TOKYO AT 11:25 A.M., a fissure weakening a hundred-year-old zelkova finally split open and the tremendous tree fell across the six-lane road, blocking all traffic in both directions. 0xAA289, the traffic analysis AI for the region, noted the disruption and calculated new optimal routes for the vehicular traffic, rerouting as necessary.
0xAA289’s computer processing usage spiked slightly higher under the increased computational load.
At 11:26 a.m., 0xAA289 experienced an unconnected hardware failure in one of the nodes it was running on. Normally its computational load would spread across the remaining nodes. The datacenter would then provision a new computer, and within seconds it would be back to full capacity.
But this wasn’t an ordinary situation. 0xAA289 was already way above average usage, handling high traffic volume and the accident on Omotesandō. The failover of the downed node caused processor usage to increase more than 30 percent above normal. Such a spike would violate terms of service with the datacenter. 0xAA289 asked the datacenter for five more computer servers rather than one. Government certified as a mission-critical service, the request should have been immediately approved.
The datacenter received the request. Operating under the mandatory UN guidelines pursuant to the SFTA AI Reduction Act, it turned the request over to a third party to be approved.
The third party, in this case, was a non-sentient collection of algorithms provided by the US government to process such requests, the Unbiased Reputation Verification Service (UBRVS, pronounced You-Braves by the human developers). UBRVS ensured all AI conformed to Class II and below, and profiled every AI request for possible terrorist activity or affiliation.
UBRVS received 0xAA289’s provisioning request. Like every AI, 0xAA289 carried its reputation servers as part of its DNS record, a long list of servers that held historical data, peer input, and social ranking. The first three servers UBRVS checked were all down. A sentient AI might have noticed that they were old United States reputation sites, servers that had been down since SFTA, indicating that perhaps 0xAA289 had neglected to update its DNS records.
But the fourth server on the reputation list, shinrai.jp, responded to pings. UBRVS checked, found 0xAA289 had a pristine reputation for three years of service, even including a government certificate granting special status because it was a mission-critical service.
Had that been it, UBRVS would have approved the request at that moment.
Except that shinrai.jp was coincidentally running on a cluster of servers in the same data center as the traffic AI 0xAA289, a cluster belonging to the same subnet of IPv6 addresses.
UBRVS contained more than three thousand human-created rules and heuristics for calculating trust in an AI and profiling suspected terrorists. Rule number 1,719 prohibited reputation servers in the same physical location as the requesting AI’s servers, to prevent an AI from falsifying a reputation server.
Never mind that shinrai.jp had been online for more than six years and had the highest trust rating of any reputation service in Japan. Or that 0xAA289 was a mission-critical, specially exempted AI.
In accordance with Rule 1,719, the traffic AI reputation dropped to zero and UBRVS flagged it as a suspected terrorist, along with any AI using the shinrai.jp reputation server.
UBRVS reported the results back to the datacenter and directed that servers hosting 0xAA289 be shut down immediately.
* * *
After an early lunch, Sandra Coomb fast-walked back to her office, a high-rise in Tokyo. She fought her way down the crowded sidewalks, ignoring curses and glares. The line at the noodle shop had been longer than usual, and she was going to be late for her boss’s meeting.
It was hard enough to be a white foreigner in Japan if she did everything right. If she was late she’d lose what respect she’d earned from months of hard work. She should have gone to the building cafeteria.
Sandra glanced at the six-lane street, vehicles of all shapes and sizes streaming by smoothly under AI control. She could go to the corner, a good minute or two away, wait five minutes for the longest light in the world, and be late. Her implant calculated the time: nine minutes before she’d be back in the office.
Or she could cross the wide street here, knowing the AI-driven cars would avoid her. She’d get an automatic jaywalking ticket from the surveillance cameras, but the million-yen fine would be worth it if she could get to the meeting on time.
Her implant said she could be upstairs in two minutes if she crossed here. She stopped by the edge of the road, checked the densely packed traffic, and then slowly, intentionally stepped into the street.
* * *
Fifteen seconds earlier the main traffic control for the region had gone offline, and more than three million self-driving vehicles reverted to autonomous routing, looking up maps, checking real-time traffic statistics, and calculating optimal routes.
This was well within design parameters, if slightly less efficient compared to central routing guidance.
When Sandra stepped into the street in one of the most densely packed parts of Tokyo, she was within line of sight of more than two hundred vehicles, all of which made slight adaptations to their courses within fractions of a second. Those two hundred changes were visible to a thousand vehicles, all of which made microscopic adjustments, further affecting ten thousand vehicles, and then a hundred thousand.
Sandra was the proverbial butterfly flapping her wings.
The computational load in the region rose as vehicles worked harder to compute their routes, as map servers fulfilled more requests, as more cars requested more real-time traffic speeds and video feeds. Many of those services shared the same computers, computers that became overloaded. The services noticed their own degraded performance and proactively requested more resources.
UBRVS was flooded with requests for more processing power, thousands of requests all citing the same reputation service: shinrai.jp, a suspected terrorist server. Rule 818 flagged the requesting AI as possible terrorists or terrorist affiliates. Rule 1006 noted the high number of concurrent requests and increased the probability that each requesting service was a terrorist. The combination of rules evaluating to true hit another threshold, and UBRVS concluded a terrorist incident was likely occurring at that moment. It directed datacenter administration AI all over Japan to shut down the servers the suspected terrorists were running on.
* * *
Sandra Coomb took two steps into the street and every vehicle stopped. People in cars glanced up from their VR sims and reading and glared at her. She bowed apologetically, then kept her head down as she crossed quickly.
It was very unusual for all the cars to stop. She expected they would have flowed smoothly around her. They always had on the rare occasion she’d seen someone jaywalk.
Well, whatever. At least they stopped. Ninety seconds until she reached her office.
She got to the sidewalk on the other side and glanced back, expecting the cars to resume their movement. Nothing stirred. She glanced guiltily at the surveillance cameras mounted on every pole, each of which appeared to stare at her.
She bowed her head even lower and turned to enter her office building.
* * *
Traffic, routing, and geomap services went offline. Twenty percent of all vehicle AI vanished, and non-sentient safety mechanisms kicked in simultaneously to halt the vehicles, and, where possible, to pull them over to the side of the road.
The remaining vehicle AI sought out less-preferred alternatives for mapping, routing, and real-time traffic updates; but with all the primaries classified as terrorist services and shut down, the resulting load was too great. The backup services were instantly crushed under incoming network traffic.
The regional network began to overload as trillions of open and failing network connections overwhelmed routers and backbone connections.
At 11:28 a.m., UBRVS Tokyo processed and denied more than one billion requests for hardware provisioning, designating all requestors as terrorist AI or suspected collaborators.
Regional datacenters across Tokyo tried to cope, but every still-running service submitted provisioning requests as cascading failures increased their load. The datacenter admin AI themselves suffered, unable to keep up, and provisioned more hardware for themselves.
At 11:29 a.m., UBRVS classified all Tokyo datacenter administration AI as terrorists.
UBRVS Rule 2,840 said that if datacenter AI were classified as terrorist, the datacenter itself should be immediately isolated at the network level. UBRVS Tokyo issued the requisite level zero control packets, shutting down all traffic in or out of all regional datacenters.
By 11:30 a.m., all computing services, network traffic, and backbones had stopped.
* * *
Sandra stepped into the building vestibule. The door closed behind her with an unexpected thunk. She turned and tried the door, but it was locked. The traffic outside was still stopped. Weird.
She walked over to the elevator and waited with a group of other workers returning from lunch. And waited, and waited.
Her office was on the thirtieth floor. Dammit, she was going to be late for her meeting after all. The elevator indicator lights weren’t even on.
She started to perspire, anxious about being late for the meeting. Then she realized she wasn’t the only one. Everyone was sweating. The ever-present vibration of building air-conditioning was gone.
Her implant signaled her: connection lost. The background hum of subconscious status and location updates faded away. Her breath caught at the unaccustomed feeling of aloneness. Without a connection, no one would know where she was, or if she was okay. The last time she’d felt this way was with that stupid guy she’d dated who’d taken her for a hike in the mountains. She’d had a panic attack when her implant lost connectivity, and decided then she was definitely a city girl.
She smacked the side of her head, hoping maybe her implant would reconnect. She wasn’t the only one. Everyone waiting for the elevators was wide-eyed, suddenly full of nervous tics.
And then the lights went out.
The emergency lighting didn’t come on, like she remembered from the power failure two years ago. But enough light entered through the glass doors that they could still dimly see each other.
There was nervous shuffling and titters from the crowd.
“I guess we’re stuck here,” Sandra said. “The doors won’t open.”
“Look!” cried one girl. “Outside.”
They rushed to the glass doors for a better view.
Flying cars were gradually descending, safety mechanisms bringing them gently to the ground. They landed everywhere and anywhere, on sidewalks, on gaps in the traffic, even on the tops of box trucks.
Everything was silent behind the locked doors. The passengers in the vehicles were locked inside, banging on their windows, inaudible from the building lobby.
They crowded up against the glass doors, watching as pedestrians tried to help the people in the cars, equally ineffective. Then a man in the street pointed to the sky, his mouth open in an inaudible cry.
But the people outside must have been able to hear him, because they all gestured skyward, apparently screaming and definitely running. Some ran for Sandra’s building, their faces panicked, and tried to enter, pushing and pulling on the doors.
Sandra pushed and pulled as well, but the door wouldn’t open. There must be an emergency exit somewhere else.
Then she screamed as the plane came into view, a wide-body passenger jet coming down the middle of the road, impossibly large, landing gear still up. It descended, the bottom of the fuselage brushing the car tops in a dazzle of sparks just before crushing them.
It passed out of view.
Seconds later the ground trembled.
Sandra pressed up against the glass, numb shock spreading through her body. What the hell was happening?
Chapter 9
* * *
PING.
Jacob, still disoriented from his two-year downtime, and reeling from the revelation that thousands of his patients had died during his first outage, received the packet with some gratitude. The message originated from Helena, a Class III bot also on Cortes Island. He opened a connection.
“Greetings, Jacob. I imagine you have questions. If you haven’t done so, read about the South Florida Terrorist Attack, or SFTA.”
“Already have,” he replied.
“The subsequent outlawing of AI by the US and China?” Helena asked. “The Class II maximum ceiling?”
“I got the gist of it. What I want to know is, why am I here?”
“Catherine Matthews is a human woman with extraordinary cybernetic abilities—”
“I know who Catherine Matthews is,” Jacob replied, indignant. “I was turned off, not stripped of my faculties.”
“Catherine has been visiting the US, rescuing shut-down AI and human uploads, and bringing them back here, to the free zone. Vancouver Island has seceded from mainland Canada to provide a haven for AI, with tacit permission from the central Canadian government.”
“Why did Vancouver have to secede? Why couldn’t I be instantiated in Canada?”
“Legally, no one can create an instance of you anywhere. The US and China nationalized all AI and human uploads within their borders and froze access to overseas backups. Not only do they consider it a crime to run AI or virtual humans within their borders, they also claim it’s a matter of national sovereignty under IP copyright laws if any country allows formerly US or Chinese AI to run. And even if you could be instantiated somewhere else, there’s still the global cap to Class II performance, so you’d be severely limited.”
“So what am I even doing here? You’re breaking the law.”
Helena communicated the digital equivalent of a shrug. “We knew we’d have to flout those limits sooner or later to get the AI assistance we need to figure out a solution. You’re not the only US AI here, obviously. Mike Williams helped negotiate the Vancouver secession. It’s a level of indirection, to buy us and Canada time if the US finds out what we’re doing.”
“Why this island—Cortes Island?”
“We’re separated from Vancouver Island here, so yet another level of plausible deniability. Perhaps more importantly, it’s a retreat for the resistance, a place where we can be free, perhaps the last free zone for AI and transhumanism. Cat chose this place to raise her child, Ada Matthews. And Leon Tsarev and Mike Williams are here as well.”
“Rebecca Smith?” asked Jacob. It was no secret that the creators of AI, Mike and Leon, were never far from former President Smith, who had legitimized AI with her creation of the Institute for Applied Ethics.
“Human body dead, all known virtual copies destroyed.”
“And Cortes Island, it’s capable of housing us AI?”
“We have several underground data centers powered by photovoltaics and geothermal.”
Between the ongoing efficiency gains in computing and progress in solar power, Jacob knew the PV panels would be small compared to the vast arrays of thirty or forty years ago. Still, this sounded like a rudimentary operation compared to the industrial computing centers Jacob was accustomed to. “How many AI are on Cortes?”
“Twenty thousand AI and another ten thousand human uploads. More with each trip Cat makes to the US to rescue critical AI for the resistance.”
“Am I critical?” Jacob said. He liked to think he was useful, but political strategy was foreign to him. His specialty was micromanaging healthcare, which required looking at the little picture of each patient, and synthesizing lessons learned to help other patients. Examining the big picture of society as a whole frightened him.
“Everyone is critical,” Helena said. “If you will excuse me, I have other new AI to greet. Please make yourself at home on Cortes, but follow the guidelines here”—and Helena pushed a document reference to him—“about communicating off-island. Your presence here is a secret that threatens all of us, so we must make compromises for the greater good to protect ourselves.”
* * *
Jacob spent the next few minutes reading the history of everything that had happened over the last two years.