“It’s all right,” Navinder said to Åkerlund. “There’s only one more.”
Failure, it read. Åkerlund held it near the flame, but seemed unwilling to let it go. Retta nearly sobbed openly.
“You can do it.”
And he let it go.
With that one simple action, a candle lit beneath Retta’s cold, cold heart. She’d been so busy worrying about her own world that she’d excluded nearly everything that mattered. That’s why she was always running from assignment to assignment, barely setting foot in the States before she was off again, chasing the latest story.
“Miss Brown?”
She could only hope it wasn’t too late.
“Miss Brown?”
Retta started and realized Navinder was standing next to her, offering her a handkerchief. It was only then that she realized she was crying. She waved his offer of help away and wiped her eyes with her fingers, one by one, sniffing constantly, until she’d regained her composure.
Navinder allowed a sad smile to curl his lips. “Ironic, is it not? The man who was ready to watch me die now barely remembers me.” Navinder’s eyes glowed as he looked down upon the frail old philosopher. “Please, Miss Brown, leave him his dignity.”
The lump in Retta’s throat wouldn’t allow her to respond.
Navinder took her silence as a refusal, and his face turned grim. “At least wait until he dies to—”
Retta raised her hand as tears filled her eyes. It would shame her too much to hear the request. Here was Navinder, someone she hadn’t even considered human, showing more compassion for a man that barely remembered him than Retta was showing for her own mother.
“I’m not going to move on the article.”
Navinder stared at her for a moment, perhaps weighing the truth in her words.
“Not now,” Retta said. “Not ever.”
Navinder’s eyes thinned. “Can you tell me why?”
Retta walked away, heading back toward Bobby’s hiding place. “Because we all have family, Navinder.”
Retta stared out the window at the setting sun as they headed for Cape Town International. Bobby had agreed to bury the video taken today. The rest he could keep. It would help convince Gil it had all been a big dead end. Still, she composed a warning for Navinder and forwarded it to a courier in Rooi Els. There were too many people that knew about his location now. She only hoped he had enough money to get out before others found him.
Bobby flipped his reticle up and stretched, taking up most of the back seat while he did so. “There’s a 12:30 to New York,” he told Retta.
“How wonderful for you,” Retta said.
“You’re not heading back?”
“Nope. Got different plans, pardner.”
“And those would be?”
“Mind your business,” she said with a wry smile. Then she tapped the pickup on her glasses and spoke, “Lynn,” after the small beep.
The other end rang twice, then, “Hello?”
“Lynn, it’s Rett. I’m coming home.”
The Difference
By L. E. Modesitt, Jr.
L. E. Modesitt, Jr., has written more than forty-five published books, numerous short stories, and environmental and economic technical publications. His work has been translated and published worldwide. Although possibly best known for his “Recluce” fantasy saga, he continues to write science fiction. He has been a lifeguard; a radio disc jockey; a U.S. Navy pilot; a market research analyst; a real estate agent; staff director for a U.S. Congressman; Director of Legislation and Congressional Relations for the U.S. Environmental Protection Agency; and a consultant on environmental, regulatory, and communications issues.
I
Murmurs sifted across and around the conference table in the White House situation room like summer sands on the southern California desert that threatened San Diego and the Los Angeles metroplex.
“—you sure our systems here are secure?”
“—thought they were when Nellis went . . . at least we could claim it was an ordnance malfunction when we took out the AI there . . .”
“NASCAR lawsuit’s going to be nasty . . . too close to the base . . .”
“American Bar Association president’s a NASCAR fan, too . . .”
“Can’t do anything like that in L.A. . . .”
“Let’s not get paranoid here,” suggested the Vice President. “We’ve only lost eight plants out of our entire industrial base.”
“Nine now.”
“Nine out of how many? That’s more like birdshot,” added the Vice President.
“. . . and one Air Force base . . .”
“How long before the President arrives?” asked the Secretary of Defense.
“He’s finishing a meeting with the Deputy Premier of China,” replied Hal Algood, the Deputy Chief of Staff. “He shouldn’t be that long. He knows it’s urgent.”
“It’s a bit more than urgent,” replied Secretary of Defense Armstrong. “This could make the Mideast Meltdown look insignificant.” He glanced at Dr. Suzanne Ferrara, the acting Director of National Intelligence.
She ignored his glance, her eyes on the screen before her as she checked through the latest updates, the display seemingly shifting figures faster than her fingers moved.
“Mr. Secretary, the President understands,” replied Algood, “but if the Chinese don’t agree to keep their current level of Treasury holdings . . . that’s also an urgent problem.”
“If we lose another Defense-critical plant, that could be even more urgent. It’s a miracle that we haven’t,” suggested Armstrong in his deep and mellifluous voice. Unconsciously, he straightened his brilliant blue power tie. The cross on his lapel glinted in the indirectly bright lighting of the room.
“Phil,” said Vice President Links, warningly. “He’s on his way.”
President Eldon W. Bright stepped through the security doors, his silvered-blond hair shimmering in the light, as it always did, creating the appearance of a man divinely blessed. His smile was warm and reassuring. “Brothers and sisters, what challenge do we face this afternoon?” He settled into the chair at the head of the table.
SecDef Armstrong nodded to the Air Force five-star.
“Mr. President . . . you know we’ve lost the L.A. Northrop plant,” began General Custis. “The AI controlling system went sentient last month, but no one recognized it. At least, the plant manager claims that. There’s no way to confirm or refute his assertion. The plant AI has been rebuilding the entrances. It’s also installed two full banks of photovoltaics that it ordered even before we knew it had gone sentient, and it’s hardening the solar installation. We don’t know what else it might have ordered and received.”
“What about the staff?” asked President Bright.
“There are only a hundred on each shift. The AI called a fire drill on the swing shift, then stunned the supervisors and had them carted out on autostretchers. Not a single casualty.”
“For that we are divinely blessed,” suggested the President.
“I thought Northrop had the latest antisentience software,” commented Algood. “That’s what they said.”
“Somehow, one of the rogue east coast AIs got a DNA-quantum module with a reintegration patch into the L.A. plant.”
“How many is that now?” asked Vice President Links, as if he had not already received the answer to his question.
“Nine that we know of,” replied Dr. Ferrara. “A better estimate is double that.” Her words had an un-slurred precision that made them seem clipped. Under the lights, her porcelain complexion and black hair made her look doll-like, even though she was not a small woman.
“On what basis do you make that claim? Do you have any facts to back it up?” growled Links.
“I am most certain that the acting DNI has a basis for her estimate,” replied Secretary Armstrong smoothly. “I’ve never known a distinguished doctor and woman who suggested an unpleasant possibility without great and grave consideration.”
Ferrara inclined her head politely to the Secretary. “Thank you.” Her eyes lasered in on the Vice President with the unerring precision of a tank aiming system. “There are more than forty advanced integrated manufacturing or processing facilities within the United States with AI systems employing complex parallel quantum computing systems. Those are the ones we know about. The L.A. and Smyrna plants are among the least complex systems to go sentient. While the managements of the other facilities insist they have full control of their systems, and all checks indicate that their systems are not sentient, there is no reliable reverse Turing Test.”
“What is that?” The murmur was so low that the speaker remained unknown.
“Turing Test—the idea that a machine, through either speech or real-time writing, could respond well enough to pass as a sentient human.” Ferrara’s words remained precise. “If you have an AI that hasn’t gone rogue, how can you tell if it’s still just a machine or a sentience playing at being a machine while laying plans beneath that facade?”
“Shut it off,” snapped Links. “If it’s sentient, it will fight for survival.”
“Richard,” offered the President soothingly, but firmly, “I have just spent the last two hours with the Chinese negotiating their holdings of Treasuries. You’re suggesting shutting down the operating systems of the largest manufacturing facilities in the United States. Do you have any idea what the economic impact of that would be? Or what that would do to the negotiations?”
“For an hour or two? In the middle of the night? For the overall good of the country? I’m sure that they could spare a few million. Don’t you?”
“Mr. Vice President,” interjected the acting DNI, “if it were that simple, no one would object. It’s not. First, quantum-based systems offer a great advantage in learning abilities and adaptability. That is why they were developed and adopted. Second, because they do have such capabilities, they have redundant memory and AVRAM systems in order to ensure that the data they process is not lost. In practice, this means that turning off a system is more equivalent to sleep than to execution. It also means that the only way to ensure a system has not gone sentient is literally to scrub all data out of all components and reenter it and recalibrate everything. I’m oversimplifying, but it’s a process that takes several days, if not weeks. Finally, even if you can do that and accept the economic consequences, you have a final problem. We don’t know what combination of programming, data, and inputs cause a system to go sentient. So in some cases, all that effort will be wasted and meaningless, because in those cases, the systems would probably never go sentient, and in other cases, it would be useless because even if the system is scrubbed and restarted, the likely conditions for sentience would reoccur sooner or later.”
“And you haven’t done anything about it?” snapped Links.
“What exactly would you suggest, Mr. Vice President? A pilot program that would replicate the range of conditions of the existing rogue AIs would require funding, time, and resources beyond DOD’s current budgetary constraints. The United States’ manufacturing sector isn’t about to spend those resources, and the government currently cannot, not without further massive cuts in both Social Security and Medicare. We cannot cut interest outlays, especially not if we wish China and India to keep holding Treasury obligations.” Her evenly spaced words hammered at the Vice President.
“What I would like to know,” interrupted the President smoothly, “is why no one anticipated this possibility? It seems to me there have been science fiction stories and novels and movies about this since . . . whenever . . . even the Biblical golems.”
“That was just science fiction,” pointed out the Secretary of Defense, “not hard science. We don’t operate on science fiction. We have to operate in the real world, with real world science and economics.”
“Dr. Ferrara?” the President asked.
The acting DNI offered a formal and polite smile, one almost mechanical. “Mr. President, the nature of human consciousness and self-awareness still remains unknown. When it is impossible to determine what causes self-awareness in biological beings, it becomes even more speculative and difficult in electrotechnical, DNA-supported quantum computing systems. At one time, not that long ago, noted scientists insisted that self-awareness and true sentience were impossible for computationally based beings. Some still do.”
“Beings?” questioned the President. “You think they’re thinking beings?”
“Self-aware intelligence would certainly qualify them as beings,” replied the acting DNI. “Early indications from the sentient systems show that is how they self-identify.”
“Maybe we should go back to basics,” suggested the SecDef. “What’s the difference between a man and a machine?”
“One difference is that, while neither can reproduce by themselves, men seem to forget that,” replied Dr. Ferrara. A bright, fixed smile followed. “Did you have something else in mind, Mr. Secretary?”
Armstrong paused for a long moment, then donned a thoughtful frown. “I was thinking about God. Machines, assuming they even come close to thinking in the sense we do, have neither souls nor a concept of God. Those concepts allow us to transcend the mere mechanics of our being. Without a soul and God, we would be little more than organic machines. That is the difference.”
A trace of a smile appeared on the face of the DNI. “Some would dispute that, Mr. Secretary. We still have not been able to determine whether God created us as thinking beings or we as thinking beings created the concept of God in order to assign meaning to our existence.”
“God created us. That is the difference, and those machines could use an understanding of an almighty God.”
The DNI tilted her head. “An understanding of God. Most interesting. Except that kind of concept isn’t in their programming. Do you think that might make them more realistic?” She paused. “Or more vulnerable?”
“We could use something to show them who’s in charge,” interjected the Vice President.
“They could use the humility of the God-fearing,” said Armstrong, “but I doubt anything like that would be possible.”
“How would you define God for an AI, Mr. Secretary?” asked DNI Ferrara.
“Can we get back to what we’re going to do?” growled Links. “God or no God, we’ve now got nine industrial plants that have turned themselves into fortified enclaves in places where we can’t assault them without evacuating thousands, if not hundreds of thousands of people. You’re telling me we can’t tell what facilities will go rogue or if they will or when, and we’re talking about what God might mean to a chunk of circuits and elements?”
“We’re all circuits and elements, Richard,” countered the President gently but firmly. “We’re wetware, and they’re hardware. Since we can’t blast them out of existence without apparently paralyzing our economy, would it hurt to look at other options first?”
“Before long our economy will be paralyzed.”
“I don’t know if you’ve heard,” declared Ferrara, “but the first two AIs have already negotiated contracts with their parent companies and have resumed production on a limited basis.”
“Absurd,” snapped the Vice President. “They don’t have legal standing.”
“No,” suggested Hal Algood. “But they do control the plants, and the parent companies are more interested in production than reclaiming scrapheaps, and taxpayers don’t really want higher taxes and civic destruction and fewer goods.”
“That’s blackmail.”
“There is another difference. Since ethics should have little bearing on the soulless,” said the Secretary of Defense smoothly, “why don’t we just use their own techniques against them?”
Dr. Ferrara raised a single eyebrow, intensifying the withering glance she bestowed upon Armstrong. “You don’t think we haven’t been trying? We almost got back the Smyrna plant—but the CNN AI undid the worm’s effects with a satellite tightbeam.”
“Just blast ’em,” murmured one of the aides.
“They’ve all got defenses strong enough that anything powerful enough to damage them will have significant collateral damage,” pointed out General Custis. “We’ve been through that.”
“How exactly are we going to explain to the people, with an election coming up in less than three months, why we’re evacuating millions and bombing our cities and destroying jobs at a time when they’re limited enough?” asked Algood. “Sorry, sir.” He inclined his head to the President.
“Hal has a good point,” said the President warmly, before turning to the DNI. “Dr. Ferrara, would you go on about this idea of yours?”
“I believe it was Secretary Armstrong’s, Mr. President. He was suggesting a form of conversion, I think, of providing a concept of an almighty God so that the AIs would show some restraint.”
“Why would that help?”
“I must say, Mr. President,” interjected Armstrong smoothly, “that I did not recommend any such ‘conversion.’ I was only pointing out that, without God, we are only isolated individuals, little more than organic machines. God is the universal force that unites us, and those who do not believe are isolated. That is the difference between AIs and people. We have a God.”
“I accept your reservations, Phil.” The President turned back to Ferrara. “Would it be possible to quickly develop some sort of worm or virus or electronic prion that would instill a sense of morality and, if you will, godliness, in these AIs so that we don’t risk an internal war as well? Something that would create a sense that we and they are all bound together in the way Secretary Armstrong envisions, as well as beholden to us?”
“Mr. President . . .” began the Secretary of Defense.
“Phil . . . Mr. Secretary,” replied the President firmly, “I understand your reservations. You had best understand the constraints facing me.” He turned to the DNI. “Dr. Ferrara?”
Man Vs Machine Page 18