by Phoenix Ward
I don’t know!
The static almost drowned out the transmission, but it slowly faded away and left them with the two voices.
“‘Under the influence of an I.I.’?” the program’s host echoed.
“Yes,” Stewart continued, speaking like a teacher answering naive questions. “You see, Karl had installed an I.I. into his own brain as part of his research.”
“How did he manage to do that?” The host’s voice was saturated with bewilderment.
“He changed the structural code of the I.I. in question to function with his cerebral computer,” Stewart explained. “This allowed him to ‘share his thoughts,’ so to speak, with the I.I. That is why we called it the mindshare process around the lab.”
“And you’re suggesting that it was this relationship that inspired or motivated Terrace to commit these acts of terror?”
“That is the theory, yes.”
“Having an I.I. in one’s brain is certainly eccentric, but why would that lead Terrace on a path of murder and hate?” the host asked without any emotion.
“Well, you see, the I.I. Karl installed into his mind was unstable, even before being downloaded,” Stewart continued. “We believe the mindshare process motivated Karl to commit his crimes, either through a form of brainwashing or simple persuasion—we’re not certain.”
“How was Karl allowed access to an unstable I.I.?”
“It was part of his work at the lab. The management granted him the I.I.” Stewart’s tone changed to one of haste. “But I don’t believe they knew the potential it had.”
“He had,” the host corrected.
“Of course,” Stewart replied. “He.”
Karl could hear the I.I. scoff within his skull.
“He’s bullshitting,” Maynard said.
Shh! Karl urged.
“And what solution do you have for the problem?” the presenter asked in as frank a tone as he could muster. “What can be done to make sure such murderous rage never happens again as the result of an I.I.?”
“Well,” Stewart started, chuckling a little, “as for all murder, I don’t think that can be helped. But as far as slayings related to an installed intelligence, the answer is simple.”
“Is it?”
“Yes. Have you ever read Asimov?”
“The science-fiction author. Yes, I have.”
“Then you must be familiar with the three laws of robotics, as he called them.”
“Certainly. I have them here,” the host said. There was a slight pause. “First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm. Second Law: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. Third Law: A robot must protect its own existence as long as such—”
“That’s them, correct,” Stewart said. “Now, the Laws are flawed, of course. That was the point of his Robot series. But I don’t think even Asimov realized that they would have valid real-world applications in areas that extend beyond the use of an artificial intelligence. I often wonder what he would think of our current installed intelligences and if he would be happy with the way things are. That doesn’t matter, however. What matters is what’s best for the public. The living public.”
“Bodied humans?” the host wanted to clarify.
“Yes. Those of us with hearts and brains still made of organic material,” Stewart answered. “These are the people I aim to protect, first and foremost.”
“Like I asked, how are you planning to do that?”
Karl could hear his former acquaintance swallow. There was a small part of him that enjoyed the thought of Stewart squirming under a dozen studio lights. Maynard seemed to share the sentiment.
“In short, I want to implement my own set of laws to all future installed intelligences,” Stewart replied without any further delay. “I want to make sure that nothing like the Terrace situation ever happens again.”
“And what laws are you working on?”
“Those are a bit too complicated to recite over the airwaves, but I promise that our coders and engineers are hard at work on flawless laws that will keep all installed intelligences from hurting those of us still alive.”
That’s insane, Karl thought. Such a thing can’t be done.
“If it can,” Maynard started, “it’s the first true form of total slavery. Slavery of the mind, and of the soul. I have a soul just as you do, Karl.”
I know that!
“Then why do people see a line between you and me? Why do they talk about me like I’m a bad virus instead of a human being?”
Karl couldn’t think of an answer for several minutes.
You know why, was all he could manage.
“How would this impact I.I.s already installed?” the interviewer asked.
“Well,” Stewart said, his voice dragging along like it was chained to several weights. “Unfortunately, there would have to be a massive recall.”
“A recall?”
“Yes. We would need to collect all currently existing I.I.s and assess them for compliance with these new laws.”
“All of them?”
“All of them.”
A recall? Karl thought.
“A collection,” Maynard said.
“Don’t you foresee some resistance in the courts? The ACLU, for example?” the radio presenter said.
The program almost faded away, but Karl gave the radio a good whack and the signal returned in full strength.
“We are prepared,” Stewart said. “The lawyers are primed, so to speak.”
“Don’t you see anything wrong with collecting installed intelligences? They are considered human beings, after all.”
“No. For the time being, the first generation of I.I.s has been a defective programming error,” Stewart argued. “We don’t seek to do anything to these ‘people’ other than update their code with modern safety compliance. I think that’s only fair.”
“When will this recall begin?” the host asked.
“It’s already starting,” Stewart replied. “Thanks to a quick win with the D.O.J., we have started to gather the most risk-prone I.I.s. I expect this battle will reach the courts, at which time we will be able to collect all prior-generation intelligences.”
“How many I.I.s do you estimate you’ve already acquired?”
“Around three thousand. Remember, these were all programs assessed by a panel of psychologists to be at high risk of having violent tendencies.”
A tone played and there was a brief pause.
“Well, I’d love to continue this discussion—it’s a truly fascinating subject—but that’s all the time we have for today,” the presenter said. “Mr. Lythe, thank you very much for joining us today.”
“It was my pleasure,” was the reply.
“Don’t forget to tune in next week, when we speak to Dr. Shirley Martinez, a highly regarded psychologist, about why she thinks Karl Terrace wrote his manifesto. Until next time!”
There was only static.
Karl turned the radio off.
Stalward
“I know his voice,” Maynard said.
What? Karl asked.
“Stewart,” the I.I. said. “I thought I recognized it at first, but I wasn’t sure.”
What are you talking about?
“That voice belongs to someone I knew a long time ago,” Maynard answered. “Though, he wasn’t called Stewart Lythe then.”
Karl waited silently for a follow-up. He was growing tired of having to prod for answers.
“I need a moment to think about it all,” Maynard whispered. “I don’t remember everything so well. It’s like stuff is locked behind sticky goo, and I have to pry it loose. It might take me a while. I’m sorry.”
It’s okay, Karl thought.
It had been at least thirteen hours since the psychologist had disconnected Maynard and left him to his own musings. Now that he was alone with his own thoughts, he couldn’t help but notice how frantic the I.I.’s tone had been when they’d last spoken.
Throughout the entire time Karl had known him, Maynard had always seemed cool and hard to rattle. Nothing was unworthy of ridicule in his book, and Karl found it kind of admirable. Now, though, the I.I.’s shaken confidence disturbed the fugitive.
He decided enough time had passed since his last check-in that it was worth powering up his C.C. to awaken Maynard. When he did so, it was like he’d walked in on the I.I. in a conversation with himself, but none of the words were discernible.
Before Karl could manage any thought, the I.I. noticed him and redirected his conversation.
“I know who he is,” Maynard said.
Stewart? Karl asked.
“Yeah. Though he was called Glenn Stalward when I knew him.”
Glenn Stalward? Karl echoed.
“Did I stutter?” Maynard replied in typical fashion. “He helped me with the hardware when I was developing the cerebral computer. We worked together quite often.”
What makes you think that they’re the same person?
There was a noise as if Maynard had managed to turn an eye roll into an audible format.
“I know. Trust me. With my current ‘state,’ I have access to all kinds of voice comparison software and more things than you could think to run on your dinky C.C.”
Dinky? You invented it!
“Some fathers outgrow their children,” Maynard replied. “Either way! I am certain beyond a reasonable doubt that Glenn Stalward and Stewart Lythe are the same person. As a fellow scientist, you must know the gravity of those words.”
Okay, Karl started, but how is that even possible? Wouldn’t Stalward be in his seventies by now? At least?
“Yes, and that’s a fair question,” Maynard countered, “so I looked into it. During the brief snippets of power you’ve granted me in the last day, I managed to snag a few stories about Stalward’s lifestyle.”
His lifestyle? Karl thought, bewildered.
“He was a silicon junkie,” Maynard continued. “He’d always mentioned his interest in plastic surgery when we worked together, but I thought he was just a bit vain. I wasn’t aware of what lengths he’d go to in order to stay young.”
He told you about this?
“Occasionally,” Maynard said. “Most of our communication consisted of passive-aggressive texts about how he was unable to build a proper machine to hold my code. He was not fond of me, that’s for sure. But whenever he wasn’t complaining about his lack of engineering talent, he was airing his grievances about aging. He kept saying things like, ‘A creature’s lifespan is indicative of its impact on the world.’ He truly thought it was amazing that people still died after only about a century.”
Well, lives have grown longer, that’s for sure, Karl started. Medicine is always advancing.
“Yeah, well, he always felt it wasn’t advancing fast enough,” Maynard explained. “Ironic, isn’t it? His greatest criticism of science could be solved by the sharing of human and I.I. minds, yet that’s the one thing he is trying to keep us from doing.”
We still don’t know that Lythe is Stalward, Karl repeated.
Maynard sighed—a soft airless breath.
“I’ve already started a scan through my memory banks,” Maynard said. “It will take a little while without constant internet access, but when it’s done, you’ll see that I was right. I’ll have actual evidence. Until then, though, you might try a little trust.”
I do trust you, Maynard, you know that.
“Then start acting on it,” Maynard requested. “We’re partners now, Karl. Remember that.”
You know, Thompson might be able to help with the scan, Karl mused.
He could feel Maynard’s interest perk up again.
“How so?”
If you risk just a little connection time, we could send him some of your database and he could go over it. He’d have internet the entire time without any risk to us.
“How will we know if he finds anything?” Maynard asked.
He still knows what mailbox to use, Karl thought. That part of the plan hasn’t changed.
“Okay, I can prepare him a compressed folder,” Maynard said.
Karl was pleased that the I.I. seemed to agree with his suggestion without any sarcasm or snark. At first, he thought he might be going a little mad, but it felt like Maynard was actually becoming more friendly to him. It was a gradual progression, but Karl could see it now. They weren’t just partners. They were kin.
Maynard could hear Karl’s thoughts, and offered no rebuttal.
Decision
Karl had spent the last three and a half hours staring down the walkway leading from the cabin’s front door, watching the snow crystals fall indiscriminately into a sea of their brothers. The wind nipped at his cheeks, but that sensation had vanished after only the first twenty minutes. Perhaps it was the hollow howl of the wind wrapping around the walls, or the dull thud of his own heart, but he needed to be outside of the cabin.
A bird too distant to identify was fluttering from branch to branch on a tree just outside the cabin property. Karl couldn’t tell if it was building a nest or pecking around the bark to find some sustenance. Not long ago, this kind of isolation and silence would have inspired a deep sense of anxiety within his heart. Now, however, it was a dull peace—like trying to drift asleep on a wave of painkillers.
There was a growing part of him that found a sort of zen in his isolation. He noticed that very few of his thoughts were of outrage or frustration. Instead, he now mused on those great philosophical questions that he’d never imagined having the time to tackle. Most importantly, he started to think about the nature of the installed intelligence.
When he was a child, I.I.s were first starting to take form. In those days, however, they were only a commodity for the ultra-rich. A way to have something fancier than a normal burial. Now, though, he saw them as true human beings. Souls without bodies, but people nonetheless. It was difficult for him to pinpoint exactly when the transition from glorified tombstone to full-fledged person had taken place. It had to be around the Santson incident. He was a young man then. He had been in awe that such an event had even taken place, but he’d never really calculated the retaliation it would cause.
He figured, as long as the Supreme Court found it lawful, no one else would have an issue with the humanity of an I.I. He couldn’t have been more wrong, however. He saw it as a matter of logic and fact, but it was never portrayed as such. As with the racial debates of his forefathers, he saw no reason to deny I.I.s their identity.
However, the debate still existed. Even here in this serene wilderness, it troubled him. He couldn’t find the reason in it, and without reason, he was without opinion.
He decided it had been long enough. He would take the long walk down the road to post his letter to Thompson.
Clutching the letter he’d found in the mailbox, he pushed the gate open and mentally activated the part of his C.C. that awoke Maynard.
Without delay, the I.I. seemed to sense something important had happened through the patterns in Karl’s thoughts.
“What is it?” Maynard asked.
I went down the road to the mailbox to send Thompson a letter. I wanted to have him check up on the connection between Stewart and this Stalward character so we’d have an answer for sure, but I found he’d already sent us something.
“What?”
Take a look for yourself.
Karl held up the letter—which had been written out with what appeared to be an old mechanical typewriter—so that Maynard could read it through his eyes.
“A package receipt?” Maynard said. “He says its dimensions match all the gear and weapons that the shooters used in the attack on the lab. A big package.”
That’s right.
“He got this from the data we pulled out of the lab?”
Karl physically nodded.
It looks like it was buried deep in an email account someone set up in my name. Look. It sounds exactly like me. But this isn’t one of my addresses.
“That’s who the package receipt is for. But do you see who the sender is? Look carefully, after the shipping details.”
Mr. Stalward, Karl read.
There was a moment of silence between them while the name sank in. He felt like he must have read the name wrong, but after three passes, he knew it was as it appeared.
“You see? I was right,” Maynard said. “Stewart Lythe and Glenn Stalward are the same person.”
We haven’t proven that, Karl said. All we know is that Stewart and Stalward are both involved in the shootings.
“What, you think they’re just two fellows helping each other out? Isn’t my word—my memory—enough for you? You’re not adhering to Occam’s Razor, my friend.”
Well, it’s not exactly Occam’s Razor, is it? Karl argued. I simply can’t accept that an age-defying silicon addict, who once killed you and is now setting me up in order to reprogram I.I.s, is the “simplest solution.”
“The evidence is there. What’s left to argue?”
I don’t know. I just don’t like it.
“Karl, why don’t you trust me?” The I.I. seemed to sigh. “After all we’ve been through, and the way we’re connected, I find it a bit distasteful that you still won’t trust my judgement.”
I’m sorry, Karl thought. You must understand, I’m a scientist. I find it hard to accept any answer unless it’s the only one that remains.
“So you won’t even consider my theory?”
I am, Karl said. In fact, I still posted the letter to Thompson. I want to know for sure if there’s a connection between Stewart and Stalward—a public connection, that is—because I consider it a possibility.
“So then you do believe me.”
At least a little bit.
There was a moment of quiet as Maynard continued reading the receipt Thompson had typed out by hand. Karl had already read it, but he followed along.
“The package came from a place called Fort Leddy, Wyoming. I’ve never heard of it,” Maynard said.
I’m not surprised, Karl started. Fort Leddy was built a little while after your death. It was part of some urbanization-of-the-Midwest push that property developers came up with when I was a kid.