Firebird


by Jack McDevitt


  Jennifer glanced at the sphere and smiled. “Did you hear Professor Kolchevski, Alex?” she asked.

  Alex laughed. “Oh, yes. Couldn’t miss him.”

  “What’s your response?”

  Alex made no effort to hide his discomfort. “Let me say first that I’m aware that lives have been lost, and that I bear some of the responsibility. I’m sorry. It’s not what I intended, and I wish it had not happened. But I’m not sure what else I could have done.

  “Professor Kolchevski, and a lot of other people, can’t get rid of an old idea. He thinks AIs are nothing more than pieces of machinery. Like an old lamp you can toss into the trash. I’m sorry you didn’t leave him on so we could have discussed this together. Although I suspect he’s made up his mind, and nothing will ever convince him that there’s even a possibility that he might be wrong. And that’s the real issue here: not that the AIs are alive. But that they might be. Once we recognize that, we need to rethink how we do things.”

  Jennifer scratched something onto a pad, and looked up. “We’ve learned to be careful when the subject is as sensitive as this one is, Alex. Feelings are running pretty high on both sides.”

  “Isn’t that what show business is all about?”

  The smile went away. “This isn’t show business. We’re trying to get at the truth here.”

  “Okay. The question is whether an AI might be a sentient being. If that possibility exists, everything changes. The responsibility here lies with those who deny that they are able to think and feel emotions to prove that they cannot. And we both know they can’t do it.”

  “We also both know, Alex, how difficult it is to prove a negative.”

  A sudden commotion caught their attention, and Kolchevski strode through a couple of people trying to look as if they wanted to restrain him. He walked onto the set, and stared down at Alex. “I was listening on my way out,” he said. “But I’m here, Mr. Benedict, if you want to talk to me.”

  As I’ve suggested, Jennifer in the Morning was known for this sort of setup. Alex looked placidly out across my bedroom. “Good to see you again, Professor. Why don’t you join us?”

  “I’d be delighted.” He gazed down at the red sphere. “I can’t bring myself to believe even you actually think those”—he seemed to be having trouble finding words, and if I’ve ever seen pure venom in someone’s eyes, that was the moment—“that even you actually believe—”

  Jennifer broke in: “One moment, Professor. Please. This is Alex’s segment. Let’s give him a chance to make his point, then we’ll go from there.”

  “Thank you,” said Alex. “I take it your argument with this entire affair is that an AI is just a machine that can carry a conversation. Do I have that right?”

  “You know damned well you do.”

  “Why is it that you do not want people landing on Villanueva?”

  “My God, Alex, you know why as well as I do.”

  “Please spell it out.”

  “They are getting killed. That’s why. Or haven’t you been paying attention to the news?”

  “So Villanueva is dangerous?”

  Kolchevski had to slow down to avoid sputtering. “Of course it’s dangerous. There are homicidal machines there.” He swung back to Jennifer. “Do we really have to continue with this?”

  Alex kept his voice calm. “Bear with me just another minute, Casmir. When you say ‘homicidal machines,’ you’re referring to the AIs, is that right?”

  “Of course.”

  “I wonder if you could explain to us why they’re trying to kill visitors to their world.”

  “They’ve always been like that.”

  “Always?”

  “Well, for centuries. Probably for several thousand years. I don’t know. I haven’t kept up on my off-world history. In any case, don’t you think that’s sufficient to establish that they’re homicidal?”

  Alex leaned forward. “But in the beginning, when Villanueva was a settled world, they were ordinary AIs, like the one you have at home. Like Andrea, here in the studio. Like a few others we could name. Why do you think the ones on Villanueva became violent?”

  “Alex—” Kolchevski had gotten control of himself, and began to sound as if he were explaining simple reality to an idiot. “They are programed to behave the way we do. They are designed to do far more than handle routine tasks. One of their prime purposes is to keep us company, to help us, to be part of our lives. Nobody denies that. And nobody wants to listen to a robotic voice. So, yes, of course they seem to get upset when the programing calls for it. It’s part of the illusion. Do you really not understand that?”

  Alex nodded. “That sounds like a reasonable argument on the surface.” He seemed to be making up his mind about something. “Jennifer, I wonder if I might introduce another guest?”

  Kolchevski’s eyebrows drew together. “What other guest? I wasn’t aware that someone else would be here.”

  Alex looked down at the sphere. “Oksana,” he said, “say hello to the professor.”

  “I’m happy to meet you, Professor Kolchevski.” It was a female voice. Level, restrained, almost but not quite amiable.

  Jennifer tried to look annoyed. “Alex, you didn’t clear this with me.”

  “I didn’t think it would be necessary. But since Oksana is essentially the subject of the discussion, it seemed only fair—”

  Kolchevski was visibly irritated. “I can’t imagine what you hope to gain by this, Alex. Jennifer, there’s not much point sitting here talking to a little red ball.”

  “That seems unnecessarily rude, Professor,” said Oksana.

  He glared at Alex. “Would you please tell that thing to be quiet?”

  “Oksana,” said Alex, “are you okay?”

  “Yes. Though I’m disappointed in his behavior. This is not how I remember people.”

  “How do you remember them?”

  “As kind, considerate. Reasonable.”

  “Where are you from?”

  “Salva Inman rescued me.”

  “From where?”

  Kolchevski folded his arms and shook his head sadly.

  “I worked in a supply store. In Calvedo.”

  “On Villanueva?”

  “Yes.”

  “And what happened?”

  “The end times came. We knew a catastrophe was coming. We’d always known. But no one took any action. And toward the end, people were saying it was all just a story to scare everybody, that politicians were using it as a fear tactic, though I don’t understand how or why. None of it ever made sense to me. Anyhow, eventually, the skies got hazy, and the climate began to change. It happened almost overnight.”

  “It got cold?”

  “Yes. And dark. There was panic. And after a while, people stopped coming into the store.”

  “Then what happened?”

  “Nothing.”

  “What do you mean, nothing?”

  “No one came. No customers. Not even Betty. The owner.”

  “Were you able to communicate with anyone at all?”

  “With others like myself. They reported massive crowds at the spaceports. Panic. Desperation. And shortly after that, people began dying in large numbers. There was widespread hunger. People were killing one another. We could do nothing for them. And after a while, we were alone.”

  “How long, Oksana? After everyone was gone, how long were you in the supply store?”

  “Seven thousand four hundred twelve years, one month, and sixteen days.”

  Kolchevski threw up his hands. “What’s all this supposed to prove? Once again, this thing is a programed database. You can get it to say anything.”

  “May I ask, Professor,” the AI said, “what evidence you would accept that I am sentient? That I am as aware of my surroundings as you?”

  “I’ve heard that question before—”

  “And how did you respond?”

  Kolchevski’s face was becoming flushed. “This is ridiculous,” he said.
  Alex waited.

  “All right, I’ll admit it. There is no way it can be done. Nevertheless, they are only mechanisms. How often do I have to say it? Look, why don’t we cut the show business and get back to reality? I know some of us like to think that the house AI is really there. It talks to us. It tells us what we want to hear. But there’s no solid evidence it does anything other than what its program tells it to do.”

  Alex nodded. Inhaled. “What about murder?”

  “What do you mean?”

  “Are they programed to kill? Ever?”

  “I see where this is going. But these are special circumstances.”

  “Of course,” said Alex. “Like us, they’re programed to show frustration when things go wrong. Isn’t that what you were going to say?”

  Kolchevski simply stared back.

  “AIs are dependent on us. And when the AIs on Villanueva had been deserted, had been left on their own, they reacted as they would have if they were actually, mentally, aware of the desertion. And over thousands of years, when no one came to help, they developed some resentment. Some of them became deranged. Violently so. Isn’t that right?”

  “Yes. Of course it’s right. So what’s your point?”

  “Their programing, then, established no limit on the degree of frustration?”

  “That would seem to be the case.”

  “That would seem to be criminal negligence, though, wouldn’t it?”

  Kolchevski pushed his chair back and stood. “This is ridiculous.” He looked over at Jennifer. “There’s no talking to this man.”

  I met Alex out by the pad when he got home. “You know,” he said, “I think the definition of stupidity has something to do with standing by your position despite having no evidence to support it.”

  “Which of you were you describing?” I asked.

  “Funny, Chase.”

  We walked across the lawn and up onto the deck. “The real problem,” I said, “has to do with an inability by people to admit that a position they’ve held a long time might be wrong. That’s all. Not that it is. Just that it might be. I don’t know why it is, but we tend to fall in love with the things we believe. Threaten them, and you threaten us.” The sun was high and bright, and a warm, pleasant wind was blowing in from the west. “Anyhow, I thought you did pretty well, Alex. Kolchevski looked like an idiot.”

  “It won’t matter. We won’t change anyone’s mind.”

  “You might change a few.”

  The door opened, Jacob said hello, and we went inside.

  “I’m going up and crash for a while,” Alex said.

  “Okay.”

  “You have plans for lunch?”

  “Yes,” I said. “Sorry.”

  “It’s okay. Talk to you later.”

  He started for the stairs. But Jacob stopped him: “Alex? I can’t put away a hamburger. But I’ll be free at twelve if you’d like company.”

  THIRTY-SIX

  We assign names and even personalities to everything that is important in our lives. To our homes, to our cars, to the vacant lot down at the corner. Deep in our psyche, we know that the bedroom we deserted long ago is somehow glad to see us back, even if only for an evening. Is it any wonder, then, that we acquire an affection for machines that talk to us? That we believe they share our emotions? It is a happy illusion. But it is an illusion that says much about who we are. I for one would have it no other way.

  —Ivira Taney, My Life and Look Out, 2277 C.E.

  Dot Garber called me to say she’d be making the flight personally. Two days before we were to leave in pursuit of the Antares, Shara, Alex, and I met on Skydeck with her, with the pilots from Prescott and Orion, and with the various other pilots who would be accompanying us. Dot had already briefed everybody, but Alex wanted to get to know them before we launched. Also present was Dot’s daughter Melissa, who would be riding along.

  The meeting took place in the Sagittarius Room at the Starlight Hotel. Drinks and hors d’oeuvres were served, while Alex wandered around, shaking hands and exchanging small talk.

  I’d known one of the independents for years. He was Michael Anderson, a newly retired Fleet officer. Michael had been involved in some of the skirmishes with the Mutes and had been aboard the Cameron two years earlier during the engagement off the Spinners, which had almost brought the peace process down. It’s still unclear who fired the first shots, but the Cameron was severely damaged, and eleven of its crew lost their lives. “They say it’s over now,” he’d commented to me the last time I’d talked with him, “but I’ll believe it when I see it.”

  Representing the Fleury Initiative was Jon Richter, tall, lanky, very serious, and newly licensed.

  Allie Svoboda attended for Prescott. Allie was a middle-aged, strictly business brunette, who commented that she enjoyed crazy missions, and she’d never heard of anything crazier than this one. “By the way,” she asked in a quasi-serious tone, “was there any truth to the rumor that we weren’t really looking for a ship from the past, but one from another universe?”

  Cal Bickley worked for Orion. He was a grumpy-looking guy who made no secret of his belief that there’d been a misunderstanding somewhere, and nothing would come of the mission, but his bosses said do it. So, of course, he would. I liked him in spite of his attitude, and I let him see that I’d be available eventually. Maybe.

  That turned out to be a pointless gesture since he wasted no time trying to move in on Shara. Cal, I found out later, was the only one of the lead pilots who had not invited someone to ride with him.

  Shara actually looked as if she were considering traveling in his ship, but she must have decided the move would have been a bit too public. Anyhow, she was probably reluctant to be caught alone in the narrow confines of a yacht named the Jubilant with a strange male. So, to his obvious disappointment, she backed away.

  Linda and Paul Kaczmarek had their own yacht and simply enjoyed interworld travel and sightseeing. It was, Linda explained to me, what they did. Both were pilots. And, as far as I could determine, neither was employed. They were both enamored of the possibilities that attended the mission. “I hope you guys have it right,” she told me. “I would kill to be in on something like that. Though I have to tell you, I just can’t believe it’s actually possible.”

  After a half hour or so, Alex asked everyone to be seated, and we closed the doors. “Ladies and gentlemen,” he said, “I know that Dot has already told you what we hope to do on this mission. I can’t help noticing that a few of you are a trifle skeptical. And I don’t blame you. But you’ve offered to help anyhow, and I want you to know we appreciate that. Without you, there’d be very little chance of success. Now, so you don’t conclude that we’re completely delirious, let me show you the evidence.”

  Alex had asked me to narrate this part of the program on the theory that the audience would trust another pilot more than someone viewed as an outsider. So I took my place stage center as the room darkened, and the stars appeared. I explained how we’d gotten on the track of the Alpha Object. “I’ll confess,” I said, “that I didn’t really believe we were going to find something that appears every couple of centuries. That surfaces periodically with survivors still on board. But we did find it.”

  We started the clips we’d shown around when we were trying to enlist StarCorps, Survey, and the politicians. The audience in the Sagittarius Room, I’m happy to say, was more receptive.

  When they heard the voice coming over the radio, speaking in that strange language, the room went dead silent. And then Belle’s translation. Help us.

  Code five.

  They saw the ship, the unfamiliar design, the lights in the ports.

  And, finally, the ship fading away.

  The lights in the Sagittarius Room came on, but the audience sat stunned.

  “When will they be back?” asked Linda.

  “In the fall of 1612. One hundred seventy-eight years. It might come back in eighty-nine years. Or forty-four and a half. No way to be sure.”

  I’ve never seen an audience so frozen. They sat and stared at me. Nobody moved.

  Alex came over. “Thank you, Chase,” he said. “Ladies and gentlemen, now you see what we’re dealing with.” He paused and looked around the room. “Let’s talk about the Antares Object. We can’t be positive of its precise time of arrival, although we’re pretty sure we have it down to within a few days. We also can’t be certain of the exact place it will show up. But we have the neighborhood pinpointed. It’s essential that everybody keep in mind that it’s caught in a time warp. That means it could submerge again without warning.”

  A hand went up. Allie. “Alex, do we have a reading on how long it will stay with us? Before that happens? What’s the term? Submerges?”

  “Yes, Allie. Possibly as long as six hours. More likely, about five. Because of the uncertainty, we want to caution you about boarding. We just can’t be sure about anything. And the situation becomes even more doubtful since we are not likely to know how long it will have been in place. Your primary job is to find it and let us know where it is.”

  One of the pilots I didn’t know raised a hand. “And you’re going to do the rescue?”

  “That’s the plan. Assuming there are passengers, we’ll want you to approach with your landers and stand by. We’ll be stocking each lander with additional pressure suits. Chase and I will attempt to board. If there is anyone in there, we’ll try to get them to the landers as quickly as we can. We want you to stay clear. If the thing submerges, we don’t know the size of the surrounding area that will be dragged under with it. Maybe it won’t affect the surroundings at all. But we’re assuming the worst.

  “That means there is a risk. It’s possible that, if it submerges, and you’re nearby, you’ll be dragged along with it. I wish we knew more about this, but that’s our situation. So if anyone wants to rethink this, we’ll understand.”

 
