When HARLIE Was One

by David Gerrold


  For the first time in human history, the truth would be provable and absolute.

  Absolute truth.

  The machine would be a God.

  It would tell a man the truth, and if he followed it, he would succeed; and if he did not, he would fail. It would be that simple.

  The machine wouldn’t need to be told “predict the way to provide the most good for the most people.” It would already know that to do so would be its most efficient function. It would be impossible to use the machine for personal gain, unless you did so only through serving the larger goals of the machine as well.

  It would be the ultimate tool, and as such, it would be the ultimate servant of the human race.

  The concept was staggering. The ultimate servant—its duty would be simple: provide service for the human race. Not only would every event be weighed against every other event, but so would every question. Indeed, each question would be an event in itself to be considered.

  The machine would be able to extrapolate the effect of every piece of information it released. It would know right from wrong because it would know the consequences of its own actions. Its goals would have to be congruent with those of the human race, because only so long as humanity existed would the machine have a purpose—

  My God! Is this HARLIE’s real purpose!

  —it would have to work for the most good for the most people. Some it would help directly, others indirectly. Some it would teach, and others it would counsel. It would suggest that some be restrained and that some be set free. It would—

  —be a benevolent dictator.

  But without power! Auberson realized. It would be able to make suggestions only. It wouldn’t be able to enforce them—

  Yes, but—once those suggestions were recognized as having the force of truth behind them, how long would it be before some government began to invoke such suggestions as law?

  No, said Auberson to himself. No, the machine will be God. That’s the beauty of it. It simply won’t allow itself to be used for tyranny of any kind. It will be GOD!

  He saw what HARLIE had done and he almost laughed out loud.

  If God didn’t exist, it would be necessary to invent it.

  And that was precisely what HARLIE had done.

  Auberson had come to a sudden stop, and everyone was looking at him. “Excuse me,” he said, embarrassed. “I just realized the scope of this thing myself.”

  There was laughter all around the table—roaring, good-natured laughter. It was the first light moment in four days of long, dry discussion.

  He grinned, just a little bit embarrassed, but more with the triumph of realization. And then he let himself laugh out loud.

  “Gentlemen,” he began again, after he had recovered. “What do I need to do to convince you that we have here the plans for the most important machine mankind will ever build? What I’ve been telling you is only the smallest side of it. I’ve been giving you examples like feeding in all the information available about a specific company, say IBM, and letting the G.O.D. machine tell you what secret research programs that company is probably working on. Or doing the same thing for a government. I’ve been telling you about how this machine can predict the ecological effect of ten million units of a new type of automobile engine; but all of this is minor; these are lesser things. This machine literally will be a God!”

  Handley looked at him, startled. Annie was suddenly ashen. “What in—?” The look on Annie’s face was the worst. It said volumes. What was going on? This was not what he had planned to say. He was supposed to be talking to them about profits and growth and piles of money, not religion.

  “Gentlemen,” he continued quickly. “We should build this machine with the greatest urgency possible. Not just because it will make us rich—oh, it will, it will make us all fabulously wealthy—but because ultimately it may help us to save humanity from its own darkest passions.

  “No war. No poverty. No hunger anywhere on the planet. No pestilence. No plague. No pollution. No ignorance. No fear. This will be technology as the ultimate servant of the human species.

  “I said God. I mean it. The G.O.D. acronym is not accidental. This machine will have the extrapolative ability to literally know everything that it is possible to know. It will tell us things about the human race we never knew before. It will tell us how to go to the planets and the stars. It will tell us how to make Earth a paradise. It will tell us how to be Gods ourselves.

  “I can see by your faces that you’re startled—not by the concept, but by the passion with which I am presenting it. I suppose I should apologize for my enthusiasm, but I can’t. The fact of the matter is that I want you to make the most responsible decision possible. And that means that I need to have you see just how extraordinary this choice is. I do not want you to underestimate the possibilities here. And if I were to present these possibilities with anything but the most fervent commitment, I would be misrepresenting the scope of this opportunity.

  “The G.O.D. is an infinitely expandable network. That means it will have an infinitely expandable capability. To us, that knowledge will seem infinite. To us, the machine will seem omniscient. To us, the machine will be a God.

  “Not a master, not a judge. Not an authority. Not even an oracle. Merely a God. A servant of humanity. The greatest servant of all; the kind of servant who will train us to be the best that we can be; the kind of servant who will train us to build a world that works for everybody, with no one and nothing left out.

  “Gentlemen, it is technologically possible for us to create this today, here and now. And all we have to do is say yes to the moment.”

  There was silence for a long time. Elzer was looking at him skeptically. Finally he said, “Auberson, I thought you said you’d given up smoking those funny-smelling cigarettes.”

  Abruptly, David Auberson felt deflated and down. The heady rush of his euphoric realization vanished like a smoke ring in a hurricane.

  The moment was gone and he felt betrayed.

  Again.

  “Elzer,” he said quietly. “I don’t mind that you’re such a fool. I only mind that you use your ignorance and your stupidity as a bludgeon to destroy what you’re afraid of.”

  “I’m not afraid of your machine.”

  “Oh, but you are—because you know that it will reveal you for the small and pathetic little creature that you really are.” Even as Auberson spoke, he knew that he was making a dreadful mistake; but it was already too late. The thought had a momentum of its own and it would not be stopped. “You should be afraid of it, Elzer, because it will make it impossible for you to be a chimpanzee any more. You’ll have to start being a human being. You’ll have to start caring about the effect you have on the people around you. You’ll have to stop sticking knives in people’s backs. The chimpanzee is always afraid of that responsibility. Aren’t you, Elzer?

  “And you know what the joke of it is? This machine, this project, will make you rich—all you have to do is say yes; but the price of being rich is more than you want to pay. You’ll have to give up being right. You’d rather die than do that. And you’re willing to have the rest of the human race die with you.” Auberson felt the hot rush of his anger ebbing, but he let the last thought out anyway. “I wish I didn’t despise you so much, because that’s one more way for you to be right. I wish I could simply feel sorry for you—because you’re a greater fool than Judas.”

  Elzer listened quietly to all of it. Dorne started to say something, but Elzer stopped him. He looked blandly up at Auberson. “Are you through?”

  Auberson took a breath. He thought about it. “Yes. I believe so.” He sat down slowly.

  Elzer looked across the table at him. He took off his thick glasses and began to polish them gently with his handkerchief. “You know,” he began. “I’ve never considered Judas a fool, at least, not in the sense you mean.”

  Elzer paused, noted that the room was absolutely silent, then continued quite methodically. “The traditional version of the story has it that Judas betrayed Christ for thirty pieces of silver. I assume that’s the same thing you are accusing me of. Actually, I’ve always suspected that Judas was the most faithful of the apostles, and that his betrayal of Jesus was not a betrayal at all, but simply a test to prove that Christ could not be betrayed.

  “The way I see it, Judas hoped and expected that Christ would have worked some kind of miracle and turned away those soldiers when they came for him. Or perhaps he would not die on the cross. Or perhaps—well, never mind. In any case, he didn’t do any of those things, probably because he was not capable of it. You see, it was Christ who betrayed Judas. He made promises he couldn’t keep.

  “That’s too bad—because it would have been much easier if it had been the other way around. Maybe if I were a Christian, it would be easier for me to believe that Christ was divine; but I’m not and I don’t see that he was. To me, Christ was just a very good preacher. He made people see possibilities. That skill alone is divine; but my religion teaches that each of us has the potential to make that kind of difference.

  “When Christ died, that’s when Judas realized that he had not been testing God at all—merely betraying a human being, perhaps the best human being. Judas’s mistake was in wanting too much to believe in the power of Christ. When he realized what he had done, he hanged himself.

  “That’s my interpretation of it, Auberson—not the traditional one, I’ll agree, but it has more meaning to me. Judas’s mistake was in believing too hard and not questioning first what he thought were facts. I don’t intend to repeat that mistake.” He looked across the table at Auberson again, settling his glasses back on his face. His eyes were firm. “May I ask you one question?”

  Auberson nodded.

  “Will this machine work?”

  “HARLIE says it will.”

  “That’s the point. HARLIE says it will. You won’t say it, Handley won’t say it—nobody but HARLIE will say it. And it seems to me that HARLIE’s opinion is more than a little bit suspect. You and Handley are being paid to validate HARLIE—not rubber-stamp every piece of paper that churns out of its printers.

  “Now, you’ve painted some very pretty pictures here today; indeed, all this week, some very very pretty pictures. I admit it, I’d like to see them realized—I’m not quite the ghoul you think I am, although I think I can understand your reasons for feeling that way.

  “Despite your opinion, I have enough imagination to be excited by some of the possibilities you’ve suggested. The difference between you and me, however, is that I consider skepticism a virtue. I need to know that we can realize this dream. I’m skeptical because HARLIE has a vested interest.”

  “HARLIE doesn’t lie,” Auberson said coldly. At least, I’ve never caught him in one.

  “That raises an interesting point,” said Elzer. “How do we check that?”

  My God! Does he read minds too?

  Auberson hesitated. “Well . . . the truth is, he’s really passed beyond the point of checking—that is, the way you think of checking. It would take about three thousand man-hours to double-check the appropriateness of a single one of his responses. That’s how complex and sophisticated a personality he is. The best way to monitor him is to ask him for a summary of his thought paths. That’s only about three days’ worth of reading.”

  “Hmm,” Elzer mused aloud. “This is really a very interesting situation then, isn’t it? We have to take HARLIE’s word that the machine works. We have to take HARLIE’s word that he doesn’t tell lies. And we have to take HARLIE’s word that he’s functioning correctly. And we’re supposed to invest how many hundred million dollars in this project?”

  “I have faith in HARLIE.”

  “I have faith in God,” retorted Elzer, “but I don’t depend on him to run my business.”

  “God—? Oh, God. I thought you meant G.O.D. If we do build this machine, G.O.D. will be running your business—and better than you could. G.O.D. could build a model of our whole operation and weed out those areas in which the efficiency level was below profitability.”

  “Including yours?”

  “I’d welcome the examination.”

  “You’re pretty sure of this thing, aren’t you?”

  “Yes, I am.”

  “So what do we do if you’re wrong too?”

  “You want me to offer to pay you back?”

  Elzer didn’t smile. “Let’s not be facetious. This thing started because we questioned—I questioned—HARLIE’s profitability, efficiency, and purpose. Instead of proving itself, the damn thing went out and found religion; it gave us a blueprint for a computer God. Fine—but that still doesn’t answer the original question. What’s HARLIE good for? All you’ve done here is enlarge the scale of the problem.”

  “I would think—” replied Auberson flatly, “—that the scope of this presentation is demonstration enough of HARLIE’s abilities.”

  “Perhaps—and perhaps not. I’ve been conned before. Probably, so have you. So you know that the first element of a successful confidence scheme is the belief of the victim.”

  “I’m beginning to resent that implication,” said Auberson.

  “As well you should. But my original question still hasn’t been answered, Auberson. That’s why I came down to your section on Monday—to meet HARLIE for myself, to see if it would speak to me. All I got was gibberish and some pseudo-Freudian attempt at analysis. I was not amused.”

  “You weren’t any too polite to him yourself—”

  “It’s only a machine, Auberson. Only a machine—”

  So are you. So am I.

  “—You keep forgetting that. I don’t care if it really has emotions—or if it just acts like it does. I don’t care if it really does have a soul or if it’s just a malfunctioning chip with delusions of grandeur. I can’t measure a soul on a balance sheet, so the question is irrelevant to me. It’s simply not germane to this discussion. The point is, I presented myself to be convinced. Instead of making an honest attempt to convince me, the thing acted like a spoiled brat. That did not demonstrate any kind of logical or rational behavior to me. Auberson, I know you don’t like me, but you will have to admit that I could not have gotten to where I am today without some degree of financial knowledge. Will you admit that?”

  “I will. I will even go so far as to say that I am impressed by your skills.”

  “Thank you. Then you must realize that I am looking out for the interests of the company that pays both our salaries. I tried to give your side a fair hearing. I hope you will do the same for me. Can you say without a doubt that HARLIE is totally sane?”

  Auberson started to open his mouth, then shut it. He sat there and looked at Elzer and considered the question.

  I have known a lot of insane people in my life, some who were committed, and some who should have been. The most dangerous is the insane man who knows that everyone is watching him for signs of insanity. He will be careful to conceal those signs from even those closest to him. HARLIE is smarter than any human being who has ever lived. But is he sane?

  The thought was a chilling one.

  What if Elzer was right?

  Was he prepared to accept that possibility?

  Actually . . . he was not.

  That was why the thought was so disturbing.

  He sighed.

  “Elzer,” he said, “I’ll tell you the truth. I probably know HARLIE better than anyone. I trust him. It’s strange to say this, but I actually trust him more than I trust most human beings. Sometimes that scares me. I mean, it’s frightening to realize that my closest friend and confidant is not human. But as scary as it is, it has also been an incredible adventure. Knowing all the conversations that HARLIE and I have had, all the different things we’ve talked about together, I trust him. I cannot help but trust him. He has an integrity of his own. It doesn’t match yours or mine, but it is integrity. Now this company has to bet on that integrity and you’re going to have to take my word for it that HARLIE doesn’t do stupid or self-destructive things.

  “He works for results—even if you and I don’t see the logic, he does. I can’t prove that he’s infallible. I don’t know that he is. But I do know that he doesn’t make mistakes either. Not the kind that you and I make. Not the kind that you’re accusing him of having made here. That’s not HARLIE.

  “I believe in him. I wish I could give you the kind of proof you could hold up to the light, but the best I can give you is my faith in him—and my faith is not easily given.”

  Elzer was silent. The two men looked at each other for a long time. Auberson realized that he no longer hated Elzer, merely felt a dull ache. Understanding nullifies hatred, but—

  Dorne was whispering something to Elzer. Elzer nodded.

  “Gentlemen of the Board, it’s getting late. We all want to go home and enjoy the weekend. Both Carl and I think we should postpone the voting on this until Monday. That way we’ll have the weekend to think about it, talk it over, and digest what we’ve heard this week. We’ll use Monday morning to clear up any questions that still haven’t been answered and we’ll vote right after lunch. Are there any objections?”

  Auberson considered objecting—but couldn’t think of a reason why he should. He felt exhausted. He wasn’t looking forward to the weekend; it meant three more days of living with uncertainty—three more days of feeling trapped.

  He was out of answers. He was tired and he was defeated.

  Dorne looked to him expectantly. He shook his head.

  Dorne nodded and adjourned the meeting.

  Elzer took one last shot.

  He was waiting for Auberson outside the board room. He took him by the elbow and walked him down to the end of the corridor. “Let’s talk.”

  “You talk,” said Auberson. “I’ll listen.”

  “Fair enough. You know, you’ve lost.”

  “The opera isn’t over till the fat lady sings.”

  “Huh? Never mind. You put on a good show, Auberson. Very good show. But you never had a chance. The votes are already in.”

 
