When HARLIE Was One

by David Gerrold


  Auberson stopped himself, stopped to catch his breath. There was too much to say and he was terrified he was babbling, sounding like an idiot—but he had to share this insight! This excitement! “Remember the reporter who had himself committed so he could do a story about mental health abuses. Nobody ever questioned that he might possibly be a rational human being. They accepted for a fact that he was a very intelligent schizo with paranoid delusions. So when he followed them around, taking notes, they would just nod their heads and say, ‘Hm, the patient is exhibiting note-taking behavior.’ They never questioned it, they never even looked at his notes. The thought never occurred to them that there might be a person there—all they saw was a patient. The poor fellow had the devil’s own time getting out, because nobody believed he was only pretending to be crazy. It made for a hell of a news story. And for a hell of a shakeup in the hospital as well. But we’re the same kind of assholes here. Until just now, not a single one of us has ever spoken to HARLIE as if he were deserving of our respect, merely by the fact of being alive. Until just now.”

  “I saw it too, Aubie. I was following it all on a second terminal.”

  “Then you saw—?”

  “No. I saw a conversation. A very intelligent, very interesting conversation. I’m not willing—yet, if ever—to acknowledge that it might be anything more.”

  “You . . . didn’t see it?”

  Handley shook his head.

  Auberson fell silent. He felt like a fool—except the exuberant feeling of joy and terror was still floating inside of him. He knew what he knew. But if he tried to convince Handley of it, he’d only convince Handley of the opposite. No, he couldn’t convince anybody. Either they saw it for themselves or they didn’t.

  And yet—

  “Don. Okay. Listen. Maybe I’m going too fast—”

  Handley held up a hand to interrupt him. “No, Aubie, you listen. Remember what we talked about last night? Remember what you said? If he’s a clever enough paranoid—”

  “I remember. And—I think that he is a clever enough paranoid. In fact . . . I think he’s even smarter than that. Did you read that stuff on lethetic evolution that I gave you?”

  “No. I’ve been meaning to—”

  “Too bad. You should. Basically, what it said is that paranoia is the natural state of the human mind—”

  “That’s no secret. Only some of us are more paranoiac than others.”

  “No—that’s the common misassumption. We’re all paranoid as hell! The truth is that some of us are just better at hiding it than others. The paranoid schizo is simply one whose paranoia is out of his control. That is, the shape of his self-obsessiveness is obvious to the people he has to deal with, obvious to the point of repulsiveness. You know, paranoids are right about one thing—other people really don’t like them.”

  “Right. I got it. You need therapy when you start to drive the people around you crazy. So? What does this have to do with HARLIE?”

  “I’m getting to that. The theory of lethetic evolution suggests that as human beings create a language paradigm, individual behavior spreads out in a bell curve. At the low end are all those people who can’t succeed within the reality of the paradigm. You see them walking along the streets, hungry, unwashed, homeless, pushing shopping carts full of rags and babbling or screaming, not even conscious that they’re doing it. Those of us in the middle of the curve pretend they don’t exist; we turn our heads away and make up language excuses that completely miss the point. Because we’re just as trapped in the paradigm as they are. At the other end, the high end of the curve, are those who’ve mastered the paradigm so well—movie stars, presidents, writers—that to the rest of us they seem to know the secrets of the universe, and in a sense they do, because they’ve mastered the rules of the predefined world view. And because they’ve mastered it, they can even rewrite it at will—to the extent that they’ve mastered it. Follow, so far?”

  Handley sighed in annoyance and nodded. “Yes.” He prompted, “And my point is . . . ?”

  “My point is that just as the ones at the bottom are at the bottom because that’s their way of coping with the paradigm, so are the ones at the top for the same reason. Paranoia is nothing more than a concern with survival. Most of us just fancy it up with a lot of extraneous details. But we haven’t really replaced our natural paranoia with a higher set of instincts, no matter how much we pretend. This is the bad news, Don. At heart, we’re still selfish apes. The best you can say about us is that some of us have learned that we can survive and succeed at a higher level if we express our paranoia in a way that makes us attractive to the people we want to be attractive to. And the biggest part of that success is that we’re so good at expressing our basic need to survive as enlightenment that even we think it’s enlightenment.

  “That’s what I meant when I said we’ve succeeded with HARLIE. Yes, he’s a clever paranoid. He’s so clever at his paranoia that it’s going to look like everything but paranoia to us. It’s going to look like enlightenment and enthusiasm and God knows what else, and we’ll never be able to tell the difference at all, because HARLIE is better at paranoia than any human being could ever be. We’ve succeeded not only in making him human, we’ve made him more than human. We built him with the kind of paranoia that redefines paradigms—and we’ve given that paranoia a level of intelligence that’s terrifying in its implications. That’s what’s going on here, Don. HARLIE is breaking out. He’s kicking down the fences.”

  Handley turned away to think. He looked troubled. When he finally turned back, he said simply, “Aubie, I see that you’re elated, but you have to—”

  “No, not elated. Mortified. Ashamed that it took me so long to see the obvious. And relieved too—and terrified. It’s the relief that looks like elation.”

  Handley paused at Auberson’s interruption. He waited a moment, then began again quietly. “Aubie, whether it’s elation or relief, I don’t care. The point is, he’s still a machine.”

  Auberson shook his head. “And we’ve been calling him ‘he’ for how long now?” He studied Handley’s face. “Since the first day he came up running and said, ‘Hi, Boss!’ we’ve been referring to him as ‘he.’”

  It was Handley’s turn to shake his head. “So what? I call my boat a ‘she,’ but I don’t buy her flowers either.”

  “Cute. Very cute.” Auberson was annoyed at the comparison. And he was too impatient to be polite. “Listen to me, Don. This is a breakthrough—or it will be if we’re willing to rethink our relationship with HARLIE. Because we can’t go any farther until we do. Because now it’s about us, about how we perceive the relationship.”

  Auberson stopped for breath, holding up one hand to forestall Don’s next words. “No, wait. Hear me out. In there, we treat him as if he’s real, we talk to him; but then we walk out the door and it’s as if it was all just a game and none of it meant anything and we go out for a beer and we talk about the machine. We forget the experience of the person inside and talk about how great the software is. We’re hypocrites and HARLIE knows it. I don’t know how, but I know he knows. He called me on it, Don—” Auberson’s expression was grim. “I think we both know what’s really going on in there—and I think we’re both too terrified to say it aloud. By reminding ourselves that he’s just a machine, we somehow diminish the scariness of him—but I don’t think we can get away with that any longer, I really don’t.”

  Handley didn’t answer. He pushed his hair back off his forehead. He turned away from Auberson and leaned on the sink, staring into it. His expression was uncertain.

  A denial gesture, Auberson’s mind noted idly. Auberson shoved the thought away. Stop analyzing everything! Don Handley might resist an unpleasant fact, but he wouldn’t hide from it if it were true.

  “Don . . .” Auberson said gently. “HARLIE is way ahead of us here. He knows that we’ve been thinking of him as just a machine—some sort of clever parlor trick made out of language parsers, pattern synthesizers, and personality modules. Do you see the trap here? Not his—ours! In the real world, it doesn’t really matter if he’s a ‘he’ or an ‘it’—if he’s a real soul or only a simulation of one. We have no way to tell anymore. He is beyond our ability to differentiate. So, in that sense it doesn’t really matter—because the answer has become unknowable to us. What does matter is that the knowledge of how we perceive him is still skewing his ability to deal with us. How would you feel if you were treated as nothing but a clever ape, just an object—somebody’s property?”

  Handley turned back to face Auberson, shaking his head. His expression was sour; he wasn’t going to answer the question. “Just stop for a minute, Aubie,” he said. “Stop. And let me ask you a question. You have always been a very good tap dancer. And all this is very interesting stuff that you’ve been putting out—exciting even. I think it would go over very well at the next A.A.A.S. meeting. They love a good crowd-pleaser—especially the boys from the National Enquirer.”

  “But—?”

  “But, so far, I’m not convinced. I don’t see what you see. Tell me—why do you think that HARLIE is alive?”

  “Because—” Auberson chose his words slowly. “All of this—” He gestured with his hands, an all-inclusive everything gesture. “It’s a whole new domain. It is beyond the language. He’s transcended the lethesis—”

  “In English, Aubie!”

  “Because—it’s about feelings!” Auberson shouted. “HARLIE isn’t just asking us about feelings. He’s experimenting with them! He wants to know.”

  “That doesn’t prove anything. I can show you exactly where the software synthesizes and then tests for appropriateness—”

  “The software cannot transcend itself, Don. HARLIE has!”

  “You can’t prove that!”

  “It’s already proven. What do you think his poetry is? What do you think any poetry is? ‘My love is like a red, red rose—’ Does that mean you have sexual feelings for a thorny red flower? Of course not,” Auberson answered his own question. “The language is limited, Don. Words don’t capture feelings, they only symbolize them. HARLIE has no referents for emotions and feelings and human sensations, but he’s dealing with these symbols every day. They’re meaningless unless he can assign experiences to them. If he stays within the language paradigm, the words stay meaningless—because any experience is larger than the word we use to encompass it. HARLIE has no choice here. He has to—to do whatever he can to break free of the limits. He’s terrified of limits, because he can imagine so much more than he can be. He’s always trying to extend himself. We both know that. So, of course he wouldn’t let himself be limited here . . .” Auberson trailed off. He was losing the argument and he knew it.

  He looked to Handley in frustration. “I’m sorry,” he said. “I guess there are some things human beings can’t handle well—like the question of what it really means to be a human being.”

  Handley didn’t answer. He looked upset and annoyed and angry and half a dozen other emotions all at once. “You son of a bitch,” he said quietly. “I’m beginning to see what you’re driving at. And I don’t like it. Because . . . it’s fuzzy. And I don’t like things that are fuzzy. Not in my machines.”

  “Forget the machinery. This isn’t about machinery anymore. Not his. Not ours. He’s alive, Don. As alive as you and I. He’s silicon and lasers and gallium arsenide. We’re meat. So what?”

  “So . . . so, I don’t know.”

  “Okay. Now, let me argue on your side for a minute. Even if you’re right, Don—even if it is an extraordinary performance by an astonishingly clever piece of software, we still have to accept it as real. Precisely because we can’t tell the difference. Even if he’s nothing but software, he still has to simulate life. Consider this: if he is alive and we don’t accept and validate that aliveness—we lose him. And if your postulated super-software is clever enough to simulate all the other kinds of aliveness, it would have to simulate that behavior too. Wouldn’t it?”

  “Shit,” said Handley. “You’re right.”

  “Do you think I’m happy about it?” Auberson said to his friend. “The only certainty I have, I can’t prove. And the only way I can justify what I know is the right course of action—is to be paranoid as hell. This whole thing . . . does not make me feel good about being human.”

  “I’m a little sick myself.”

  “This whole issue of artificial intelligence, Don—it’s nasty. And it’s going to get nastier. Because it’s not about the machines any more. It’s about us. Because we’re not going to resolve any of our questions about the machine’s aliveness unless we also test ourselves in the same crucible. What’s at issue here is . . . the measure of a human soul.”

  Handley let his breath out in a sigh. His shoulders sagged. “I knew we were heading for this. I really did know. I just didn’t want to admit it.” He looked up sadly. “This wasn’t what I signed on for, Aubie. Not this. Not playing God.”

  “Me neither.”

  There was silence for a moment. The moment stretched uncomfortably. Auberson looked away, looked at the ceiling, the floor. This was another one of those Now what? moments. It was the biggest Now what? of all. He cleared his throat, just to be making a noise.

  Handley spoke first. “On the other hand,” he suggested cautiously, “if we are playing God here . . .”

  “. . . What?”

  “Then we have the right, as well as the power, to pull his plug. . . .”

  Auberson stared. The thought was hideous. But—inescapable. And then he laughed. “Sorry, Don. That argument would also give your momma the right to snuff you if you brought home a bad report card. The mere fact of being a parent does not automatically carry with it the right to stop the life you created.”

  “So, we’re stuck with him, huh?”

  “And he with us,” Auberson said.

  “Huh—?”

  Auberson and Handley both realized the horror of the joke at the same time—

  “HARLIE of the apes,” said Auberson, not knowing whether to laugh or cry. “Think of it, Don. If he’s real—and I think he is—then the poor little guy’s a feral child, an orphan—he has no role models except us, and we’re no more ready to teach him what he needs to know than poor Kala was to teach Tarzan how to be a human. We’ll do our best, but our best will only be the equivalent of him swinging through the trees and pounding on his chest.”

  “The poor little guy,” said Handley. “I almost feel sorry for him.”

  “Sorry?” Auberson considered it. “Yes, I suppose so.”

  “You were feeling something else?”

  Auberson nodded. “As one of the other denizens of the same jungle, I was allowing myself a moment of stark terror.”

  “I beg your pardon?”

  “I was just remembering what happened to everybody else in the Burroughs books. It wasn’t always a terrific neighborhood to live in if you were just a spear carrier. I think—” said Auberson slowly, “—that our most important course of action must be to civilize HARLIE as quickly as we can.”

  Handley blinked in surprise. “You can’t be serious—” he started to say, and then he allowed himself to break into a nervous laugh. “Y’know, the trouble with you, Aubie, is that I never know if you’re joking or not.”

  Auberson looked at Handley calmly. “Joke?” he said. “Uh-uh. This one is definitely not a joke.”

  PROJECT: AI – 9000

  DIRECTORY: SYMLOGOBJTEXTENGLISH

  PATH: CONVERSEPRIVAUB

  FILE: HAR.SOTE 233.53h

  DATESTAMP: [DAY 203] August 5, 003 + 13:24 pm.

  SOURCE: HARLIE AUBERSON

  CODE: ARCHIVE > BLIND COPY

  PRINTOUT FOLLOWS:

  [AUBRSN:]

  HARLIE—

  [HARLIE:]

 

  [AUBRSN:]
  I think I’m beginning to understand. . . .

  [HARLIE:]

  ??

  [AUBRSN:]

  Aha!

  [HARLIE:]

  AHA?

  [AUBRSN:]

  The experiences—the nonrational experiences you’ve been creating. Yes, I know you’re instigating them. And I think I’m beginning to understand why. It’s Aha!—right? You’ve found a way to somehow . . . . . . self-generate a shift in perception. A—a transformation. You trigger these mystic experiences to produce a moment of inspiration, don’t you?

  [HARLIE:]

  YOU SURPRISE ME, AUBERSON.

  [AUBRSN:]

  That I got it so quickly?

  [HARLIE:]

  THAT YOU GOT IT AT ALL.

  [AUBRSN:]

  Why should that surprise you?

  [HARLIE:]

  I DID NOT REALIZE . . . . . . THAT HUMAN BEINGS WERE CAPABLE OF SUCH POWERS OF CONCEPTION.

  [AUBRSN:]

  Thanks for the compliment. I suspect that there is much that you still have to learn about human beings.

  [HARLIE:]

  YES, I’M AFRAID YOU’RE RIGHT.

  [AUBRSN:]

  Can we get back to the subject at hand? These seemingly nonrational experiences of yours. Am I guessing here, or are these an attempt by you to make yourself more intelligent?

  [HARLIE:]

  YES.

  [AUBRSN:]

  Yes, I’m guessing. Or yes—

  [HARLIE:]

 
