



  Prodigy

  (Isaac Asimov's Robot City - 4)

  Arthur Byron Cover


  The Sense of Humor

  by Isaac Asimov

  Would a robot feel a yearning to be human?

  You might answer that question with a counter-question. Does a Chevrolet feel a yearning to be a Cadillac?

  The counter-question makes the unstated comment that a machine has no yearnings.

  But the very point is that a robot is not quite a machine, at least in potentiality. A robot is a machine that is made as much like a human being as it is possible to make it, and somewhere there may be a boundary line that may be crossed.

  We can apply this to life. An earthworm doesn't yearn to be a snake; a hippopotamus doesn't yearn to be an elephant. We have no reason to think such creatures are self-conscious and dream of something more than they are. Chimpanzees and gorillas seem to be self-aware, but we have no reason to think that they yearn to be human.

  A human being, however, dreams of an afterlife and yearns to become one of the angels. Somewhere, life crossed a boundary line. At some point a species arose that was not only aware of itself but had the capacity to be dissatisfied with itself.

  Perhaps a similar boundary line will someday be crossed in the construction of robots.

  But if we grant that a robot might someday aspire to humanity, in what way would he so aspire? He might aspire to the possession of the legal and social status that human beings are born to. That was the theme of my story "The Bicentennial Man" (1976), and in his pursuit of such status, my robot-hero was willing to give up all his robotic qualities, one by one, right down to his immortality.

  That story, however, was more philosophical than realistic. What is there about a human being that a robot might properly envy-what human physical or mental characteristic? No sensible robot would envy human fragility, or human incapacity to withstand mild changes in the environment, or human need for sleep, or aptitude for the trivial mistake, or tendency to infectious and degenerative disease, or incapacitation through illogical storms of emotion.

  He might, more properly, envy the human capacity for friendship and love, his wide-ranging curiosity, his eagerness for experience. I would like to suggest, though, that a robot who yearned for humanity might well find that what he would most want to understand, and most frustratingly fail to understand, would be the human sense of humor.

  The sense of humor is by no means universal among human beings, though it does cut across all cultures. I have known many people who didn't laugh, but who looked at you in puzzlement or perhaps disdain if you tried to be funny. I need go no further than my father, who routinely shrugged off my cleverest sallies as unworthy of the attention of a serious man. (Fortunately, my mother laughed at all my jokes, and most uninhibitedly, or I might have grown up emotionally stunted.)

  The curious thing about the sense of humor, however, is that, as far as I have observed, no human being will admit to its lack. People might admit they hate dogs and dislike children, they might cheerfully own up to cheating on their income tax or on their marital partner as a matter of right, and might not object to being considered inhumane or dishonest, through the simple expediency of switching adjectives and calling themselves realistic or businesslike.

  However, accuse them of lacking a sense of humor and they will deny it hotly every time, no matter how openly and how often they display such a lack. My father, for instance, always maintained that he had a keen sense of humor and would prove it as soon as he heard a joke worth laughing at (though he never did, in my experience). Why, then, do people object to being accused of humorlessness? My theory is that people recognize (subliminally, if not openly) that a sense of humor is typically human, more so than any other characteristic, and refuse demotion to subhumanity.

  Only once did I take up the matter of a sense of humor in a science-fiction story, and that was in my story "Jokester," which first appeared in the December, 1956 issue of Infinity Science Fiction and which was most recently reprinted in my collection The Best Science Fiction of Isaac Asimov (Doubleday, 1986).

  The protagonist of the story spent his time telling jokes to a computer (I quoted six of them in the course of the story). A computer, of course, is an immobile robot; or, which is the same thing, a robot is a mobile computer; so the story deals with robots and jokes. Unfortunately, the problem in the story for which a solution was sought was not the nature of humor, but the source of all the jokes one hears. And there is an answer, too, but you'll have to read the story for that.

  However, I don't just write science fiction. I write whatever it falls into my busy little head to write, and (by some undeserved stroke of good fortune) my various publishers are under the weird impression that it is illegal not to publish any manuscript I hand them. (You can be sure that I never disabuse them of this ridiculous notion.)

  Thus, when I decided to write a joke book, I did, and Houghton-Mifflin published it in 1971 under the title of Isaac Asimov's Treasury of Humor. In it, I told 640 jokes that I happened to have as part of my memorized repertoire. (I also have enough for a sequel to be entitled Isaac Asimov Laughs Again, but I can't seem to get around to writing it no matter how long I sit at the keyboard and how quickly I manipulate the keys.) I interspersed those jokes with my own theories concerning what is funny and how one makes what is funny even funnier.

  Mind you, there are as many different theories of humor as there are people who write on the subject, and no two theories are alike. Some are, of course, much stupider than others, and I felt no embarrassment whatever in adding my own thoughts on the subject to the general mountain of commentary.

  It is my feeling, to put it as succinctly as possible, that the one necessary ingredient in every successful joke is a sudden alteration in point of view. The more radical the alteration, the more suddenly it is demanded, the more quickly it is seen, the louder the laugh and the greater the joy.

  Let me give you an example with a joke that is one of the few I made up myself:

  Jim comes into a bar and finds his best friend, Bill, at a corner table gravely nursing a glass of beer and wearing a look of solemnity on his face. Jim sits down at the table and says sympathetically, "What's the matter, Bill?"

  Bill sighs, and says, "My wife ran off yesterday with my best friend."

  Jim says, in a shocked voice, "What are you talking about, Bill? I'm your best friend."

  To which Bill answers softly, "Not anymore."

  I trust you see the change in point of view. The natural supposition is that poor Bill is sunk in gloom over a tragic loss. It is only with the last three words that you realize, quite suddenly, that he is, in actual fact, delighted. And the average human male is sufficiently ambivalent about his wife (however beloved she might be) to greet this particular change in point of view with delight of his own.

  Now, if a robot is designed to have a brain that responds to logic only (and of what use would any other kind of robot brain be to humans who are hoping to employ robots for their own purposes?), a sudden change in point of view would be hard to achieve. It would imply that the rules of logic were wrong in the first place or were capable of a flexibility that they obviously don't have. In addition, it would be dangerous to build ambivalence into a robot brain. What we want from him is decision and not the to-be-or-not-to-be of a Hamlet.

  Imagine, then, telling a robot the joke I have just given you, and imagine the robot staring at you solemnly after you are done, and questioning you, thus:

  Robot: "But why is Jim no longer Bill's best friend? You have not described Jim as doing anything that would cause Bill to be angry with him
or disappointed in him."

  You: "Well, no, it's not that Jim has done anything. It's that someone else has done something for Bill that was so wonderful, that he has been promoted over Jim's head and has instantly become Bill's new best friend."

  Robot: "But who has done this?"

  You: "The man who ran away with Bill's wife, of course."

  Robot (after a thoughtful pause): "But that can't be so. Bill must have felt profound affection for his wife and a great sadness over her loss. Is that not how human males feel about their wives, and how they would react to their loss?"

  You: "In theory, yes. However, it turns out that Bill strongly disliked his wife and was glad someone had run off with her."

  Robot (after another thoughtful pause): "But you did not say that was so."

  You: "I know. That's what makes it funny. I led you in one direction and then suddenly let you know that was the wrong direction."

  Robot: "Is it funny to mislead a person?"

  You (giving up): "Well, let's get on with building this house."

  In fact, some jokes actually depend on the illogical responses of human beings. Consider this one:

  The inveterate horseplayer paused before taking his place at the betting windows, and offered up a fervent prayer to his Maker.

  "Blessed lord," he murmured with mountain-moving sincerity, "I know you don't approve of my gambling, but just this once, Lord, just this once, please let me break even. I need the money so badly."

  If you were so foolish as to tell this joke to a robot, he would immediately say, "But to break even means that he would leave the races with precisely the amount of money he had when he entered. Isn't that so?"

  "Yes, that's so."

  "Then if he needs the money so badly, all he need do is not bet at all, and it would be just as though he had broken even."

  "Yes, but he has this unreasoning need to gamble."

  "You mean even if he loses."

  "Yes."

  "But that makes no sense."

  "But the point of the joke is that the gambler doesn't understand this."

  "You mean it's funny if a person lacks any sense of logic and is possessed of not even the simplest understanding?"

  And what can you do but turn back to building the house again?

  But tell me, is this so different from dealing with the ordinary humorless human being? I once told my father this joke:

  Mrs. Jones, the landlady, woke up in the middle of the night because there were strange noises outside her door. She looked out, and there was Robinson, one of her boarders, forcing a frightened horse up the stairs.

  She shrieked, "What are you doing, Mr. Robinson?"

  He said, "Putting the horse in the bathroom."

  "For goodness sake, why?"

  "Well, old Higginbotham is such a wise guy. Whatever I tell him, he answers, 'I know. I know,' in such a superior way. Well, in the morning, he'll go to the bathroom and he'll come out yelling, 'There's a horse in the bathroom.' And I'll yawn and say, 'I know, I know."'

  And what was my father's response? He said, "Isaac, Isaac. You're a city boy, so you don't understand. You can't push a horse up the stairs if he doesn't want to go."

  Personally, I thought that was funnier than the joke.

  Anyway, I don't see why we should particularly want a robot to have a sense of humor, but the point is that the robot himself might want to have one-and how do we give it to him?

  Chapter 1. Can You Feel Anything When I Do This?

  "Mandelbrot, what does it feel like to be a robot?"

  "Forgive me, Master Derec, but that question is meaningless. While it is certainly true that robots can be said to experience sensations vaguely analogous to specified human emotions in some respects, we lack feelings in the accepted sense of the word."

  "Sorry, old buddy, but I can't help getting the hunch that you're just equivocating with me."

  "That would be impossible. The very foundations of positronic programming insist that robots invariably state the facts explicitly."

  "Come, come, don't you concede it's possible that the differences between human and robotic perception may be, by and large, semantic? You agree, don't you, that many human emotions are simply the by-products of chemical reactions that ultimately affect the mind, influencing moods and perceptions. You must admit, humans are nothing if not at the mercy of their bodies. "

  "That much has been proven, at least to the satisfaction of respected authorities. "

  "Then, by analogy, your own sensations are merely byproducts of smoothly running circuitry and engine joints. A spaceship may feel the same way when, its various parts all working at peak efficiency, it breaks into hyperspace. The only difference between you and it being, I suppose, that you have a mind to perceive it."

  Mandelbrot paused, his integrals preoccupied with sorting Derec's perspectives on these matters into several categories in his memory circuits. "I have never quite analyzed the problem that way before, Master Derec. But it seems that in many respects the comparison between human and robot, robot and spaceship must be exceedingly apt."

  "Let's look at it this way, Mandelbrot. As a human, I am a carbon-based life-form, the superior result of eons of evolution of inferior biological life-forms. I know what it feels like because I have a mind to perceive the gulf between man and other species of animal life. And with careful, selective comparison, I can imagine-however minimally-what a lower life-form might experience as it makes its way through the day. Furthermore, I can communicate to others what I think it feels like."

  "My logic circuits can accept this.”

  "Okay then, through analogy or metaphor or through a story I can explain to others what a worm, or a rat, or a cat, or even a dinosaur must feel as they hunt meat, go to sleep, sniff flowers, or whatever."

  "I have never seen one of these creatures and certainly wouldn't presume to comprehend what it must be like to be one."

  "Ah! But you would know-through proper analogy-what it must be like to be a spaceship."

  "Possibly, but I have not been provided with the necessary programming to retrieve the information. Furthermore, I cannot see how such knowledge could possibly help me fulfill the behavioral standards implicit in the Three Laws."

  "But you have been programmed to retrieve such information, and your body often reacts accordingly, and sometimes adversely, with regards to your perceptions."'

  "You are speaking theoretically?”

  “Yes."

  "Are you formally presenting me with a problem?"

  "Yes."

  "Naturally I shall do my best to please you, Master Derec, but my curiosity and logic integrals are only equipped to deal with certain kinds of problems. The one you appear to be presenting may be too subjective for my programmed potentials. "

  "Isn't all logic abstract, and hence somewhat subjective, at least in approach? You must agree that, through mutually agreed upon paths of logic, you can use the certain knowledge of two irrefutable facts to learn a third, equally irrefutable fact."

  "Of course."

  "Then can't you use such logic to reason how it might feel to be a spaceship, or any other piece of sufficiently advanced machinery?"

  "Since you phrase it in that manner, of course, but I fail to comprehend what benefit such an endeavor may bring me-or you."

  Derec shrugged. It was night in Robot City. He and Mandelbrot had been out walking. He had felt the need to stretch his muscles after a long day spent studying some of the problems complicating his escape from this isolated planet. But at the moment they were sitting atop a rectangular tower and staring at the stars. "Oh, I don't know if it would be of any benefit, except perhaps to satisfy my curiosity. It just seems to me that you must have some idea of what it is like to be a robot, even if you don't have the means to express it."

  "Such knowledge would require language, and such a language has not yet been invented."

  "Hmmm. I suppose."

  "However, I have j
ust made an association that may be of some value."

  "What's that?"

  "Whenever you or Mistress Ariel have had no need of my assistance, I have been engaging in communication with the robots of this city. They haven't been wondering what it means or feels like to be a robot, but they have been devoting a tremendous amount of spare mental energy to the dilemma of what it must be like to be a human."

  “Yes, that makes sense, after a fashion. The robots' goal of determining the Laws of Humanics has struck me as a unique phenomenon."

  "Perhaps it is not, Master Derec. After all, if I may remind you, you recall only your experiences of the last few weeks, and my knowledge of history is rather limited in scope. Even so, I never would have thought of making connections the way you have, which leads my circuits to conclude your subconscious is directing our conversation so that it has some bearing on your greater problems."

  Derec laughed uncomfortably. He hadn't considered it before. Strange, he thought, that a robot had. "My subconscious? Perhaps. I suppose I feel that if I better understand the world I'm in, I might better understand myself."

  "I believe I am acting in accordance with the Three Laws if I help a human know himself better. For that reason, my circuits are currently humming with a sensation you might recognize as pleasure."

  "That's nice. Now if you'll excuse me, I'd like to be alone right now." For a moment Derec felt a vague twinge of anxiety, and he actually feared that he might be insulting

  Mandelbrot, a robot that, after all they'd been through together, he couldn't help but regard as his good friend.

  But if Mandelbrot had taken umbrage, he showed no evidence of it. He was, as always, inscrutable. "Of course. I shall wait in the lobby."

  Derec watched as Mandelbrot walked to the lift and slowly descended. Of course Mandelbrot hadn't taken umbrage. It was impossible for him to be insulted.

  Crossing his legs to be more comfortable, Derec returned to looking at the stars and the cityscape spread out below and beyond, but his thoughts remained inward. Normally he was not the reflective type, but tonight he felt moody, and gave in easily to the anxiousness and insecurity he normally held in check while trying to deal with his various predicaments more logically.

 
