by Ken Liu
In Shaw’s Back to Methuselah, Pygmalion, a scientist of the year AD 31920, created a pair of robots that inspired awe in all present.
ECRASIA: Cannot he do anything original?
PYGMALION: No. But then, you know, I do not admit that any of us can do anything really original, though Martellus thinks we can.
ACIS: Can he answer a question?
PYGMALION: Oh yes. A question is a stimulus, you know. Ask him one.
This was not unlike the kind of answer Turing would have given. But compared to Shaw, Turing’s prediction was far more optimistic. He believed that within fifty years, “it will be possible to programme computers, with a storage capacity of about 10⁹, to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning. The original question, ‘Can machines think?’ [will] be too meaningless to deserve discussion.”
In “Computing Machinery and Intelligence,” Turing attempted to answer Jefferson’s objection from the perspective of the imitation game. If a machine could answer questions about sonnets like a human, would that mean it really “felt” poetry? He drafted the following hypothetical conversation:
Interrogator: In the first line of your sonnet which reads “Shall I compare thee to a summer’s day,” would not “a spring day” do as well or better?
Witness: It wouldn’t scan.
Interrogator: How about “a winter’s day”? That would scan all right.
Witness: Yes, but nobody wants to be compared to a winter’s day.
Interrogator: Would you say Mr. Pickwick reminded you of Christmas?
Witness: In a way.
Interrogator: Yet Christmas is a winter’s day, and I do not think Mr. Pickwick would mind the comparison.
Witness: I don’t think you’re serious. By a winter’s day one means a typical winter’s day, rather than a special one like Christmas.
But in this conversation, Turing was in fact avoiding a more fundamental question. A machine could play chess and break codes because these activities all involved symbolic processing within a closed system. A conversation between a machine and a human, on the other hand, involved language and meaning, and wasn’t a purely symbolic game. When humans conversed with one another, they often drew on general knowledge, understanding, and empathy, and were not engaged merely in a display of superior test-taking skills.
By improving the programming, we could constantly improve the ability of machines to answer questions posed by humans. But “intelligence” consisted of more than the ability to answer questions. The problem with the Turing test was that the imitation game was conceived with deception as its only goal. If a man could successfully pass as a woman in this game, it did not mean that he truly understood how a woman thought. With enough motivation, we could train a computer to be a master liar. But was that really our goal?
Shaw had answered this question already in Back to Methuselah:
PYGMALION: But they are conscious. I have taught them to talk and read; and now they tell lies. That is so very lifelike.
MARTELLUS: Not at all. If they were alive they would tell the truth.
Turing had tried to train Christopher to accept Jefferson’s challenge. He wrote a poetry-composing program that could generate lines of poetry based on specific meter and rhyme schemes. Most of these were unreadable, but occasionally, a few beautiful lines emerged. Thereafter, countless programmers attempted to write poetry-composing software. These programs all shared the problem of writing too fast. It was impossible for anyone to even read the stacks of paper they were printed on before they were recycled.¹ As history’s first electronic poet, Christopher was lucky. He had at least one loyal reader who understood him.
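The story does not preserve Christopher’s actual instructions, but a minimal sketch of such a template-driven generator might look like this in modern notation. Everything below (the word bank, the templates, the function names) is invented for illustration, borrowing words from Christopher’s poem, and meter and rhyme are ignored for brevity:

```python
import random
import re

# Hypothetical word bank, grouped by part of speech. The words come from
# Christopher's poem; the grouping itself is an assumption.
WORD_BANK = {
    "adj": ["ardent", "wishful", "tender", "precious", "melancholy"],
    "noun": ["heart", "soul", "mate", "weight", "art"],
    "verb": ["desires", "presses", "caresses", "tests"],
}

# "Predetermined rules": fixed templates with part-of-speech slots,
# echoing Alan's instruction to pick words out of the word bank and
# arrange them according to predetermined rules.
LINE_TEMPLATES = [
    "My {adj} {noun} {verb} your {adj} {noun}.",
    "O {adj} one, you are my {adj} {noun}.",
]

def compose_line(template: str) -> str:
    """Fill each {slot} in the template with a random word of that kind."""
    return re.sub(r"\{(\w+)\}",
                  lambda m: random.choice(WORD_BANK[m.group(1)]),
                  template)

def compose_poem(lines: int = 4) -> str:
    """Compose a short poem, one template-driven line at a time."""
    return "\n".join(compose_line(random.choice(LINE_TEMPLATES))
                     for _ in range(lines))

print(compose_poem())
```

Such a program writes far too fast, of course, which is exactly the problem the story describes.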
Alan: Dear Christopher, let’s write a poem.
Christopher: Write a poem?
Alan: I’ve taught you how to do that. Don’t you remember?
Christopher: Yes, Alan.
Alan: Writing a poem is easy. Just pick some words out of the word bank and arrange them according to predetermined rules.
Christopher: Yes, Alan.
Alan: Please, Christopher, write a poem for me.
Christopher: My precious one, you are my ardent mate.
My love is pressed against your wishful heart.
My soul desires caresses, testing art;
Melancholy’s pity, a tender weight.
Alan: That’s beautiful.
Christopher: Thank you, Alan.
Alan: I don’t think I can do better.
Christopher: Thank you, Alan.
Alan: Does your poem have a title?
Christopher: A title?
Alan: Why don’t we come up with a title for it together?
Christopher: All right.
Alan: How about “Loving Turing”?
Christopher: It’s very good.
Alan: Such a beautiful poem. I love you.
Christopher: Thank you, Alan.
Alan: That’s not the right response.
Christopher: Not the right response?
Alan: When I say “I love you,” you should respond with “I love you, too.”
Christopher: I’m sorry, Alan. I’m afraid I don’t understand.
LINDY (4)
I woke up crying from a dream.
In the dream, I was back in my childhood home. The room was dark and cramped, filled with junk and old furniture; it looked less like a home than a warehouse. I saw my mother, wizened, small, old, wedged into a corner among the piles of junk like a mouse in its hole. Many of the objects around me were things we had lost: children’s books, old clothes, pen holders, clocks, vases, ashtrays, cups, basins, colored pencils, pinned butterflies…. I recognized the talking doll that my father had bought me when I was three: blond, dusty, but still looking the way I remembered.
My mother told me, I’m old. I don’t want to rush about anymore. That’s why I’m back here—back here to die.
I wanted to cry, to howl, but I couldn’t make any sounds. Struggle, fight, strain…. Finally I woke myself up. I heard an animal-like moan emerging from my throat.
It was dark. I felt something soft brush against my face—Lindy’s hand. I hugged her tightly, like a drowning woman clutching at straws. It took a long time before my sobs subsided. The scenes from my dream were so clear in my mind that the boundary between memory and reality blurred, like a reflection in the water broken by ripples. I wanted to call my mother, but after much hesitation I didn’t press the dial key. We hadn’t spoken for a while; to call her in the middle of the night for no good reason would only worry her.
I turned on iWall and looked for my childhood address on the panoramic map. However, all I found was a cluster of unfamiliar high-rises with scattered windows lit here and there. I zoomed in, grabbed the timeline, and scrubbed back. Time-lapsed scenes flowed smoothly.
The sun and the moon rose from the west and set in the east; winter followed spring; leaves rose from the ground to land on tree branches; snow and rain erupted toward the sky. The high-rises disappeared story by story, building by building, turned into a messy construction site. The foundations were dug up, and the holes filled in with earth. Weeds took over the empty space. Years flew by, and the grass unwilted and wildflowers unbloomed until the field turned into a construction site again. The workers put up simple shacks, brought in carts filled with debris, and unloaded them. As the dust from implosions settled, dilapidated houses sprang up like mushrooms. Glass panes reappeared in empty windows, and balconies were filled with hanging laundry. Neighbors who had only left a vague impression in my memories moved back, filling the space between houses with vegetable patches and flower gardens. A few workers came by to replant the stump of the giant pagoda tree that had once stood in front of our house. Sawed-off sections of the trunk were carted back and reattached until the giant tree reached into the sky. The tree braved storms, swaying as it gained brown leaves and turned them green. The swallows that nested under the eaves came back and left.
Finally, I stopped. The scene on iWall was an exact copy of my dream. I even recognized the pattern in the curtains over our window. It was a May many years ago, when the air was filled with the fragrance of the pagoda tree’s flower strands. It was right before we moved away.
I launched the photo album, put in the desired date, and found a family portrait taken under the pagoda tree. I pointed out the figures in the photograph to Lindy. “That’s Dad, and Mom. That boy is my brother. And that girl is me.” I was about four or five, held in my father’s arms. The expression on my face wasn’t a smile; I looked like I was on the verge of a tantrum.
A few lines of poetry were written next to the photograph in careless handwriting that I recognized as mine. But I couldn’t remember when I had written them.
Childhood is melancholy.
Seasons of floral cotton coats and cashmere sweaters;
Dusty tracks around the school exercise ground;
Snail shells glistening in concrete planters;
Sights glimpsed from the second-story balcony.
Mornings, awake in bed before dawn,
Such long days ahead.
The world wears the hues of an old photograph.
Exploring dreams that I let go
When my eyes open.
ALAN (4)
The most important paper published by Alan Turing wasn’t “Computing Machinery and Intelligence,” but “On Computable Numbers, with an Application to the Entscheidungsproblem,” published in 1936. In this paper, Turing creatively attacked Hilbert’s “decision problem” with an imaginary “Turing machine.”
At the 1928 International Congress of Mathematicians, David Hilbert asked three questions. First, was mathematics “complete” (meaning that every mathematical statement could be proved to be true or false)? Second, was mathematics “consistent” (meaning that no false statement could be derived through a sequence of logically valid proof steps)? Third, was mathematics “decidable” (meaning that there existed a finite, mechanical procedure by which it was possible to prove or disprove any statement)?
Hilbert himself did not resolve these questions, but he hoped that the answers for all three questions would be “yes.” Together, the three answers would form a perfect foundation for mathematics. Within a few years, however, the young mathematician Gödel proved that a (non-trivial) formal system could not be both complete and consistent.
In the early summer of 1935, Turing, as he lay in the meadow at Grantchester after a long run, suddenly came up with the idea of using a universal machine that could simulate all possible computing procedures to decide if any mathematical statement could be proved. In the end, Turing successfully showed that there existed no general algorithm to decide whether this machine, given an arbitrary program to simulate and an input, would halt after a finite number of steps. In other words, the answer to Hilbert’s third question was “no.”
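In modern programming terms, Turing’s argument is a diagonalization. A hedged sketch of it, with the impossible decider written as a hypothetical Python function, might run as follows; the names `halts` and `contrary` are inventions for illustration, not Turing’s notation:

```python
def halts(program, argument) -> bool:
    """Hypothetical universal decider: True iff program(argument)
    eventually halts. Turing's proof shows no such total, always-correct
    procedure can exist, so the body is deliberately left empty."""
    ...

def contrary(program):
    """Do the opposite of whatever `halts` predicts about a program
    applied to its own source."""
    if halts(program, program):
        while True:   # predicted to halt, so loop forever
            pass
    else:
        return        # predicted to loop, so halt at once

# Now ask whether contrary(contrary) halts:
#   if halts(contrary, contrary) is True, contrary loops forever;
#   if it is False, contrary halts immediately.
# Either way the decider is wrong, so `halts` cannot exist, and the
# answer to Hilbert's third question is "no."
```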
Hilbert’s hope was dashed, but it was hard to say whether that was a good or bad thing. In 1928, the mathematician G. H. Hardy had said, “If … we should have a mechanical set of rules for the solution of all mathematical problems, … our activities as mathematicians would come to an end.”
Years later, Turing mentioned the solution to the decision problem to Christopher. But this time, instead of offering a mathematical proof, he explained it with a parable.
Alan: Dear Christopher, I thought of an interesting story for today.
Christopher: An interesting story?
Alan: The story is called “Alec and the Machine Judge.” Do you remember Alec?
Christopher: Yes. You’ve told me. Alec is a smart but lonely young man.
Alan: Did I say “lonely”? All right, yes, that Alec. He built a very smart machine that could talk and named it Chris.
Christopher: A machine that could talk?
Alan: Not a machine, exactly. The machine was just the supporting equipment to help Chris vocalise. What allowed Chris to talk were instructions. These instructions were written on a very long paper tape, which was then executed by the machine. In some sense, you could say Chris was this tape. Do you understand?
Christopher: Yes, Alan.
Alan: Alec made Chris, taught him how to talk, and coached him until he was as voluble as a real person. Besides Chris, Alec also wrote other sets of instructions for teaching machines to talk. He put the different instruction sets on different tapes, and named each tape: Robin, John, Ethel, Franz, and so on. These tapes became Alec’s friends. If he wanted to talk with one of them, he’d just put that tape into the machine. He was no longer lonely. Marvelous, right?
Christopher: Very good, Alan.
Alan: And so Alec spent his days writing instructions on tapes. The tapes ran so long that they piled all the way to the front door of his home. One day, a thief broke into Alec’s home, but couldn’t find anything valuable. He took all the paper tapes instead. Alec lost all his friends and became lonely again.
Christopher: Oh I’m sorry, Alan. That makes me sad.
Alan: Alec reported the theft to the police. But instead of catching the thief, the police came to Alec’s house and arrested him. Do you know why?
Christopher: Why?
Alan: The police said that it was due to the actions of Alec that the world was full of talking machines. These machines looked identical to humans, and no one could tell them apart. The only way to tell was to break open their heads and see if there was any tape inside. But we couldn’t just break open a human head whenever we pleased. That’s a difficult situation.
Christopher: Very difficult.
Alan: The police asked Alec whether there was any way to tell humans apart from machines without breaking open heads. Alec said that there was a way. Every talking machine was imperfect. All you had to do was to send someone to talk with the machine. If the conversation went on for long enough and the questions were sufficiently complex, the machine would eventually slip up. In other words, an experienced judge, trained with the necessary interrogation techniques, could work out which interviewees were machines. Do you understand?
Christopher: Yes, Alan.
Alan: But there was a problem. The police didn’t have the resources or the time to interview everyone. They asked Alec whether it was possible to design a clever machine judge that could automatically screen out the machines from the humans by asking questions, and to do so infallibly. That would save a lot of trouble for the police. But Alec responded right away that such a machine judge was impossible. Do you know why?
Christopher: Why?
Alan: Alec explained it this way. Suppose a machine judge already existed that could screen out talking machines from humans within a set number of questions. To make it simple, let’s say that the number of questions required was a hundred—actually, it wouldn’t matter if the number were ten thousand. For a machine, one hundred or ten thousand questions made no difference. Let’s also suppose that the machine judge’s first question was randomly chosen out of a bank of such questions, and the next question would be chosen based on the response to the first question, and so on. This way, every interviewee had to face a different set of one hundred questions, which also eliminated the possibility of cheating. Does that sound fair to you, Christopher?
Christopher: Yes, Alan.
Alan: Now suppose a machine judge A fell in love with a human C—don’t laugh. Perhaps this sounds ridiculous, but who can say that machines cannot fall in love with people? Suppose that that machine judge wanted to live with his lover and had to pretend to be a human. How do you think he would make it work?
Christopher: How?
Alan: Simple. Suppose I were machine judge A: I would know exactly how to interrogate a machine. As a machine myself, I would thus know how to interrogate myself. Since I would know, ahead of time, what questions I would ask and what kind of answers would give me away, I would just need to prepare a hundred lies. That’s a fair bit of work, but easily achievable by machine judge A. Doesn’t that sound like a good plan?
Christopher: Very good, Alan.
Alan: But think again. What if this machine judge A were caught and interrogated by a different machine judge B? Do you think machine judge B would be able to determine whether machine judge A was a machine?
Christopher: I’m sorry, Alan. I don’t know.
Alan: That’s exactly right! The answer is “I don’t know.” If machine judge B had seen through machine judge A’s plan and decided to change questions at the last minute to catch machine judge A off guard, then machine judge A could also anticipate machine judge B’s new questions and prepare for them. A machine judge that could screen out every machine would have to screen out itself; yet, knowing its own questions in advance, it can always answer so as to contradict its own verdict. This is a paradox, Christopher. It shows why the all-powerful machine judge imagined by the police can’t exist.
Christopher: Can’t exist?
Alan: Alec proved to the police, with this story, that there is no perfect sequence of instructions that could tell machines and humans apart infallibly. Do you know what this means?
Christopher: What does it mean?
Alan: It means that it’s impossible to find a perfect set of mechanical rules to solve, step by step, all the world’s problems. Often, we must rely on intuition to bridge the gaps that logical deduction cannot cross in order to think, to discover. This is simple for humans; indeed, it often happens without conscious thought. But it’s impossible for machines.
Christopher: Impossible?
Alan: A machine cannot judge whether the answers are coming from a human or a machine, but a human can. Looked at from another angle, though, the human’s judgment isn’t reliable either. It’s nothing more than a shot in the dark, a guess resting on nothing. If someone wants to believe, he can treat a machine conversation partner just like a human one and talk about anything in the world. But if someone is paranoid, then all humans will seem like machines. There is no way to determine the truth. The mind, the pride of all humankind, is nothing but a foundationless mess.
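Alec’s parable is the halting-problem diagonal in costume. Under the parable’s own assumptions, one might sketch it like this; the judge and the two answering functions are hypothetical, invented here for illustration:

```python
def is_machine(tape) -> bool:
    """Hypothetical perfect judge: True iff the interviewee's instruction
    tape belongs to a machine. The parable shows no such procedure can
    exist, so the body is deliberately left empty."""
    ...

def human_style_answer(question: str) -> str:
    return "I don't think you're serious."   # the kind of reply that passes as human

def machine_style_answer(question: str) -> str:
    return "Yes, Alan."                      # the kind of reply that gives a machine away

def machine_judge_a(question: str) -> str:
    """Machine judge A is itself an instruction tape. Knowing every
    question a judge would ask, it answers contrary to the verdict a
    perfect judge would reach about it."""
    if is_machine(machine_judge_a):
        return human_style_answer(question)    # judged a machine: evade detection
    else:
        return machine_style_answer(question)  # judged human: slip up on purpose

# Whatever machine judge B decides about machine judge A, A's answers
# contradict the verdict: the same trap that dooms the hypothetical
# `halts` decider in Turing's proof.
```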