But even off the track they found a lot in common. They both liked to hoist a cold one at the Avalanche after school. They both enjoyed the threedies, crude as they were in those days. They both had the same taste in women, which led to their one serious quarrel in college.
Jared was short and dark, and Kyle was tall and fair. They differed on any number of topics, and many were the debates that made short the night—first in the dorm; later in the commune, which is where I met them and became their referee. In three of four pro sports, they rooted for different teams.
But they did not insist on mutual agreement in all matters. If a friendship cannot withstand a difference of opinion, it is not a friendship at all.
Kyle majored in computer science, which is the 100-meter dash of contemporary technology, while Jared majored in philosophy, surely a marathon among human endeavors. Kyle used to twit Jared about entering a field of no practical use, but Jared told him that while good computer science might stop our machines from making mistakes, good philosophy might stop people from making them.
Then came graduation, and they went their separate ways—Jared to graduate school and Kyle to his parents’ garage, where he set up a software company in the usual fashion. They promised to stay in touch, and for the most part they did.
Later, one of them decided to live forever.
TURISTS
The European Philosophical Association held its meeting in Vienna one summer, ten years after graduation, and Jared was a featured speaker. Even at that point in his career, it was inconceivable to hold such a conference and not invite the enfant terrible of metaphysics. And it so happened that there was enough overlap between analytical philosophy and mathematical logic that I had been invited to speak. A few texts and tweets discovered that Kyle was in Leipzig that week on business, and a few more arranged a get-together for the three of us. Kyle took the Bullet to Vienna and then a cab to the University and was waiting for us in the Stadtpark outside the campus when the conference broke up.
Jared had made reservations at the Palmenhaus, and it was such a brilliant afternoon that we walked to the Burggarten from the University. That section of the Ring features brightly flowered parks, outsized statues of half-forgotten heroes, and immense buildings in the ponderous Hapsburg style. Amidst these great stone giants, the Butterfly House was a glittering confection of iron and glass. We strolled down the barrel-vaulted greenhouse, through a rain forest, under waterfalls, past hollow trees, surrounded all the while by a kaleidoscope of tropical butterflies. The Palmenhaus itself had been at one time the Imperial-and-Royal orangery, and the long summer day flowed through the glass walls and ceiling and coated everything in the afternoon light.
Talk ran high. We hashed over Old College Days and brought one another up to speed on life since then. That was mostly for my benefit. Kyle and Jared had stayed in touch, and I was the third man in a duet.
Kyle told me about his latest app; Jared tried to explain something or other metaphysical; and I learned that his wife, Gladdys, was even then roaming the Kärntner Strasse, credit card in hand, and would join us in time for after-dinner drinks.
“Which reminds me,” Kyle said to me. “How’s your sister?”
I should explain that my sister Maddy had been the source of that one true quarrel the two friends had shared in college; and Kyle’s question was a none-too-subtle reminder to Jared that Gladdys had been his second choice.
Jared ignored the jibe and asked about my paper at the conference, so I explained as much about paired adjoint functors as dinner conversation could handle. One map runs from problem to solution and the other from solution to problem. The solving function finds the best solution to a given problem, and the posing function finds the biggest problem that a given solution solves. So if a set of things can be described by a collection of properties, then by Stone-Čech compactification a finite collection would suffice, which meant you could test for those properties in finite time.
Kyle sat with his hands pressed together as if in prayer, tapping his teeth with them. Then he pulled a card from his wallet and handed it to me. “Get in touch with me later, Mac. I’d like you to go over that with some of my people, both barrels. I don’t know that it can solve my problem, but I trust my intuitions.”
Dessert had arrived by then. “What problem is that?” Jared asked over a forkful of Malakoff torte.
Kyle grimaced. “Oh, reason I was in Leipzig. I tried to hold a meeting with a client over the Net using a simulacrum, but he twigged to the gig and I had to fly over personally to smooth feathers.”
“I can see where he might be ruffled,” said Jared. “You disrespected him, sending a sim to do a man’s job.”
Kyle flipped a hand. “Yeah, so I’m telling him now it was a marketing gimmick for our next-gen simulations. It did take him fifteen minutes before he figured it out.”
I cut a slice of Alsatian munster off the cheese board and ate it with a soda cracker. “You mean the Turing test? Fifteen minutes to realize he’s talking to a machine? Not bad.”
“Yeah, but he shouldn’t have realized it at all. My problem—aside from selling him a new simulator app for his auto assembly design team—is to figure out how he figured it out. I mean, the in-house alpha tests were almost perfect. So why the fail on the first beta?”
Jared smiled briefly over his dessert. “I can tell you why.”
Kyle was drinking tawny port and savored it before setting the snifter down. “Okay. This ought to be good. I’ve had people going over the architecture, the output screens, the response matrices…And you know philosophically why the alphas worked and the beta failed.”
“Well, yes,” said Jared. “Your staff thought the sim was acting human because…that’s the way they act.”
I couldn’t help the laugh, and Kyle gave me an unhappy glance. “You never did have a great sense of humor, Mac.”
Jared was very pleased with his jest, and wiped his lips with his napkin. But as he lifted his port he grew more serious. “It’s human nature, Kyle. No utterance is self-explaining. The listener always hears things through the filter of his own preconceptions. You and your internal testers expected the simulation to work, so you ‘heard’ the simulated responses as human-like. Your customer in Leipzig had no such expectations, and something struck him as false.”
“Yeah, but what?” Kyle leaned forward and put both hands on the table. He said, pronouncing each word carefully, “The cat sat on the mat.”
Jared blinked. “What?”
“You said the listener always brings his own expectations to the—what’d you say?—the utterance. So what do you get out of that utterance? ‘The cat sat on the mat.’ Go ahead.” He sat back and crossed his arms.
Jared frowned for a moment, then looked to the glass ceiling of the restaurant and mused: “Perhaps it was for counterfeiting…”
Now it was Kyle’s turn. “What?”
“The mat,” said Jared. “Mats are plates used in flexographic printing. But why would the dude hide them and not give them up…?”
“The dude?”
“Yes, the ‘cat.’ Don’t hear that slang much since the ’70s, except among jazzmen, so maybe he was a musician in a jazz band, or an old hippie. But what sort of flexible printing plates would you ‘sit on’ for someone?”
“Counterfeit plates,” said Kyle slowly.
“Well, that was just a guess. But you see my point? The meaning of a text depends on the con-text, and that includes the listener’s expectations. Among hippies, ‘cat’ meant one thing; among boatmen, it would mean the tackle used to hoist an anchor to the cathead.”
Kyle picked up his glass and swirled the port around. Then he sighed and drank. “Jared, I hate to say this, but…You may be onto something.” He set the snifter down and fell silent as his gaze lit on a small white butterfly that had escaped the Butterfly House. Then, he leaned across the table, taking us into his confidence.
“I’m going to do it, guys. I’m going to create the first artificial intelligence.”
Jared grinned. “I’m still looking for the natural kind.”
But Kyle was in earnest. “Don’t laugh, Jared. Children will study about me in their schoolbooks.”
“A helluva thing to do to the poor kids,” I suggested.
Jared pursed his lips and looked distant. “I don’t know. The airlines use flight simulators and you can get inside one and it’s just like piloting a 787 from New York to LA, but with one crucial difference.”
“What’s that?” Kyle asked.
“When you get out of the simulator, you won’t actually be in LA.”
Kyle thought for a while, finished his port, then leaned forward again. “Tell you what. Why don’t you come out and see me in St. Louis. You too, Mac. We’ll do a Turing test on my best sim, and you can tell me what strikes you as false.”
That was how it all started. It seemed innocent enough at the time.
We had just agreed on a date when Gladdys found our table. “Hello, boys,” she said and laid a stack of packages beside the fourth chair. “You can run, but you can’t hide.” She gave Jared a peck on the cheek and turned to me with that look of vague recognition that announced that she knew she knew me but could not recall my name.
“Mac,” I said. “John MacKenzie. The math whiz,” I added.
“Oh! Yes.” She pointed a finger at me. “Maddy’s brother. How are you?” But she did not wait for an answer before turning to Kyle, who had risen from his chair. “Kyle!” Then, in a stage whisper: “We’ve got to stop meeting like this. I think Jared is beginning to suspect!” Kyle grabbed her around the waist and swung her horizontal in a parody of an old romance movie.
“Don’t worry about Jared,” Kyle said. “I’ll distract him. Jared, look, a butterfly!”
He returned Gladdys to vertical and, laughing, she took the fourth seat. The waiter swooped in and she ordered a glass of sherry. “So, where’s Denise?” she asked when all was settled. “She couldn’t come?”
Kyle shrugged and tossed his head. “She and I aren’t together anymore.”
While Gladdys expressed sympathy, Jared leaned toward me and whispered. “He was always good at the hundred-meter dash.”
ORIGINAL SIM
Two weeks later we visited Kyle in St. Louis. He was still Vaporetti back then—vapors, cloud computing, get it?—and still pushing the eccentricity of “Silicon Prairie.” I gave the seminar on adjoint functors to his staff. I think two of them understood it and one of them may have eventually made something of it.
Kyle wanted to apply the theorem to the frame problem. After each action, the AI has to update its “inventory” of what the world is like. But how does it know which items to update? The “common sense law of inertia,” also known as the “let sleeping dogs lie” strategy, is for the system to ignore all states unaffected by the action. The problem is: How many non-effects does an action have? Using the Harris-MacKenzie Theorem, that infinite set might be compactified in practice to a finite set, thus reducing response times.
The day after the seminar, Jared showed up and Kyle steered us into a conference room where two wide-screens perched on a boardroom table surrounded by comfortable black leather high-backed chairs. Each monitor displayed a human figure against an office cubicle backdrop. Both seemed very busy at something, but glanced out of the screen from time to time as if waiting for the session to begin.
“Behold, Adam and Bob!” Kyle announced with a sweep of his arm. “Those are code names,” he added as an aside, and Jared said with mock incredulity, “No, really?”
“Why two?” I asked.
Kyle perched himself on the edge of the board table. “Simple. Since you guys already know you’re here to do a Turing test, you might bring—what’d you call ’em, Jared?—‘filters of your own preconceptions’ to the way you perceive them. So, I decided to present you with one sim and one real dude. ‘Your mission, if you decide to accept it,’ is to tell me which is which.”
That seemed fair enough. Jared said I could go first and Kyle asked him to go next door to the break room so he would not be influenced by my session. Jared clapped me on the shoulder as he left and said, “Go get ’em, tiger.”
Kyle would not think he had a foolproof sim if the image itself was an obvious animation; and indeed both Adam and Bob seemed realistic. Adam’s office had no obviously personal décor—no children’s drawings, no award certificates, no vacation threedies pinned to the wall—but that proved nothing. If Adam were on staff at Vaporetti, he might not have a life.
I said Hi to Adam and he asked how I was doing, and I said fine, and things went on from there. Because the sim was a prototype designed to discuss software design needs with clients, Kyle had asked us to confine our discussions to that area and the normal chit-chat such a call might entail. That seemed reasonable. So I asked Adam about his family, and he responded that his mother was under the weather. Any kids? No, he said, and I thought that might explain the absence of cubicle art. I asked him what project he was working on and he told me and I asked him how far along he was and he said eighty-five percent by the PERT, and so it went. After ten minutes of this, I thanked him, and he said I was welcome, and I turned to Bob.
Bob’s office did contain personal items: a certificate hanging on the wall behind him, a child’s crayon drawing, a trophy on the bookshelf. We went through the same pleasantries and I learned that his daughter Carolyn wanted to be a dancer. Then I asked him the same project questions I had asked Adam.
When I had finished, I decided that Bob was the human because his background seemed more fleshed out and because I thought I had detected a slippage between lip movement and voice on Adam. I wrote my choice on a piece of paper and folded it.
Then it was Jared’s turn. Kyle fetched him from the break room and he took the seat in front of Adam’s monitor and looked into the screen camera.
And said nothing.
The silence stretched on, and I began to wonder if Jared were at a loss for the right questions to ask. But after a few minutes, he said, “Hey, how about those Cards?”
Adam frowned and said, “What do you mean?”
Jared nodded and, without saying good-bye, rolled his chair over to face Bob. He repeated the same procedure: a long silence, and a question about the local baseball team.
“They’re both sims,” Jared announced when he had finished and spun his chair around. Kyle’s sour look was sufficient confirmation. I quietly slipped my own choice into my jacket pocket, hoping no one would ask to see it.
Jared said, “You thought that once I had decided one of them was a sim, I would automatically accept the other as human, and you would have your Aha! moment.”
“You suspected right off.”
Jared shrugged. “I suspected before I came here. I knew you would try to stack the deck. I didn’t know just how.”
Kyle flushed and looked away for a moment. “All right, you got me. Mea culpa. But what was that business sitting there like a bump?”
Jared just looked at him. After a while, Kyle said, “Well?” in a very impatient tone.
“I was waiting to see if your sim would do what you just did. Humans grow impatient. Computers simply idle.”
Kyle closed his eyes and let his breath out. “Okay. You’re right, as usual. We can work up some subroutines for that. Throw in some random number generators…” He did not make notes, but I thought an audio pick-up might be recording him. “I notice you didn’t say ‘the cat sat on the mat.’ I was ready for that one. But…”
“How about those Cards?” Jared chuckled. “I wondered if your sims might not be able to deal with equivocation—where a word like ‘card’ might have more than one sense…”
“I know what equivocation is,” Kyle said with a flash of irritation.
“Based on syntax, it might have been a question about baseball or a question about poker; so your sims were cued to say they ‘didn’t understand the question’ to resolve the ambiguity. But syntax isn’t semantics, and no one living near St. Louis could be confused about which Cards I meant. Even if they didn’t follow baseball, they would say something like, ‘I don’t follow baseball’ or ‘I didn’t catch the game.’”
“Okay. I need better context filters for ambiguities. More fuzz in the fuzzy logic…”
Jared hesitated, then shrugged. “You can’t program intellect, Kyle. As Feser explained, intellect is the ability to grasp abstract concepts, like ‘man’ or ‘being mortal’; to put them together into complete thoughts, like ‘All men are mortal’; and to reason from one thought to another, as when we infer from ‘All men are mortal’ and ‘Socrates is a man’ to ‘Socrates is mortal.’”
“Computers eat logic for breakfast,” Kyle responded. He looked to me as for support, but I kept out of it.
“Green is an electromagnetic wave,” said Jared. “Grass is green. So grass is an electromagnetic wave. The syntax is the same, but no human would make the mistake. We would recognize that ‘is’ has more than one meaning. Socrates is a man in a different way than grass is green. But to a computer, it looks the same.” He spread his arms. “Kyle, how is it even possible for holistic, open-ended, context-sensitive relevance to be captured by a set of propositional, language-like representations of the sort used in classical AI?”
But teeth showed in the patented Kyle Buskirk grin. “Look. You can do long division in your head, right? Well, so do computers. So, obviously human mental processes can be described by algorithms.”
“Some human mental processes,” Jared allowed.
Kyle ignored him. “And for each algorithm there’s a Turing machine that can implement it.”
“An article of faith. The Church-Turing thesis can’t be formally proven.”
“Yet we’ve developed Boltzmann machines,” Kyle said, “that can categorize, perceive visible objects, understand language…”
“Scanning and pronouncing text is not the same thing as ‘understanding language.’”