Captive Dreams


by Michael Flynn


  “So you’ve decided to simulate a specific human being, complete with backstory…” He nodded and I said, “It sounds like science fiction.”

  “Everything does,” Kyle said, “until it gets done.”

  THIS GÖDEL IS KILLING ME

  Years later, historians would say that it had been the “Flash Flu” that really gave the impetus to telepresence. People grew reluctant to gather in confined spaces—like offices, airplanes, and the like. Flashmobs, even after the ban was lifted, became smaller and fewer. A great many theaters and concert halls closed. Easier—and safer—to work solo in the Cloud—which by then was being called the Grid. Other historians pointed out that the magnitude of the epidemic had been greatly exaggerated by the blogosphere, and did not justify the degree of anxiety that gripped the country. Yet, what happens in history is to a great extent inseparable from what people think has happened.

  And so when Kyle indicated a need to consult with me, I struck a blow for personal contact and flew out from Chicago for a colloquium with his math people. Only two came in person; the rest “fibered,” as we used to say. We discussed some new theory I had developed from Savage that treated decisions as mappings from the system state Y into the space of consequences Z, and used a proximity on the consequences to determine which decisions were “close” to one another.
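
  (For the mathematically curious: in Savage’s framework a decision, or “act,” is a map from states to consequences, so Mac’s extension can be glossed roughly as below. The story gives no formal details; the notation is illustrative only.)

```latex
% Decisions as mappings d : Y -> Z, with \delta a nearness measure on the
% consequences Z. Two decisions are "close" when their consequences are
% close in every state (an illustrative reading, not the story's axioms):
d_1 \approx_{\varepsilon} d_2
  \iff
\delta\bigl(d_1(y),\, d_2(y)\bigr) < \varepsilon
  \quad\text{for every state } y \in Y .
```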

  Kyle called Jared to let him know, and our philosophical friend drove up from Princeton to spend what was supposed to be a fun weekend.

  Kyle’s home was also Flapjack, which was the name of his company du jour. Since graduation, he had made four fortunes and lost three. “Companies don’t matter,” he had explained once. “They’ve become commodities. I’ve invented the virtual company, the flash-corp. When I need the talent, I bring them together. We incorporate, finish the job, then disband, each to our own affairs.”

  The house sat on a half-acre lot in a township along the Northeast Corridor. Two roads entered the area and curved around and into each other so that between them they formed an irregular oval with two stems. The houses lined the inside of the oval, enclosing a woodland in the center. “The woods are sealed off from the township,” Kyle told us as he showed us about the property, “so it can never be developed.”

  It was not very large as wildernesses went—in fact, you could make out the backs of the houses on the far side of the oval—but Jared thought this an endearing and unexpected feature and I was inclined to agree with him.

  Kyle was then living with a network architect named Ling-ling, though he had not made the arrangement formal. Companies were not the only things in his life that came and went. She was pleasant to Jared, but just a little reserved, as if sensing that the short, intense philosopher occupied a niche in her lover’s life that she could never fill.

  We spent the night in two guest rooms and in the morning Kyle and Jared went out on a run. The irregular oval formed by the two roads provided a track, and both of them had kept up the practice since their championship track days. I left running and jogging to those better suited to it, and sat on the patio in Kyle’s back yard and thought long thoughts. Or else I napped. Later, Kyle and Jared joined me there to await dinner, which Ling-ling was preparing. The sun was setting behind us, so the treetops glowed in an unseen, faerie light and the shadow of the house advanced across the lawn toward the woods like the army of darkness.

  We talked a bit about Kyle’s line of “InterFaces,” simulations that people used as avatars. Why answer your vidphone in your own persona when you could answer as Napoleon or the latest gaming hero? Since they were supposed to be masks, neither autonomy nor deep realism mattered.

  Kyle drank “treated” coffee, infused with low-lactose whey protein and a blend of compounds from acai, pomegranate, blueberry, grape seed and green tea. “Full of anti-oxidants,” he declared as he lifted the cup in toast. “Never grow old.”

  Jared returned the salute with a cup of Earl Grey. “Too bad. I’ve been looking forward to becoming an éminence grise. But, Kyle, all plants contain antioxidants.”

  “And Rust-Oleum is an antioxidant, too,” I added. “Drink it, and I guarantee you’ll never grow old.”

  Kyle laughed, then frowned a little into his cup, perhaps wondering at its cost/benefit ratio.

  I handed him my beer. “Here, try this. No organism harmful to man can live in beer.”

  We japed some more, but inevitably the talk turned to Kyle’s project. He had given up on directly modeling human behavior and was focusing now on emergent properties, using methods a neighbor of his was developing at SingerLabs.

  “Emergent properties,” mused Jared. “That’s what we used to call formal causes in the old days.”

  “I thought it meant ‘then a miracle happens,’” I grinned.

  “Isn’t SingerLabs a biotech firm?” said Jared.

  “Bio- and nano-,” Kyle told us. “You know how complex behavior, like flocking in birds, can emerge from a set of simple rules…?”

  “Mmm.” Jared put his tea down. “Right. What is it, three rules?”

  Kyle nodded. “Yeah. So… how do you derive a complex organism from a simple genome?”

  “Genomes are simple?” I asked.

  “They aren’t big enough to contain a full set of blueprints and step-by-step construction drawings,” Kyle insisted. “So there must be some way to ‘unfold’ them from a generating set. Now, that’s not my problem; it’s Henry’s. My problem is to simulate a complete personality. The trick is to discover the smaller set of recursive instructions that it emerges from, then use subsumption architecture to build it up in layers. That’s where Ling-ling comes in. Reflex behaviors—that’s the generating set—go on the lowest layers; like ‘avoid crowding your neighbors’ or ‘steer towards the average heading of your neighbors.’ The more abstract behaviors are layered incrementally atop the simpler ones, and they control the direction to be taken to achieve an overall task.”
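
  (The two reflex rules Kyle quotes are two of the three classic “boids” flocking rules of Craig Reynolds; the conventional third is steering toward the neighbors’ average position. A minimal Python sketch follows, with a crude weighted layering standing in for subsumption architecture; all names, weights, and thresholds are illustrative, not Kyle’s design.)

```python
# Reynolds' three flocking rules, plus a naive layering: each boid follows
# only local reflexes, yet the population flocks. That population-level
# behavior is the "emergent property" of the dialogue.
import math

def _average(vectors):
    n = len(vectors)
    return (sum(v[0] for v in vectors) / n, sum(v[1] for v in vectors) / n)

def separation(boid, neighbors):
    """Lowest layer: avoid crowding your neighbors."""
    sx = sy = 0.0
    for other in neighbors:
        dx = boid["pos"][0] - other["pos"][0]
        dy = boid["pos"][1] - other["pos"][1]
        dist = math.hypot(dx, dy) or 1e-9
        if dist < 1.0:                     # too close: steer directly away
            sx += dx / dist
            sy += dy / dist
    return sx, sy

def alignment(boid, neighbors):
    """Steer toward the average heading of your neighbors."""
    ax, ay = _average([n["vel"] for n in neighbors])
    return ax - boid["vel"][0], ay - boid["vel"][1]

def cohesion(boid, neighbors):
    """Steer toward the average position of your neighbors."""
    cx, cy = _average([n["pos"] for n in neighbors])
    return cx - boid["pos"][0], cy - boid["pos"][1]

# More abstract behaviors would be layered atop these with further entries;
# here the layers are simply weighted, lowest (most reflexive) first.
LAYERS = [(separation, 1.5), (alignment, 1.0), (cohesion, 0.8)]

def step(boid, neighbors, dt=0.1):
    """Advance one boid by one time step."""
    if neighbors:
        for rule, weight in LAYERS:
            sx, sy = rule(boid, neighbors)
            boid["vel"] = (boid["vel"][0] + weight * sx * dt,
                           boid["vel"][1] + weight * sy * dt)
    boid["pos"] = (boid["pos"][0] + boid["vel"][0] * dt,
                   boid["pos"][1] + boid["vel"][1] * dt)
```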

  “In other words,” Jared responded, “your higher layers deal with final causation. That makes sense to me. But I thought your customers weren’t looking for well-rounded sims for their InterFaces.”

  “They aren’t. I am.”

  “Oh, that’s right. You want to download a mind into the computer.”

  “You’ve been talking to Mac.” Kyle shook his head. “Who needs the Grid when we have Mac? Did he tell you his sister is a widow? Her husband died in the Flu.”

  “I think Mac was trying not to tell either one of us. Tell me you’re not heading for Elmira to make a play for the grieving widow. She has two kids, you know; and you don’t seem the paternal type.”

  “Me? No. I’ve got Lorraine.”

  “Ling-ling.”

  “Right. Ling-ling. No, I just thought you’d want to know.”

  “College was years ago, Kyle. I’m happily married.”

  “Sure, you are; but is Gladdys?”

  An uncomfortable silence enveloped us. Either Kyle was making a bad joke—like the man who says of his seven-year marriage that it was the happiest two years of his life—or he was making a serious observation. Gladdys was outgoing, a musician, and Jared was not exactly your party animal. I remembered how Kyle and Gladdys had been tête-à-tête in the hospital cafeteria. In her anxiety, had she whispered some confidence to him, some discontentment? Kyle began to redden slowly under Jared’s flint-faced scrutiny.

  I didn’t like the way they sometimes ignored me when I was with them. After all, it was my sister they were talking about, and in the end, Maddy had rejected both of them. Jared was dating Gladdys within the week, while Kyle had flitted to Denise, then Lorraine, Audrey, Katya, and now Ling-ling. I had thought Maddy long forgotten.

  So, as I had done so often in the past, I put myself verbally between them.

  “Will subsumption architecture help you download minds into computers?”

  Kyle shot me a grateful glance. He could be offensive, but usually not by intention, and he was always surprised when others took his jokes to heart.

  “There are several promising avenues,” Kyle assured us. “We figure on success in maybe fifteen years.” Then he turned to Jared with the peace offering. “What do you think, Jared?”

  What Kyle wanted was that Jared should give him some encouragement, or at least wish him good luck. But Jared’s one great character flaw was his remorseless honesty. Perhaps he was nettled by Kyle’s jape and decided to give him both barrels of the philosophical shotgun.

  “Not in fifteen years,” he said. “Not in fifteen hundred years. It’s flat-out impossible.”

  That was when Ling-ling came to the patio door and told us that dinner was on the table. There was a certain hardness to her features, and I wondered if she had earlier overheard Kyle garbling her name. It was not unlikely, as she maintained a stony silence during much of the meal that followed.

  The table was long red maple, with ceramic tiles inlaid down the center on which hot dishes might be placed. Jared’s seat faced the French doors and the wooded lot that was the kernel in the residential shell. Kyle sat across from him, while Ling-ling sat at the head, near the kitchen door, a little apart physically as well as symbolically. I was relegated to the foot.

  “So,” Kyle said, “tell Ling-ling what you meant when you said it’s impossible to download a mind into a computer.” As if it was Ling-ling who cared.

  Jared already regretted his acerbic dismissal. He should have said something vague and non-committal. He did in fact think that AI was impossible, and that pursuing it would be a waste of Kyle’s time and talent; but Kyle would not be the first to hare off after El Dorado. What he should have said, as he admitted to me later, was “If anyone can do it, you can.”

  Instead, he sighed. “How much do you know about Gödel?”

  Kyle was feeling snappish by then. “Never heard of it.” Of course, being in AI, he had heard of Gödel’s theorem and the Lucas-Penrose thesis; but being in AI, he had also long ago dismissed it. One marker distinguishing the scientist and engineer from the logician and mathematician is the use of the term “impossible.” To Jared—and to me—it meant just that, like “a married bachelor.” But to Kyle, it meant “we don’t know how to do it yet.”

  But Jared took him at his word. “Gödel showed that any consistent computational system complex enough to support simple arithmetic will produce true sentences that cannot be proved within-the-system.”

  Kyle scoffed. “If a proposition can’t be proved, how do you know it’s true?”

  “I said the proposition cannot be proven-within-the-system. Say you specify a computational system Wᵢ. It consists of a finite set of axioms and a finite set of rules for developing propositions from those axioms. Those proof-sequences are like roads leading from the axioms to the propositions. The propositions at the end of the roads are ‘provable.’ What Gödel proved was that there are places where the roads don’t go. Some propositions are undecidable; and the statement of their undecidability must lie outside the system.”
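
  (Jared’s claim in standard form, compressed here for reference rather than quoted from the dialogue:)

```latex
% First Incompleteness Theorem: for any consistent, recursively
% axiomatized system W_i containing enough arithmetic, there is a
% sentence G such that
W_i \nvdash G
\qquad\text{and}\qquad
W_i \nvdash \lnot G ,
% yet G is true in the standard model of arithmetic.
```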

  Kyle threw his hands up at that point. “That’s the catch. Expand the system. Why not a chain of machines, Wᵢ, Wⱼ, Wₖ, …, each proving the consistency of the preceding one? At some point they reach ‘critical mass’ and consciousness emerges.”

  Ling-ling spoke for the first time. “Even I see why that not work.” Yes, she had heard his earlier remark. I could hear it in her voice and deliberately exaggerated accent.

  Kyle shot her a look, but he could not have achieved his successes had he been unable to reason clearly. He sighed, and dropped his fork to the plate. “Right. Then the bigger system will also be incomplete and have true, but unprovable sentences.”

  “And that leads to an infinite regress,” said Jared, “and to an infinitely large machine.”
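
  (Spelled out, the regress runs on the Second Incompleteness Theorem; a standard gloss, not part of the dialogue:)

```latex
% Each machine can certify its predecessor but never itself:
W_{i+1} \vdash \mathrm{Con}(W_i)
\qquad\text{but}\qquad
W_{i+1} \nvdash \mathrm{Con}(W_{i+1}) ,
% so the chain of machines never closes on itself.
```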

  “But why does that mean downloading is impossible? It’s just a matter of running the mind-program on a different substrate.”

  Jared had gone too far to back off now, and sometimes the quickest way out of a bad situation is to push on through to the other side. “Because a computer is a physical embodiment of a formal system. So Gödel’s theorem applies, and it follows that the computer will be incapable of generating a proof of consistency without external help. But the human mind is so capable, and from that it follows that no machine can be a complete model of the human mind.”

  “But your reasoning applies to any statement-maker. What about ‘Jared can’t assert the truth of this statement’? It’s true, but you can’t assert it. So, you’re subject to the same limitation. The whole Lucas-Penrose argument is vacuous.”

  But Jared was already shaking his head. “Assertion is not the same as proof. Provable statements are a subset of true statements; but a mere assertion need not be true at all. A computer can’t even see an unprovable statement.”

  Kyle laid down his knife and fork. “It can if we insert the unprovable propositions into the system as axioms.”

  Jared cocked his head. “Which one, the proposition or its negation? They’d both be undecidable, you know.”

  Kyle made a gesture. “Both!” Ling-ling sighed in an exaggerated fashion.

  Jared said, “Then the system would be inconsistent.”

  “So what? Humans are inconsistent, too.”

  “So nice,” said Ling-ling. “Inconsistent AI.”

  “Humans are rational,” Jared said. “Human reasoning is not just a set of formal steps but includes the ability to reflect on the correctness of those steps. And this is precisely what a purely formal system cannot do.”

  Kyle sighed and turned to me. “What do you think, Mac?”

  Long ago, in college, I had fallen into the role of referee, and so expecting the question, I had been mulling over what my answer ought to be. I did not know any mathematical logician who failed to find the Gödelian argument at least “interesting,” which is a term mathematicians use that means “I want to believe it’s wrong, but I don’t yet see how.” As Polanyi once said, a formalized deductive system is an instrument which requires for its logical completion a mind using the instrument in a manner not fully determined by the instrument. And Jared had said something about subsets, so…

  “There ought to be a proximity on the set of all coherent propositions within the system,” I hazarded, thinking out loud. “Suppose Jared is right, and your AI can only yield provable propositions. But if these sentences are topologically dense in the space of all propositions, then every true proposition would be arbitrarily close to a provable one. So you could both be right. You can’t make an AI ‘just like’ Jared—though God knows, one of him is enough—but it might be possible to construct an AI indistinguishable for all practical purposes from a human intelligence.”
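
  (One way to render Mac’s “arbitrarily close” formally; the sets and the distance d are an illustration, since the story never defines the proximity:)

```latex
% Prov = provable sentences, True = true sentences, Prov ⊆ True.
% Mac's conjecture: Prov is dense in True under some distance d,
\forall t \in \mathrm{True}\;\;
\forall \varepsilon > 0\;\;
\exists p \in \mathrm{Prov} :\; d(t, p) < \varepsilon ,
% i.e. True lies in the closure of Prov: every true proposition is
% within any finite resolution of a provable one.
```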

  The secret to getting on with Kyle was to tell him not the unvarnished truth, but the varnished truth. It helped if you slapped a coat of paint on it, too. Jared had told him that what he wanted most in the world was impossible. I had told him he might be able to do something indistinguishable from it; and that was good enough. I’m not going to get into Harris proximities, closure operators, or the function space 2^Y, or for that matter how I extended the concept of Dedekind cuts. That sort of thing is an acquired taste. Suffice it to say that I received the Fields Medal for the work a few years later. The interesting thing is that if Jared had not taken the liberty of friendship to slap him in the face with the intellectual equivalent of a dead fish, Kyle would not have turned to me, and I would never have achieved my brief moment of fame. Such are the vagaries of fate.

  But that came later. At dinner that evening, the silence was finally broken by the sound of Ling-ling’s silverware as she finished her meal. “All word-play,” she said when she looked up.

  Kyle took heart. “That’s right,” he said, gathering confidence. “Word-play, like all of philosophy. You’re asking me to drop my lifelong dream because of a damned metaphor?”

  “I’m not asking,” Jared said quietly, “that you do anything. Look. Even if it is impossible, so what? You can still dream the impossible dream.”


  “Well, thanks a whole freaking lot.”

  Afterward, as he was gathering the empty dishes together, Kyle smiled, though it was a rueful smile. “You had me going, Jared. You really had me going. But what about the brain? When you get right down to it, the brain is a computational engine, too. That means there’s a flaw in your argument somewhere.”

  “No,” said Jared. “It means minds are not brains.”

  Kyle stared at him for a moment, then threw his head back and laughed. “You really had me going,” he said again.

  SLEEPING DOGS LIE

  A few years later, Kyle used his fourth fortune to fund the independent NM Foundation. By that time, he was simply Kyle Buskirk and Associates, having nothing more to prove. Sims had become in that decade what operating systems had been in the previous century, and Kyle’s underlying protocols had become by popular demand so much the standard for the industry that the government had contemplated intervention in the name of his less popular competitors. Kyle, who had never before contributed to a political cause, got the message and began to do so. Replacing his fourth fortune with his fifth was not a problem.

  NM stood for Nou Mechanima, which most took as a fanciful rendition of “New (or Now) Mechanisms.” Humanists knew that it would have meant “Mind Machine” in Greek, had -anema not been coyly misspelled as -anima, which in Latin meant “life.” Packing mind, machine, and life into the name seemed more subtle than Kyle’s usual playful approach to nomenclature. I suspected Jared’s hand in it.

  Kyle did not broadcast the NM Foundation’s goals, but neither did he try to hold them secret, and eventually the word spread. “Sim Guru to Build Brain in a Box” will give you a rough idea of the newsfeeds that followed. That set off the blogosphere. On the one hand were those accusing him of blasphemy for trying to create a human soul; and on the other hand were those who volunteered to cut off and freeze their own heads so that their minds could be downloaded when Kyle finally succeeded. Jared and Kyle had disagreed on more than a few things over the years, but that both these groups were nutty beyond all measure was something on which they found common ground. They don’t even know what a soul is, Jared complained to me in an email. They think it’s a substance. Since I didn’t know what he meant either, I let it ride.

 
