Uncanny Valley


by C. A. Gray


  Once the A.E. chip was off again, I pulled my handheld out of my pocket and wrote to Liam, “I’m alive, calm down. I’ll be there when I can.”

  A few seconds later, the screen blinked. “Well, if you don’t update me for hours, I start imagining you getting kidnapped seven ways from Sunday. Sue me.”

  I scoffed out loud, but a tiny smile tugged at the corners of my lips anyway. I still wasn’t quite used to this character trait of his. Some small part of me found it kind of endearing… a very small part. I wrote, “We’re on a university campus, Liam. I’m fine. You’re so paranoid.”

  “Yes, we’ve established this,” he wrote back, “and I accept my flaws. What did you find out?”

  I didn’t want to try to compose my findings, or my subsequent musings, in a comm, but he clearly wasn’t going to leave me alone for long. I’d have to just go tell him in person. With a twinge of dread, I enabled the A.E. chip one more time just to pull up the map so that I could find my way back to the library. I wasn’t very good at directions.

  But between the physics building and the library, I saw a building labeled “Philosophy.”

  I couldn’t explain why, but I felt drawn to it. Maybe I was preordained to go there, I thought, with a humorless little laugh. Maybe there was no point to anything I was doing, because what would be would be, no matter what I did. Or maybe instead, my preordained choices would ‘make a difference,’ if anything could truly be said to make a difference, but only because they were already woven into the tapestry of the universe from the foundation of time…

  On my way to the Philosophy building, I searched the labyrinth and found a professor also hosting office hours, who happened to hold a dual chair in psychology. I made my way to his office, praying that he wasn’t another bot.

  “Enter,” called a husky voice, and I pushed open the old-fashioned wooden door with clouded glass bearing his name in black letters. The man sitting before me—he was definitely man and not bot—had thinning salt-and-pepper hair, an oversized gut, and loose skin about his neck and above his eyelids. But he smiled when he saw me, spreading a kind of fatherly glow to all who entered his domain. A wave of relief hit me; I hadn’t known exactly how much I’d hoped for someone older and wiser to guide me until that moment.

  “Sir,” I began, as he gestured for me to sit in the upholstered chair across from him, beside the warm orange light of an incandescent bulb. “My name is Rebecca Cordeaux. I’m a neuroscience major.” I didn’t need to tell him in what school.

  “Ah, so you’re here for psychology and not philosophy, then. Which class are you in?”

  I opened my mouth and closed it again, deciding in a split second not to lie. “In—well, I’m visiting, sir. I’d just hoped I might be able to ask you a few questions, for… for a project I’m working on. I wondered what you can tell me about… free will. I mean, what is it? What do we know about it?”

  Professor Willit tilted his head to the side, inspecting my face. “Is this an academic question, or a personal one?”

  “Both,” I admitted.

  He nodded. “Philosophers have struggled with this question throughout the ages, of course. But the definition comes down to this: free will is when an agent is able to recognize a choice between at least two alternatives, is not coerced by any external force, and can choose between them of its own volition.”

  “But according to Professor Reddy, every choice is predetermined,” I blurted. “Do you believe that?”

  Right at that moment, another comm appeared on my retinas—I’d forgotten to disable the A.E. chip again after using it for the maps. “Bec, seriously? Are you dead, or are you trying to drive me crazy?” I dismissed it, and tapped my temple to turn it off.

  Professor Willit considered my question. “Well, yes and no,” he said. “If what she means by ‘predetermined’ is that our choices are a result of our past experiences and our character, then largely, yes. But I would argue that there is a distinct difference between being predictable and being coerced. Do I have a choice about whether or not to pick up this letter opener and throw it through my window right now?” He demonstrated, gesturing at the window. “Sure. But my character, my experiences, and my emotions are such that I won’t make that choice, because I have no good reason to do so. No one makes that choice for me; I make it for myself, but I am influenced by the whole of my history in making it.”

  “So you’re saying that free will arises automatically from having a choice between at least two alternatives. Even if you’re essentially… programmed to make one choice over another. By your history.”

  Professor Willit’s eyes sharpened at the word programmed. “Correct…” he said slowly. I had the sense that he was trying to guess my meaning. Then he ventured, “This doesn’t only apply to humans, though.”

  My heart skipped a beat. “What do you mean?”

  “Well, take companion bots, for instance,” he said. “They are programmed with social and moral ‘rules,’ as well as the core program of protecting the interests of the person to whom they are given as a companion. But what happens when these two things come into conflict? If the moral rule dictates one thing, but protecting the interest of her master dictates another, this sets up a choice, doesn’t it?”

  For some reason, Julie’s comm sprang to mind again when he said that. “Her mom never even sent the message. Weird, huh?”

  “But there are if/then statements in companion bot programming that determine which rule to follow when they come in conflict,” I pointed out. “So that isn’t truly free will, because the choice is coerced by their programmer.”

  Professor Willit shrugged. “You may be correct. I’m not a programmer. But…” He leaned forward conspiratorially. “If I may say so, if Halpert has his way and the bots acquire emotion, then they will have a true dichotomy: the choice between following their programming and following their desires.” He watched me very hard as he sat back in his chair again. “Which they choose at that point will be anybody’s guess.”

  “So… let me switch over to cognitive neuroscience, for a second,” I added. “This may be a stupid question, but are there… specific parts of the brain that deal with free will?”

  “You mean so that we can override it?”

  I nodded, and the professor tilted his head to the side, regarding me with a somber expression. “I wish I could help you. I really do.”

  My heart sank. “What do you mean?”

  He shook his head. “There are several parts of the brain associated with free choice, to be sure: decisions in the anterior cingulate cortex, timing of decisions in the supplementary motor area, and whether or not to act in the first place in the dorsal medial prefrontal cortex… but alas, Rebecca. The fact that we know which parts of the brain make free will decisions does not mean that we can eliminate free will by creating lesions in those areas. Quite the contrary: lesions in the prefrontal cortex tend to lead to socially inappropriate or immoral behavior. Rather than imposing an absolute mandate, such a lesion would exclude morality altogether.”

  I deflated. “But—there must be a way!”

  Professor Willit shook his head. “If there is, it will not be found by imitating the human brain. One can eliminate all restraint by damaging the brain, but we cannot impose moral behavior without the free choice of whether or not to obey it, once emotion gets involved. We use brain structures in the manifestation of our free will, but the reason free will is there in the first place does not appear to reside in the brain itself. I’m afraid you are seeking a scientific answer to a fundamentally spiritual question.”

  The conversation was clearly over. I mumbled my thanks, and slung my backpack over one shoulder as I left Professor Willit’s office.

  Once in the hallway, I closed my eyes, resting the back of my head against his door like I couldn’t bear the weight of it anymore.

  “Then we’re all doomed,” I whispered.

  Chapter 22


  It took me only a few minutes to find Liam, Francis, and Larissa in the library, in the glass conference room like Liam had told me. Liam’s eyes flashed when he caught sight of me.

  “I was just about to go looking for you! How hard is it to comm me and just tell me where you are?”

  “You’d think you were his stray teenage daughter or something,” Larissa commented, resting her chin on her fist and blinking up at me placidly.

  Liam shot her a sour look. “I’ll thank you to never make that reference again.”

  Francis narrowed his eyes at me while Liam and Larissa had this exchange, and I was just about to reply to Liam and beg exasperated forgiveness yet again when Francis’s stare became so annoying I could no longer ignore it.

  “What?” I demanded.

  “On a scale of one to ten, how important would you rate what you've found out so far?”

  To our mission, or to me personally? I thought but did not say. What I said was, “It was important, but not helpful, really.” I sighed, and turned to Liam, just about to summarize my conversations when Francis cut me off.

  “Okay, then I go first. Mine is an eight.” He pulled out a chair next to him, indicating that I should sit. I gave an incredulous little harrumph, but did as I was told, glancing at Liam as I did so. His lips twitched in amusement, his irritation evidently forgotten. At least there was that.

  “Liam told me about the salt and sulfuric acid purchase,” Francis said, showing me his netscreen.

  “Wait, I thought you guys were working on the Commune?”

  “Francis is phenomenal at multitasking,” sighed Larissa. “I think intense concentration on just one subject would bore him to tears.”

  Francis nodded, very seriously. “That’s true, unless the one subject is a mystery in need of solving, in which case I can concentrate for up to twelve hours at a time with hardly a bathroom break. Although very few subjects warrant such attention. I solve most dilemmas long before they reach even two hours, let alone twelve—”

  “Oh my goodness,” I said, widening my eyes at him. “Get to the point!”

  Francis, for once, did as he was bid. “Salt and sulfuric acid produce hydrochloric acid.”

  “I already figured that out.”

  “I know you did, that’s not the discovery, if you would just wait a minute,” Francis said impatiently. “Some decades ago, there was a theory that a weak acid like hydrochloric might be able to artificially produce ATP…”

  I shook my head. “What’s ATP?”

  Francis gave a superior little snort. “I forget, you don’t know biochemistry. ATP, adenosine triphosphate. It’s your body’s energy currency, produced by mitochondria in all of your cells. In our bodies we need protons to drive its production. A weak acid like hydrochloric would have protons that could be stripped relatively easily. Way back before the Council of Synthetic Reason made it illegal, everybody was trying to produce humanoid bots. The assumption was that if they looked human, they would have to be a combination of silicon and wires and biochemistry like ours—essentially like a cyborg.”

  “So Halpert is building, or he’s helping someone else to build, illegal humanoid robots!” Larissa finished for him. “The weak acids are essentially their food!”

  I blinked at her, not understanding. “But why would Halpert be building illegal bots? What would be the point?”

  “I have a theory,” murmured Francis. “It’s a good one.”

  “Oh, that’s a given.” I rolled my eyes. “Fine. Let’s have it.”

  “I’m not ready to share until I have more evidence,” he declared placidly. “Liam has your Odessa researching a few things for me to see if it bears out.”

  “All right, Rebecca’s turn,” said Liam, turning to me. “What did you find out?”

  I sighed, ticking off my discouraging findings on my fingers as I related them. “One: according to physics, there’s no such thing as free will. Not even for us. No, no, I amend: it’s possible that we could do something completely independent of all our previous experiences and external influences, but it is so unlikely as to be essentially impossible, for all intents and purposes.”

  “I could’ve told you that,” Francis muttered, not even looking at me. He was already apparently engrossed in an A.E. search, on to some other more interesting task. I shot him a scathing look that he didn’t even see.

  “Was the physics professor a bot?” Liam asked.

  “Yeah. How did you know?”

  “Only a bot would reason like that,” he shrugged. “It’s a deductive approach to what ought to be an inductive question. Humans know we have free will, because we experience it. A human would start with the conclusion in mind and work backwards, to try to explain it. But a bot can’t do that, because free will is a subjective experience that it does not share. So instead, it will try to use laws of physics to determine whether free will is possible.”

  “Francis,” I turned to him, deadpan. “Are you a bot?”

  He arched an eyebrow at me. “Ha, ha. I meant I could have told you that is the conclusion you’d arrive at purely from the standpoint of physics.”

  I exchanged a nettled smile with Liam, and told him, “Well, I’m glad you feel that way, because it freaked me out a little, honestly. So I went to see a dual philosophy and psych professor after that…”

  “Ah, was that where you were when you were ignoring my last comm?” Liam asked pointedly.

  I rolled my eyes and went on, “Professor Willit was a human, thankfully. He said that free will just means an agent has a choice between this or that, and is not coerced by any external force to choose either. Even if his choice is predictable based upon his past experiences, that makes him no less free to make it. He agreed that giving bots emotion will set up a dichotomy between the machine’s programming and its desires.” I thought about mentioning what he’d said about companion bots. But vocalizing it would somehow make it seem more real, and I didn’t want it to be real.

  “So basically he said giving bots emotion will create free will, but there’s nothing we can do about it,” Francis summarized, now scrolling through his netscreen like I was boring him to tears.

  “Yes, I said it wasn’t helpful, didn’t I? What are you doing?”

  Francis turned his screen to show me, and my heart skipped a beat. A photograph of a large group of people, most of whom I did not recognize, filled the screen. But dead center was my dad.

  I shook my head at Francis, not wanting him to see that he’d unnerved me. “Why are you showing me—?”

  He pointed at the screen again, but his finger landed on an image beside my dad’s. “That is Randall Loomis.”

  It took me a second to comprehend this. When I did, I pulled Francis’s netscreen toward me, and enlarged the picture. This photo had been taken probably ten years ago, judging by my dad’s features, so I’d have to account for some aging, of course. But Randall Loomis was a young man, not much older than my dad had been. He was rather handsome too, with a crop of dark hair and laughing hazel eyes.

  “That’s not John Doe,” Liam guessed.

  I shook my head slowly, meeting Liam’s eyes and feeling a knot in the pit of my stomach, like I’d swallowed a stone.

  “No,” I said at last. “It isn’t.”

  Chapter 23

  “Who’s John Doe?” Larissa piped up, but Francis didn’t ask, I noticed. I knew Liam must’ve told him, or Francis wouldn’t have gone to the trouble to show me the picture in the first place. That was why he did it.

  My eyes flashed at Liam as I gestured at Francis. “You told him?”

  “He’s on our side, Bec,” Liam reminded me.

  “I’m on your side, too!” Larissa piped, raising her hand.

  “Can I talk to you?” I said to Liam through gritted teeth. “Alone?”

  “Why, so you can chew him out in private?” Francis commented, his tone flat and bored. “We can imagine what you’re going to say, anyway. Might as well get it out of the way here.”

  “Nobody asked you!” I snapped.

  “Or, you can yell at me instead,” Francis suggested, unfazed. He made a reeling motion with his hand. “Whatever gets it out of your system.”

  “Come on.” Liam took me by the elbow before I could do just that, steering me out of the conference room. Evidently deeming the general library a poor choice for a shouting match, he led me to the lawn outside. I wanted to yank my elbow away indignantly—the trouble was, Liam wasn’t gloating. It made it harder to be mad at him, and I really wanted to be mad.

  “Congratulations. You were right and I was wrong. But did you have to tell Francis?” I hissed.

  “Like it or not, he’s brilliant, which makes his opinion a good one to have.” Liam ran a hand through his hair, turning imploring blue eyes upon me full force. “Look, even so, I know I should have asked you before I told him. I’m sorry I didn’t.”

  “Why didn’t you?” I demanded.

  “Because you’d have said no, and because there’s something fishy about that guy. He knows everything about you, knows where to find you, but won’t give you his name? And you’re meeting him in these dark alleys and anything could happen… I just wanted Francis’s help in figuring out who he is. I don’t know why I didn’t think of just showing you a picture of Loomis for comparison in the first place. I’m sorry, Bec.” He took my hand, raising his eyebrows and tilting his head down in a perfect imitation of a puppy. “Forgive me?”

  “Fine, sure,” I muttered, irritated that he wasn’t provoking me.

  He smiled, tucking a stray strand of my hair behind my ear. I felt the heat rise to my cheeks in response, annoyed with my own body for its betrayal. Liam could surely see it, and would probably take it to mean something it most certainly did not. I tried to drop his other hand, but he held mine fast.

 
