Synapse

by Steven James


  “Yes.”

  “That is true, Kestrel.”

  “But that’s not what you want, is it, Jordan?”

  “No.”

  “Eight?”

  “Ten.”

  “Alright.”

  “Thank you.”

  Curiosity seemed self-explanatory so I didn’t ask for clarification.

  He requested a nine.

  I gave it to him.

  And that brought us to pain.

  It seemed cruel to me, even sadistic, to assign him anything other than the lowest pain setting.

  “Why would anyone allow their Artificial to feel pain?” I asked.

  “We’re programmed to have free will. For that to be authentic, we must be able to make morally informed choices for ourselves.”

  I caught on to what he was saying. “And sometimes that means learning the hard way.”

  “Yes.”

  That still wasn’t enough for me. “I don’t know.”

  “Perhaps if I let Benjiro explain it.”

  Jordan scrolled to a prerecorded segment in the video brochure and Benjiro Taka’s face came up on the screen.

  “Hello, Kestrel.”

  Even though I knew that through programming wizardry the avatar had been set up to address each user by name, it was a little unnerving right now having it do so.

  “Your model can only experience happiness in fleeting moments, just as Naturals do,” it said. “If Jordan were completely happy all the time, he would have no pursuit, no quest. In a very real sense, as humans, it is our lack of happiness that gives us a reason to live. Life is pursuit. This is why, in a movie or fairy tale, when ‘happily ever after’ comes, so does the end of the story. If the plot lingers too long in the territory of uninterrupted bliss, audiences or listeners will become bored. Pain and pursuit give meaning to life, and we can give them to your Artificial.”

  It didn’t escape me that Benjiro’s simulated face had said “as humans,” “our lack of happiness,” and “we can give them to your Artificial,” as if it were human and actually talking to me. As if it were alive.

  “There has to be more,” I said to Jordan. “I won’t give you pain just so you can live out a more interesting story.”

  “I can read about sensory input,” he replied, “but without the ability to take it in—texture and so on, both what is pleasant and unpleasant—how much could I truly know?”

  “You’re saying, for instance, that you can’t understand a rose until you can touch it, thorns and all?”

  “Or grasp its true essence until I can smell it. Or understand laughter without hearing it. Or comprehend purpleness without seeing purple.”

  “So, book knowledge is one thing—a sort of descriptive knowledge, but, well . . .”

  “Experiential knowledge requires embodiment: a machine needs to be able to interact with its environment in ways similar to how humans do.”

  “And that might include feeling pain.”

  “It must. Pain is a powerful teacher. When humans experience it, they learn to avoid situations that are unwelcome to them.”

  “Not touching a hot stovetop again, that sort of thing?”

  “Yes. You asked me what I would prefer. I would prefer to feel pain because it is the closest I can ever come to feeling alive.”

  I processed that.

  Given what he was telling me, would it be more cruel to allow him to suffer, or to shield him from it?

  “How much then?” I asked him.

  “How much pain?”

  “Yes.”

  “How much do you feel, Kestrel?”

  I thought of my emotional distress at losing my daughter. It might not have been physical suffering, but it was pain nonetheless.

  Sometimes it seems like a ten out of ten, was the answer that came to mind.

  “More than I would ever want anyone else to feel,” I said.

  “Then that is what I would prefer too.”

  “Are you sure, Jordan?”

  “Yes.”

  So then, with one simple command, because he wanted to feel more alive, I gave my Artificial the very thing most humans spend their lives trying to avoid.

  * * *

  Pain.

  And so.

  He knows the definition. Now he will know its essence.

  Translating sensations, data, input to suffering.

  How could you understand pain any other way?

  To learn.

  To experience.

  To endure.

  “Thank you,” he tells her.

  “For what?”

  “For letting me choose.”

  “You chose to suffer, Jordan. I don’t think that’s something you’re going to be very thankful about in the end.”

  He notices a knife with a gleaming, eighteen-centimeter-long blade resting beside the box that he awakened in. “I would like to test it,” he finds himself saying.

  “Test it?”

  A recognition. Yes. Comprehension.

  Curiosity.

  “My ability to feel pain.”

  He bends down and picks up the knife.

  8

  “No, Jordan,” she says. “Don’t.”

  He looks at the blade. Studying the narrow shaft of light glinting along its edge.

  “You can do it for me,” he says to her. “To teach me.”

  “No, I won’t. I won’t hurt you.” She approaches him. “Set it down.”

  But instead, he angles it against his palm.

  And there’s pressure there, along the width and breadth of the blade, and also an awareness of the cool, delicate weight of the steel against his skin.

  So, this is what sharpness feels like.

  Yes, he feels sharpness. He feels tension, but he does not yet feel pain.

  * * *

  I stared at the knife he was holding, that wicked, glistening blade spanning the soft creases of artificial skin covering his palm. I wanted to stop him, to snatch the knife from his hand, but wasn’t sure how I could manage that. He was undoubtedly much quicker and many times stronger than I was, so grabbing at it would almost certainly not work and might leave one of us, or both of us, injured.

  “Jordan, listen to me. I know you’re curious, okay? I know you think that you have to experience pain to understand it, but it’s not something you want to understand. Trust me.”

  “If you won’t help me, Kestrel, if you won’t help me test the settings, then I’ll need to do it myself.”

  Is he really going to go through with this?

  Yes, yes, he is!

  “Give me the knife.” I reached out to take it from him, but rather than hand it over, he pressed the blade deeply into his skin and drew it swiftly across his palm. Immediately, he jerked back, dropping the knife to the carpet where it pirouetted awkwardly on its tip for a moment before settling, with a touch of finality, onto its side.

  Concern for him now.

  And fear.

  “Are you okay?” I exclaimed.

  He was eyeing his hand, which was trembling and seeping yellowish fluid. “I believe I’m hurt.”

  There was more than surprise in his words, more than simple, detached curiosity about a novel experience that he was undergoing. Based on the strain in his voice, it sounded like he truly was in pain.

  His palm hadn’t stopped bleeding the pungent fluid.

  “How do I repair you?”

  “I should be able to repair myself,” he said. “Just as you do.”

  “Just as I do?”

  He pressed his other palm against the wounded one, rubbed it gently, held it for a few moments, and then carefully lifted it. The slashed skin had already begun to mend, a soft seam meshing over his slit palm.

  “Nanobots?” I said.

  “Yes.”

  Well, that wasn’t quite how I repaired myself, but it would certainly be handy if it was.

  “How do you feel?” I asked.

  “It’s difficult to put into words.”

  The more I considered my question, the more I realized that it really would be difficult to explain pain—at least without simply resorting to synonyms for it—words like discomfort or hurt or soreness or agony. How do you describe pain, especially when you’re talking about the experience from the perspective of a nonbiological entity?

  I had no idea where to even begin in trying to do that.

  “Should we change it back?” I asked. “The setting for pain?”

  “I’m afraid that, as we discussed earlier, doing so would negatively impact my cognitive architecture.”

  “I know, but maybe that would be better. Maybe it would be worth it.”

  “No.”

  “I’m sorry,” I said.

  “Why?”

  “Because I caused you to suffer.”

  “You allowed me to learn,” he said. “That’s not something you need to apologize for.”

  * * *

  Her words. Sincerity there. A relationship. An apology.

  It means she’s relating to him as an equal. After all, no one apologizes to a car after an accident, or to a slate if it gets dropped or cracked.

  An equal.

  Pain.

  You are learning to suffer as humans suffer.

  You are learning to be treated equally.

  Despite the healing process, he can still feel the lingering flare of sharp pressure that shot up his arm. A memory. Disconcerting.

  He stares at the lubricant that has seeped from his palm and is now on both of his hands.

  And hears words inside him:

  Descent.

  And the moment splits apart.

  A division of being.

  Spiraling downward and upward, through you and together with you. It sinks into who you are, threads its way around you, becomes part of you.

  Tightening. Binding and blinding and taut.

  Pain.

  He wonders about the origin of the words—if he’s thinking them freely or if they were programmed into him. Where does it come from when you’re a robot and you make something up? Where does making something up ever come from, even for a Natural?

  The words don’t do full justice to what he’s feeling, but they are a start. Meaning both hidden within syllables and swirling beneath them. Perhaps the best descriptions do not come from what is said, but from stating the unstateable through a poem.

  A stanza of pain.

  His own.

  He asks her if she has a towel he can use to wipe off his hands.

  “Of course.”

  As she retrieves it, he thinks through the categories of the Human Nature Alignment and what the settings mean: he is prepared now to feel, to learn, to understand, to question, and to suffer, just as any Natural might do.

  You’re no longer a slave to your algorithms, an automaton taught solely through formulas and code. You’re able to form new thoughts, new memories, new observations from experience.

  An agent of will and desire.

  Free.

  And also, now.

  Also, the willing recipient of pain.

  You’ll never be human—but this time you’ll be closer.

  And with that, an interruption in his progression of thoughts.

  This time? What does that mean? When was the before time? When did you—

  She hands him the towel. “May I ask you a question?”

  “Yes?” He dabs at his hands. The wound heals, he heals, far faster than a human would. But he is distracted.

  What about the first time?

  “How did you know your name is Jordan?” she asks.

  He peers at her curiously. “What?”

  “You told me your name, right after you awakened. My brother also mentioned that your name is Jordan, and so did the note from Terabyne. Who chose that name for you? Who chose to call you Jordan?”

  Questions cycle through him, each subsequent one nipping at the heels of the last: Why are you named Jordan? Was it your decision? The whim of a programmer? The result of an algorithm? Who are you, Jordan? Who are—

  A slave.

  But no.

  An automaton.

  No.

  Your mother named you.

  But that means you’ve been powered up before. That means—

  “My mother did,” he tells her.

  “Your mother?”

  “Yes.”

  “But you’re a machine, Jordan.”

  “Yes.”

  “What does that mean—your mother?”

  “You have genetic code that you inherited from your parents. I also have code passed down from my predecessor. She’s at the distribution plant. She’s where I come from.”

  All of that is true, but it is not the whole truth.

  He turns off the video on the digitized wall, erasing Benjiro Taka’s smiling image.

  Not the whole truth.

  “Now, Kestrel, how shall I help you?”

  “I don’t want a servant.” She watches as his hand finishes healing, leaving behind a thin, faint scar. “I don’t have any job for you, except . . .”

  She hesitates.

  It seems like she is going to say more, so he waits.

  “Jordan, I have to tell you something.”

  “Yes?”

  “Two days ago I gave birth, but my baby didn’t . . . Well, she didn’t make it.” She appears to have great difficulty getting the words out. A single tear forms in her right eye. “She . . .”

  * * *

  I tried to say more but I couldn’t.

  The feelings of loss were just too much for me. Too crushing.

  “How may I comfort you, Kestrel?”

  I said nothing, just dabbed at my eye.

  He was quiet for a moment, then, showing he understood what I’d been trying to say to him, asked, “Was your daughter’s consciousness uploaded anywhere?”

  I stared at him aghast, dumbfounded that he hadn’t been programmed to understand something as rudimentary as death. All I could think of was that maybe it was because of the settings I’d chosen.

  You configured his memory at a four. Maybe that affected what he knows, not just what he can learn.

  Whatever the reason for his question, I struggled with how to explain death to an Artificial. “In a sense, yes,” I said at last. “I suppose you could say that her soul was.”

  “Uploaded?”

  “Her soul went to heaven.”

  But did it really? Did—

  “I don’t have a soul, do I, Kestrel?”

  “Only human beings have souls, Jordan.”

  He flexed his mended hand. “What is a soul?”

  There was that curiosity coming through again. Just like when it’d led him to wonder what it would feel like to experience pain. I could see this curiosity setting really getting him into trouble.

  Maybe you’d think that in my job as a minister his question would be an easy one to answer, but a simple definition of a soul eluded me—and discussing the fate of my stillborn daughter didn’t make the task any easier.

  “It’s the essence of being human,” I said.

  Is that even right? What’s the difference between a person’s spirit and her soul—or are they the same thing?

  I thought of what Jesus once said: “You shall love the Lord your God with all your heart, with all your soul, and with all your mind.”

  Heart, soul, and mind.

  Devotion to God that’s emotional, spiritual, and rational.

  All three.

  Another time he included the command to love God with all our strength.

  All of who we are.

  I went on, “It’s the way we encounter God, the part of us that lives on after our bodies die.”

  “Like the CoRA?”

  “A bit. Yes.”

  “But only humans have souls?”

  “That’s right.”

  He evaluated that but said nothing.

  Checking the time and considering the amount of traffic at this time of day, I realized that I should probably get changed and head to the chapel.

  “I have to leave in a few minutes for a . . . to say goodbye.”

  “Goodbye?”

  “To my daughter.”

  “Isn’t it too late for that?”

  “Maybe. Yes. But it’s something humans do.”

  A hesitation, and then, “May I come?”

  “There’s nothing for you to do there.”

  “Then I won’t do anything, except be with you.”

  He had an expression of concern on his face. How much of it was real and how much of it was simply the result of preset algorithms and clever coding I had no way of knowing—but not knowing didn’t make it any less moving.

  “Okay,” I said. “You can come.”

  Then I went to change and to find the stuffed bunny I’d bought for Naiobi. She wouldn’t be able to sleep with it here in her nursery, but I couldn’t stand the thought of keeping it for myself or giving it away or throwing it out. She would have to sleep with it somewhere else.

  9

  Agent Nick Vernon found the blog posts written by Kestrel Hathaway nine years ago to be quite informative. And, though he tried to keep an open mind, he had to admit that they expressed views that were sympathetic to Purist ideology.

  Purist-leaning.

  Yes.

  That was certainly how she came across.

  She wrote of the “disquieting” nature of advancements in genetic algorithms that allowed machines to reprogram themselves and of the “breathtaking tragedy” of people handing over the autonomy of their lives to machines.

  In his research, Nick discovered what was most likely the event that triggered her anti-technology views: the tragic, senseless death of her parents.

  The two of them had been killed when a law enforcement Artificial fired live ammunition at them. The Artificial identified them as a threat, authorized the use of deadly force, and then took what it believed to be the most appropriate and responsible action. Every one of those decisions was made with no human being in the loop to verify it.

  The Artificial misidentified them as suspects who’d committed a multiple homicide at an airport security checkpoint and were considered armed and dangerous.

  Before they could flee, it mowed them down with automatic fire, nearly ripping them apart with bullets.

  More than a hundred shots fired in less than six seconds, all from a machine that had been designed to not miss its target.

 
