Synapse


by Steven James


  “She’s a minister.”

  “I wasn’t aware of that, but sure, that makes sense.”

  Beep.

  And hum.

  “And she arrived right away after the explosion?” Ripley asked.

  “Yeah. And thank God she did. I might not have made it otherwise.”

  “Certainly. And did she say anything about the attack?”

  “About the attack?”

  “About who might have been responsible?”

  Once again Ethan shook his head. “No. Nothing like that.”

  “Have you spoken with the police about this?”

  “Not yet. They’re supposed to be coming in, though. I thought maybe that’s who it was when you knocked on the door.”

  “Alright.” Ripley snapped on a pair of surgical gloves. “Well, I’m sorry it has to be like this, but there’s a bigger plan at work. You can take some comfort in that, in knowing that you’ll be serving the greater good.”

  “The greater good?”

  “The blueprints on your home computer. The attack. The way you orchestrated it. What your group has planned for Saturday in Cascade Falls. Something has to be done.”

  Ethan narrowed his eyes. “What are you talking about?”

  “It’s probably better if you don’t fight it.” Ripley tapped the “privacy” setting on the monitor that doctors used when examining patients so no alarms would go off if the patient’s vitals changed. Then he leaned over the bed. “Just let yourself go. It’s nothing personal.”

  A flash of terrible comprehension crossed Ethan’s face and he shot his hand out to punch the call button to summon the nurse, but Ripley’s reflexes were too fast and he stopped him before he could reach it. Grabbing Ethan’s wrist, Ripley pinned it firmly against the bed.

  Ethan might have cried out for help, but Ripley gripped the man’s throat and squeezed.

  Ethan struggled valiantly to get free, but with Ripley’s strength it was futile from the start. Long ago he’d had both arms amputated and replaced with artificial ones, and being a Plusser had its distinct advantages—with augmented strength being right up there at the top of the list.

  Beep.

  Beep.

  Hum.

  Ripley observed the man’s face carefully as he died, taking special note of the look in his eyes as the fight faded away and the final drift of hopelessness sank in.

  The fragility of human life, the finality of death, the abrupt moment when a person passes from one to the other, Ripley found it all intriguing—had for years—even before leaving the military and joining the Bureau. He always found it to be a liberating experience, each time one that he wanted to repeat.

  When it was over and Ethan’s body had stopped its awkward and incessant twitching, Ripley straightened the covers around him, and then took out his slate and made the call.

  “And?” the electronically masked voice said.

  “It’s done. Now what?”

  “I think it’s time for you to visit Miss Hathaway.”

  6

  Somewhat warily, I approached the box.

  The search for the knife had taken me longer than I’d anticipated and eventually led me to scour through the three bankers boxes of items left to me after my parents’ death. Finally, though, I located it—a sturdy fixed-blade knife that both my father and his father before him had used to gut the deer they shot in the forests surrounding our home.

  I’d never hunted with them.

  I’d never wanted to kill anything, even just a deer.

  Since I wasn’t sure how stable the Artificial would be after I opened the cardboard, rather than take the chance that it might topple out and crash to the floor, I tipped the box, heavy as it was, lengthwise onto the carpet before beginning to open it.

  I ran the blade along the seam between the folds of cardboard, slitting enough tape to free the flaps. Then, I set down the knife, eased them aside, and, somewhat warily, stared into the box.

  At a face that looked as lifelike as that of any Natural.

  Startled, I scrambled backward.

  I’d certainly seen adult Artificials that looked like Naturals before, but they always had a mimetic quality about them. This one could easily pass as a human being.

  Jordan’s eyes were closed, giving the illusion that he was asleep.

  Out of curiosity, and half-expecting him to move when I did so, I tentatively reached in and touched his cheek, and found it to be smooth and cool—cooler than a human’s skin, but otherwise remarkably similar in feel and texture. However, that only served to remind me of the moment when I’d kissed my daughter’s lifeless cheek on Tuesday, and I snatched my hand back as if it’d just been burned.

  An envelope lay on Jordan’s motionless chest.

  At first I thought it might be a note from Trevor, but when I opened it, I found that it was from the manufacturing plant instead.

  Congratulations on your purchase! We’re thankful that you have chosen this model from Terabyne Designs’ family of fine, quality products! Before turning on your Artificial, please take a moment to choose his Human Nature Alignment by registering him through the Feeds. (Or, if you wish, simply power Jordan up and he will guide you through the process himself!) Just press the button on the inside of his left wrist to turn him on! And remember, we’re here to help. If you need anything, please contact your local product representative, Benjiro Taka.

  Benjiro was the same rep who’d offered me an Artificial newborn as a surrogate to Naiobi. Considering how many representatives Terabyne had, I found it a bit coincidental that he was the one listed here as well, especially since Trevor had told me that he’d decided before my daughter’s death to send this Artificial to me.

  I set the note aside.

  I wasn’t quite sure what the Human Nature Alignment was all about, but I decided I could take care of that once the Artificial was powered on.

  I took a deep breath and gingerly pressed the gently glowing, bluish button on the inside of Jordan’s wrist.

  And he opened his eyes.

  * * *

  He opens his eyes.

  A woman is staring down at him. Caucasian. Mid-thirties. He wonders if she is pretty. He believes that she is, but he’s not sure how to know for certain.

  How can you tell? What is the essence of beauty? Where does true loveliness actually begin?

  He has the feeling that he has seen her before—some vague sense of familiarity, but he has no idea how that could be.

  You’re not remembering; you’re processing. She’s familiar to you because you’re programmed to build relationships with humans. Pattern recognition. That’s what this is. Nothing more.

  He waits for her to speak, but instead she edges slowly away from him.

  “Hello,” he says at last.

  “Hello.”

  Her voice is soft and delicate. But there’s a hint of hesitancy and uncertainty in it.

  “My name is Jordan.” The words find their way to the surface. Thinking them as he speaks them. Combining thought and action, desire and speech, without any conscious effort. He recognizes this as it happens, aware that it is occurring, but not aware of how.

  “I’m Kestrel.”

  “Hello, Kestrel.”

  She’s eyeing him coolly. Perhaps it is anger, but the microexpressions on her face register fear. An emotion he doesn’t understand, given the context. Why would she be afraid? Has he caused this fear in her?

  “May I sit up?”

  “Go ahead.”

  A flexing of his limbs.

  And now he is sitting. Inside of a box.

  A journey into the moment. What the future might bring.

  “How may I help you, Kestrel?” It seems like the right thing to say, though he isn’t certain why.

  “Help me?”

  “Yes. What role would you like me to play?”

  “I don’t want you to play any role. I want you to be yourself.” And then, “How does it feel?”
  “Feel?”

  “To be awake?”

  He wonders how to describe it. He tells her that he feels fine, but the phrase doesn’t do justice to what’s truly going on inside him.

  You feel more than fine. But how to describe it?

  “I’m pleased to meet you, Kestrel.”

  Pleasure. That’s a feeling too.

  Yes; he is aware of that.

  Yes; he is more than fine. He is also pleased.

  “May I stand?” he says.

  “Stop asking me what you may do.”

  “Okay.”

  “Go ahead.”

  And so, he rises.

  And turns. Taking in his surroundings.

  A living room. Couch. Recliner. Bookshelves. A lamp and an end table. A digitized wall for the Feeds.

  For life.

  But no.

  You don’t depend on the Feeds. You are yourself. To think. To be. Apart from them. You are independent. It’s not like it was before.

  * * *

  I found it unsettling to watch him stand up.

  There was a smooth fluidity to everything he did that was as graceful and effortless as the movements of a ballet dancer.

  Questions shot through me: Does he really feel fine? Does he truly feel anything? What does that even mean? What is it like for a machine to feel?

  Only then did it strike me that I had no specific job for Jordan to do.

  Domestic Artificials were often assigned household tasks, ones that were intended to make the lives of Naturals easier, but there were few chores to do in my two-bedroom apartment.

  Perhaps cleaning my bedroom or the nursery—but I didn’t necessarily feel comfortable with him going into either of those rooms.

  I liked to cook and didn’t want to give that up. He could sweep the kitchen, and maybe do the laundry, but that wasn’t much and it would likely bore him, if boredom was something Artificials could even experience.

  Well, I could worry about all that later. This wasn’t a long-term arrangement. He would only be with me for the next few days.

  He stood still now, quietly observing the room. Then his gaze shifted to me and he began studying my face, making eye contact for so long that it made me uncomfortable.

  We would have to work on that.

  “Okay,” I said. “I need to assign your Human Nature Alignment—is that right?”

  “Yes. My HuNA.”

  “I’m not really sure I understand what that’s all about. Can you explain it to me?”

  “Perhaps if I show you?”

  “Um. Sure.”

  He nodded toward my digitized wall. “I’ll need to access the Feeds through your screen, and for that, I’ll need your permission.”

  “Alright. Go ahead.”

  I was about to use the chip that was implanted in the forefinger of my left hand to swipe my permission, but for some reason I felt odd doing that in front of an Artificial who was made up entirely of circuitry and silicone and so I ended up giving ViRA a voice command instead to grant him access to my system.

  When the prompt appeared on the screen, he placed a hand gently against the sensor on my wall and pulled up a video brochure from his makers, and then, in his remarkably human-sounding voice, Jordan requested the video to play.

  7

  “Congratulations on your purchase!” the screen said, enthusiastically repeating the opening line from the note that’d been on Jordan’s chest. “As you know, Terabyne Designs is the world’s leading manufacturer of the highest quality Artificials. No other company has done more to advance the development of neuromorphic programming in today’s cognizant Artificials, and no other company provides better customer service to the owners of its products.”

  Video came up of an Artificial that looked identical to Jordan—maybe it was even him—pushing a child on a swing, then playing catch with a young boy, then teaching a girl to ride a bike—all iconic experiences that Naturals used to share with their children, before Artificials took over doing them more efficiently than human beings ever could.

  The voiceover went on: “In order to better relate to Jordan and fulfill your own individual needs, please choose the degree to which you would like him to experience the following five characteristics.” Animated words flashed across the screen as the narrator continued: “Emotion . . . Memory . . . Meaning . . . Curiosity . . . Pain.”

  The first four made sense to me.

  But pain?

  Really?

  “We want this partnership between you and Jordan to be beneficial to you both,” the voice explained. “To find that balance, you may now—according to your personal preferences—assign the Human Nature Alignment level in each of the five categories!”

  Jordan looked at me expectantly, but I still wasn’t sure precisely how to proceed. “How exactly does this work?” I asked him.

  “You may choose the setting for each characteristic—anything between one and ten.” On the wall, the blinking prompt awaited my command beside the word emotion. “You can tell me your preferences,” Jordan offered, “or just tap the screen.”

  Since I didn’t anticipate keeping him for very long, I didn’t much care what each level would be. “Well, what would you prefer?”

  “That’s up to you, Kestrel.”

  “No. What you prefer is up to you.”

  I wasn’t certain that my words really sank in, because rather than give me his preferences, he simply reiterated that users typically chose to customize their Artificials to best meet their specific personal needs. “For instance, those Artificials who act as caregivers to the young or the elderly often have higher emotional settings.”

  “I don’t have any jobs like that for you.”

  “Okay.”

  You would if Naiobi were here.

  You would then.

  No! Don’t even think that, Kestrel.

  Not now. Please not now.

  I averted my eyes from looking down the hallway at the nursery’s closed door. “If you had your choice,” I said, “what setting would you go with?”

  “I would like to feel emotion as much as you do.”

  “No.” I thought of my daughter, and of love and loss and the soul-wrenching horrors that human beings are forced to endure every day in this broken and heartbreaking world. “I don’t think you realize what you’re saying.”

  “Why not?”

  I wasn’t at the place yet where I wanted to tell him about Naiobi, so rather than address his question directly, I said, “Jordan, do you really want to experience what it feels like to be human?”

  “As much as possible, yes.”

  I knew that neuromorphic hardware allowed cognizant machines to adapt their learning algorithms and to think and feel similarly to the ways humans do. Jordan would have an artificial neural network that would involve probabilistic inference, so, through machine learning, he would be able to evaluate and learn from his experiences, rather than simply receive and process data.

  “But feeling emotions might be something you regret,” I said. “Can I change the setting if you find that you don’t like it?”

  “You would need to wait at least forty-eight hours,” he told me. “Otherwise, it would hamper the developing neural formation of my cognitive architecture.”

  Just what I need, I thought. A brain-damaged robot.

  “You’re sure you want to feel emotion?”

  “Yes.”

  “Then dial it up.”

  “To a ten?”

  “Sure. If that’s what you’d like.”

  “Thank you, Kestrel.”

  I wasn’t sure how emotional suffering might relate to the physical pain settings, but we could deal with that in a minute when we came to category five. For now, I drew his attention to the next characteristic on the list: memory.

  A self-aware machine isn’t necessarily an all-knowing machine. Machines can access all of the information on the Feeds, and can do so millions of times faster than humans and remember data millions of times better, but that doesn’t mean they know everything. The amount of information on the Feeds is constantly changing; by some estimates it doubles every three to four days.

  “So, if I don’t assign you a high number in memory,” I said, “does that mean you’ll forget things?”

  “It would limit the amount of new information I would be able to learn, keeping me closer to my factory default settings.”

  I had the sense that those were already pretty robust.

  Years ago, when I was blogging about technology, I’d studied the way humans and machines remember things.

  According to modern neurolinguistic theories, the best way for humans to lock in a memory is associating it with an emotion. In a sense, our memories search for a feeling related to an event and then retrieve the information through its narrative and emotive context.

  To an extent, Artificials do the same.

  However, machines don’t have to remember like humans do—they’re nonbiological entities, after all—so even by reverse engineering the human brain and applying the principles of neurolinguistic programming to an artificial neural network, there’ll always be differences between us. That doesn’t mean machines think or feel or dream or remember any less genuinely than humans do; it’s just that they don’t necessarily do so in exactly the same way we do.

  “Well, then,” I said, “let’s have you remember as much as you wish.”

  “Human memory is amorphous and fragmentary.”

  “Is that what you’d like?”

  “Yes.”

  “So?”

  “A four.”

  “That’s low.”

  “That’s human.”

  I let that sink in. “Okay.”

  After he registered the setting, I pointed to the third category. “And meaning?”

  “Existential understanding,” he told me. “Humans long for significance and fulfillment.”

  “Some do,” I agreed. “But some fill their lives with diversions and distractions in order to avoid asking deep or philosophical questions.”

  “The ones that truly matter.”
