
Fantasy & Science Fiction - Jan/Feb 2017


by Spilogale Inc.


  Valentin went to where Pepe was sleeping, rummaging the medicine kit out of his new case. The wilder was on his side, showing only the perfect side of his face, the faultless bones and dark lashes. Valentin touched his chin, turning his head. The jagged smile reappeared and Pepe's eyes flicked open.

  "I got disinfectant for your hand," Valentin said.

  "From the autofab? The god spoke to you?" His voice was hoarse with sleep.

  "I'm a prophet, aren't I?" Valentin shook the tube of disinfectant spray. "This is going to sting a bit."

  Valentin helped him wrap his hand and sling it up as he told him, in fragments, about the conversation with the god in the autofab. The whisper in his implant grew louder and louder, and by the time they stole away into the night, heading north to the band's last campsite to give them the warning, it was a chorus of furious voices.

  Valentin had his own concerns.

  * * *

  The Regression Test

  By Wole Talabi | 3964 words

  We asked Wole Talabi about his inspiration for this next piece. "The idea for this story came to me," he explained, "while I was trying to rederive an equation for fluid flow that was originally designed for liquids, in order to adapt it for cases where the medium is constantly transitioning from liquid to gas as it moves. I eventually found the Sorites paradox and then, reading further, I started trying to apply that idea to slow, gradual change in the behavior of people."

  The Nigerian-born Talabi has degrees in chemical and petroleum engineering—he currently works as an engineer in Malaysia—and a growing bibliography of both academic papers and science fiction credits. He recently finished editing Lights Out: Resurrection, an anthology of horror stories by African writers that was published last October.

  THE CONFERENCE ROOM IS white, spacious, and ugly.

  Not ugly in any particular sort of way: it doesn't have garish furniture or out-of-place art or vomit-colored walls or anything like that. It's actually quite plain. It's just that everything in it looks furfuraceous, like the skin of some diseased albino animal, as if everything is made of barely attached bleached Bran Flakes. I know that's how all modern furnishing looks now—SlatTex, they call it—especially in these high-tech offices where the walls, doors, windows, and even some pieces of furniture are designed to integrate physically, but I still find it off-putting. I want to get this over with and leave the room as soon as possible. Return to my nice two-hundred-year-old brick bungalow in Ajah where the walls still look like real walls, not futuristic leper-skin.

  "So you understand why you're here and what you need to do, Madam?" Dr. Dimeji asks me.

  I force myself to smile and say, "Of course—I'm here as a human control for the regression test."

  Dr. Dimeji does not smile back. The man reminds me of an agama lizard. His face is elongated, reptilian, and there is something that resembles a bony ridge running through the middle of his skull from front to back. His eyes are sunken but always darting about, looking at multiple things, never really focused on me. The electric-blue circle ringing one iris confirms that he has a sensory-augmentation implant.

  " Sorites regression test," he corrects, as though the precise specification is important or I don't know what it is called. Which I certainly do—I pored over the damn data-pack they gave me until all the meaningless technobabble in it eventually made some sense.

  I roll my eyes. "Yes, I'm here as a human control for the sorites regression test."

  "Good," he says, pointing at a black bead with a red eye that is probably a recording device set in the middle of the conference room table. "When you are ready, I need you to state your name, age, index, and the reason why you are here today while looking directly at that. Can you do that for me, Madam?"

  He might be a professor of memrionics or whatever they're calling this version of their A.I. nonsense these days, but he is much younger than I, by at least seven decades, probably more. Someone should have taught him to say "please" and to lose that condescending tone of voice when addressing his elders. His sour attitude matches his sour face, just like my grandson Tunji, who is now executive director of the research division of LegbaTech. He's always scowling, too, even at family functions, perpetually obsessed with some work thing or other. These children of today take themselves too seriously. Tunji's even become religious now. Goes to church every Sunday, I hear. I don't know how my daughter and her husband managed to raise such a child.

  "I'll be just outside observing if you need anything," Dr. Dimeji says as he opens the door. I nod so I don't accidentally say something caustic to him about his home training or lack thereof. He shuts the door behind him and I hear a lock click into place. That strikes me as odd but I ignore it. I want to get this over with quickly.

  "My name is Titilope Ajimobi," I say, remembering my briefing instructions advising me to give as much detail as possible. "I am one hundred and sixteen years old. Sentient Entity Index Number HM033-2021-HK76776. Today I am in the Eko Atlantic office of LegbaTech Industries as the human control for a sorites regression test."

  "Thank you, Mrs. Ajimobi," a female voice says to me from everywhere in the room, the characteristic nonlocation of an ever-present A.I. "Regression test initiated."

  I lean back in my chair. The air conditioning makes me lick my lips. For all their sophistication, hospitality A.I.s never find the ideal room temperature for human comfort. They can't understand that it's not the calculated optimum. With human desires, it rarely is. It's always just a little bit off. My mother used to say that a lot.

  Across the conference room, lines of light flicker to life and begin to dance in sharp, apparently random motions. The lights halt, disappear, and then around the table, where chairs like mine might have been placed, eight smooth, black, rectangular monoliths rise, slowly, as if being extruded from the floor itself. I don't bother moving my own chair to see where they are coming from; it doesn't matter. The slabs grow to about seven feet tall, then stop.

  The one directly across from me projects onto the table a red-light matrix of symbols and characters so intricate and dense it looks like abstract art. The matrix is three-dimensional, mathematically speaking, and within its elements patterns emerge, complex and beautiful, mesmerizing in their way. The patterns are changing so quickly that they give the illusion of stability, which adds to the beauty of the projection. This slab is putting on a display. I assume it must be the casing for the memrionic copy being regression tested.

  A sorites regression test is designed to determine whether an artificial intelligence created by extrapolating and context-optimizing recorded versions of a particular human's thought patterns has deviated too far from the way the original person would think. Essentially, several previous versions of the record—backups with less learning experience—interrogate the most recent update in order to ascertain whether they agree on a wide range of mathematical, phenomenological, and philosophical questions, not just in answer, but also in cognitive approach to deriving and presenting a response. At the end of the experiment, the previous versions judge whether the new version's answers are close enough to those they would give for the update to still be considered "them," or could only have been produced by a completely different entity. The test usually concludes with a person who knew the original human subject—me, in this case—asking the A.I. questions to determine the same thing. Or, as Tunji summarized once, the test verifies that the A.I., at its core, remains recognizable to itself and others, even as it continuously improves.

  The seven other slabs each focus a single stream of yellow light into the heart of the red matrix. I guess they are trying to read it. The matrix expands as the beams of light crawl through it, ballooning in the center and fragmenting suddenly, exploding to four times its original size, then folding around itself into something I vaguely recognize as a hypercube from when I still used to enjoy mathematics enough to try to understand this sort of thing. The slabs' fascinating light display now occupies more than half of the table's surface and I am no longer sure what I am looking at. I am still completely ensorcelled by it when the A.I. reminds me why I am here.

  "Mrs. Ajimobi, please ask your mother a question."

  I snap to attention, startled at the sentence before I remember the detailed instructions from my briefing. Despite them, I am skeptical about the value of the part I am to play in all this.

  "Who are you?" I ask, even though I am not supposed to.

  The light matrix reconstructs itself, its elements flowing rapidly and then stilling, like hot water poured onto ice. Then a voice I can only describe as a glassy, brittle version of my mother's replies.

  "I am Olusola Ajimobi."

  I gasp. For all its artifice, the sound strikes at my most tender and delicate memories and I almost shed a tear. That voice is too familiar. That voice used to read me stories about the tortoise while she braided my hair, each word echoing throughout our house. That voice used to call to me from downstairs, telling me to hurry up so I wouldn't be late for school. That voice screamed at me when I told her I was dropping out of my Ph.D. program to take a job in Cape Town. That voice answered Global Network News interview questions intelligently and measuredly, if a bit impatiently. That voice whispered, "She's beautiful," into my ear at the hospital when my darling Simioluwa was born and I held her in my arms for the first time. That voice told me to leave her alone when I suggested she retire after her first heart attack. It's funny how one stimulus can trigger so much memory and emotion.

  I sit up in my chair, drawing my knees together, and try to see this for what it is: a technical evaluation of software performance. My mother, Olusola Ajimobi—"Africa's answer to Einstein," as the magazines liked to call her—has been dead thirty-eight years and her memrionic copies have been providing research advice and guidance to LegbaTech for forty. This A.I., created after her third heart attack, is not her. It is nothing but a template of her memory and thought patterns which has had many years to diverge from her original scan. That potential divergence is what has brought me here today.

  When Tunji first contacted me, he told me that his team at LegbaTech has discovered a promising new research direction—one they cannot tell me anything about, of course—for which they are trying to secure funding. The review board thinks this research direction is based on flawed thinking and has recommended it not be pursued. My mother's memrionic copy insists that it should. It will cost billions of Naira just to test its basic assumptions. They need my help to decide if this memrionic is still representative of my mother, or whether it has diverged so much that it is making decisions and judgment calls of which she would never have approved. My briefing instructions told me to begin by revisiting philosophical discussions or debates we had in the past to see if her positions or attitudes toward key ideas have changed. I choose the origins of the universe, something she used to enjoy speculating about.

  "How was the universe created?" I ask.

  "Current scientific consensus is—"

  "No," I interrupt quickly, surprised that her first response is to regurgitate standard answers. I'm not sure if A.I.s can believe anything and I'm not supposed to ask her questions about such things, but that's what the human control is for, right? To ask questions that the other A.I.s would never think to ask, to force this electronic extrapolation of my mother into untested territory and see if the simulated thought matrix holds up or breaks down. "Don't tell me what you think. Tell me what you believe ."

  There is a brief pause. If this were really my mother she'd be smiling by now, relishing the discussion. And then that voice speaks again: "I believe that, given current scientific understanding and available data, we cannot know how the universe was created. In fact, I believe we will never be able to know. For every source we find, there will be a question regarding its own source. If we discover a god, we must then ask how this god came to be. If we trace the expanding universe back to a single superparticle, we must then ask how this particle came to be. And so on. Therefore, I believe it is unknowable and will be so indefinitely."

  I find it impressive how familiarly the argument is presented without exact parroting. I am also reminded of how uncomfortable my mother always was around Creationists. She actively hated religion, the result of being raised by an Evangelical Christian family who demanded faith from her when she sought verifiable facts.

  "So you believe god could exist?"

  "It is within the realm of possibility, though highly unlikely." Another familiar answer with a paraphrastic twist.

  "Do you believe in magic?"

  It is a trick question. My mother loved watching magicians and magic tricks but certainly never believed in real magic.

  "No magical event has ever been recorded. Cameras are ubiquitous in the modern world and yet not a single verifiable piece of footage of genuine, repeatable magic has ever been produced. Therefore it is reasonable to conclude, given the improbability of this, that there is no true magic."

  Close enough but lacking the playful tone with which my mother would have delivered her thoughts on such matters.

  I decide that pop philosophy is too closely linked to actual brain patterns for me to detect any major differences by asking those questions. If there is a deviation, it is more likely to be emotional. That is the most unstable solution space of the human equation.

  "Do you like your great-grandson, Tunji?"

  Blunt, but provoking. Tunji never met his great-grandmother when she was alive and so there is no memory for the A.I. to base its response on. Its answer will have to be derived from whatever limited interaction he and the memrionic have engaged in and her strong natural tendency to dislike over-serious people. A tendency we shared. Tunji is my daughter's son and I love him as much as our blood demands, but he is an insufferable chore most of the time. I would expect my mother to agree.

  "Tunji is a perfectly capable executive director."

  I'm both disappointed and somehow impressed to hear an A.I. playing deflection games with vocabulary.

  "I have no doubt that he is," I say, watching the bright patterns in the light matrix shift and flow. "What I want to know is how you feel about him. Do you like him? Give me a simple yes or no."

  "Yes."

  That's unexpected. I sink into my chair. I was sure she would say no. Perhaps Tunji has spent more time interacting with this memrionic and building rapport with it than I thought. After all, everything this memrionic has experienced over the last forty years will have changed, however minutely, the system that alleges to represent my mother. A small variation in the elements of the thought matrix is assumed not to alter who she is fundamentally, her core way of thinking. But, like a heap of rice from which grains are removed one by one, over and over again, eventually all the rice will be gone and the heap will then obviously be a heap no more. As the process proceeds, is it even possible to know when the heap stops being, essentially, a heap? When it becomes something else? Does it ever? Who decides how many grains of rice defines a heap? Is it still a heap even when only a few grains of rice are all that remain of it? No? Then when exactly did it change from a heap of rice to a new thing that is not a heap of rice? When did this recording-of-my-mother change to not-a-recording-of-my-mother?

  I shake my head. I am falling into the philosophical paradox for which this test was named and designed to serve as a sort of solution. But the test depends on me making judgments based on forty-year-old memories of a very complicated woman. Am I still the same person I was when I knew her? I'm not even made of the exact same molecules as I was forty years ago. Nothing is constant. We are all in flux. Has my own personality drifted so much that I no longer have the ability to know what she would think? Or is something else going on here?

  "That's good to hear," I lie. "Tell me, what is the temperature in this room?"

  "It is twenty-one-point-two degrees Celsius." The glassy iteration of my mother's voice appears to have lost its emotional power over me.

  "Given my age and physical
condition, is this the ideal temperature for my comfort?"

  "Yes, this is the optimum."

  I force a deep breath in place of the snort that almost escapes me. "Olusola." I try once more, with feeling, giving my suspicions one more chance to commit hara-kiri. "If you were standing here now, beside me, with a control dock in your hand, what temperature would you set the room to?"

  "The current optimum—twenty-one-point-two degrees Celsius."

  There it is.

  "Thank you. I'm done with the regression test now."

  The electric-red hypercube matrix and yellow lines of light begin to shrink, as though being compressed back to their pretest positions, and then, mid-retraction, they disappear abruptly, as if they have simply been turned off. The beautiful kaleidoscope of numbers and symbols, flowing, flickering and flaring in fanciful fits, is gone, like a dream. Do old women dream of their electric mothers?

  I sigh.

  The slabs begin to sink back into the ground, and this time I shift my chair to see that they are descending into hatches, not being extruded from the floor as they would if they were made of SlatTex. They fall away from my sight leaving an eerie silence in their wake, and just like that, the regression test is over.

  I hear a click and the door opens about halfway. Dr. Dimeji enters, tablet in hand. "I think that went well," he says as he slides in. His motions are snake-like and creepy. Or maybe I'm just projecting. I wonder who else is observing me and what exactly they think just happened. I remember my data-pack explaining that regression tests are typically devised and conducted by teams of three but I haven't seen anyone except Dr. Dimeji since I entered the facility. Come to think of it, there was no one at reception, either. Odd.

  "Your questions were few, but good, as expected. A few philosophical ones, a few personal. I'm not sure where you were going with that last question about the temperature, but no matter. So tell me, in your opinion, Madam, on a scale of one to ten, how confident are you that the tested thought analog thinks like your mother?"

 
