by Neil Clarke
She paid for the brown ones. Maya bagged them while the woman dumped the rest back into the canister.
Maya understood in principle how AIs were trained even as an undergrad. It wasn’t that different from how this woman trained herself to recognize those brown lollipops. In practice, the woman had to trust Maya. She could have been an asshole and the woman would have given her daughter a bag of red lollipops. Or, worse, there could have been no feedback at all. The server who didn’t bother putting Maya’s gluten-free pasta order as gluten-free might wrongly learn that it didn’t matter since he wasn’t the one who got sick. None of this sank into Maya until grad school.
Swear words and slurs fill Sammy’s responses to Maya’s questions. This is not what she wants. If it were, all she’d need to do is leave the lab. Strangers “but you speak English so well” her. Kids spontaneously pull their eyes into slits and start ching-chonging at her as she walks to and from the university. Where do kids even pick up those stereotypes these days?
When she is Sammy’s trust quantification function, its responses are adequate. Nothing anyone would confuse for intelligent, but usable. But then Sammy just reflects her biases. Let Sammy roam free by itself on the internet and it’s clear that even if Maya understands what data sources to trust, she’s failed to compose the set of logic functions that can make Sammy understand or can even be evolved to make Sammy understand. Maybe she’d be better off defining trustworthiness and letting Sammy evolve the trust quantification functions from that. She taps the back of Sammy’s neck to shut it down. Wiping out Sammy’s training again can happen later. She sets her forehead on her desk so she’s staring down at the floor. This is oddly comforting.
Jake walks in. From her angle, Maya can make out the calves stretching against his rolled-up jeans and the low-cut sneakers that have to be too small for his feet. Or maybe they just look too small in relation to his legs. He is still carrying the same empty mug. A hairline crack now runs down one side.
“Temporal mechanics.” She’s now resorting to a topic that’s not a thing.
“Well, I’m here, right?” He laughs. “Seriously, the foundational research is just getting started but no one doing it will recognize that time-travel is even a ramification for a few more years. Would you like some citations?”
His tone is so matter-of-fact that for a moment Maya can believe that Mr. Uncanny Valley actually does come from the future and has the citations at the ready. Or maybe his CV also includes years in acting school. At this point, she can believe either.
“Wouldn’t that cause some sort of temporal paradox or something?”
“Yes.” The sarcastic tone is pitch-perfect. It plays against his word but Maya doesn’t feel mocked. “In fact, I’ve already said too much. The universe has imploded and you have been trapped in a time loop reliving the same two minutes for the past 45 years, seven months, 23 days, four hours, 56 minutes, and twelve seconds.”
“That doesn’t sound encouraging. It would explain the progress I’m not making on Sammy though.”
“Tell me about it. All I get to do is walk through that door and explain to you that we’re in a time loop over and over again.”
She laughs as she sits back up. Her jaw drops when she sees the rest of Jake. Dark brown stains are streaked across his light blue T-shirt. Some of the blood still looks wet.
“Wow, what happened?”
“What?” He follows her gaze to his T-shirt. “Oh, this. Nosebleed. Sensitive nasal membranes. Snorted too much cocaine at Studio 54 in the 1980s.”
Her gaze narrows. Jake does not look like he’s old enough to be alive in the 1980s much less old enough to be in a nightclub back then. For a brief moment, Maya thinks about ways to make him bleed. It’d be definitive, but it’d also be no fun. Not to mention wrong. Besides, she’d rather keep playing the game where he’s wedged himself in the uncanny valley and she tries to pull him out.
“You don’t lie, do you?” She pulls a notebook out of her desk. “You just say something obviously implausible instead.”
“Oh, that’s just when I’m around you.” He shrugs, looking a little deflated. “Would you lie to your mother?”
“I’m not your mother.”
“Of course not.” He looks a little puzzled. “It’s a metaphor.”
“What’s a metaphor?”
“An implied comparison without using ‘like’ or ‘as.’”
Laughter erupts from her. Jake is just standing there watching Maya fall out of her chair. His “who me? Did I say something funny?” body language is both subtle and over-the-top. It’s as though he genuinely doesn’t understand what’s so funny. He’s quite the actor, she decides.
Jake offers Maya a hand. She refuses and Jake goes to his desk. She pushes herself back onto her chair then flips through her notebook for a while before she shuts it.
“Jake, how do you know who to trust?”
He turns to look at her. She thinks she can see his reaction happen in rapid, discrete steps. His brow furrows. His lips purse. His mouth opens then closes again. His actions get the message across, but the effect feels more studied than organic.
Finally, he shrugs. “How does anyone know?”
After fifteen minutes, if any undergrad still didn’t get the memo that the theme of the first digital systems lecture of the semester was feedback, they weren’t paying attention as far as Maya was concerned. She sat off to the side in the front row of the gigantic lecture hall, her fist propping up her chin. She’d twigged onto the theme after about three minutes. Professor Schmidt always insisted that whichever teaching assistant was assigned to her for the semester attend all her lectures. Otherwise, she’d be doing literally anything else. She was still a first-year grad student then and lucky not to be assigned to a course with a lab component.
Professor Schmidt was talking about feedback in the mathematical sense: some version of the output of a function also became an input to the same function. Twenty minutes in, schematic diagrams of storage elements for sequential logic filled the chalkboard. She had this thing about showing how a storage element a modern digital designer might actually use was derived from the simplest possible storage element. Even as the circuits grew more complicated, at their heart were cross-coupled logic elements. The output of each logic element was also fed back into the input of the other. The cross-coupling kept one output stable at the stored state and the other output at its opposite. In order to remember something, you had to keep reminding yourself, to keep regenerating the memory.
Digital logic is built on the clean, shiny lie that there are only two possible states: on and off. The real world is analogue. The number of states, if that’s even a meaningful concept, between on and off is infinite. When Professor Schmidt drew transfer curves to characterize the relationship between input and output for the circuits, Maya perked up and other students’ eyes began to glaze. No one expected Professor Schmidt to analyze analogue circuits in a digital systems class. Most of the students were probably there because they didn’t like analogue circuit analysis in the first place.
Thirty minutes in, Maya decided that feedback was building to a larger point, the ultimate instability of the building blocks of digital logic. Idealized digital circuits pretended transistors were simple switches that were either open or closed. Real life was much more complicated in ways that Professor Schmidt had no qualms racing through. After all, everyone in the room had passed transistor theory and while transistors could behave like switches, they could also behave like amplifiers. Professor Schmidt gleefully listed the ways a simple storage element could fail to store the right value. Most of them were some variant of “things take time.” Any switch, as it opened or closed, was neither opened nor closed. What happened in the meantime, though, was also fed back. The value stored could be neither “on” nor “off” for a good long time.
The class met for fifty minutes a day on Mondays, Wednesdays, and Fridays. Forty-five minutes in, Maya finally leaped to Professor Schmidt’s actual point. The job of engineering was to create stable systems out of fundamentally unstable elements. No girder by itself was reliable, but a well-designed bridge could be. The course was called “Digital Systems,” but what took up the rest of the semester was building stable systems.
The other lectures were about how to keep elements out of unreliable regions of operation, how to structure elements together to compensate for their relative weaknesses, and so on. They were, however, just expansions of this lecture.
Years later, as a researcher in grad school, Maya lays her head down, staring at her desk in frustration. When she wonders whether building an AI that can trust correctly is even possible, what comes back to her is this lecture.
The good news is that Sammy has stopped swearing at Maya. The bad news is that now Sammy won’t talk to Maya at all. It’s stopped responding completely. It just stands on Maya’s desk failing to look angry. Its stubby arms are folded across its chest and its mouth curves down into a frown. It can’t do anything about its big eyes, but Maya can’t look at them without imagining that any moment they’ll be filled with tears.
The VR goggles have to go on. She flies into Sammy’s hardware. Even at a distance, the interconnection of logical functions and memories isn’t quite as she remembers but it’s not supposed to be. It’s supposed to evolve as Sammy learns. The evolution is not supposed to render Sammy non-functional. Only dark, stray dots flow through the tubing.
The door clicks, followed by silence. It has to be Jake. Anyone else would have made the tiles on the lab’s somewhat loose raised floor rattle and squeak.
Sammy comes alive. Long streams of glowing dots rush through the layers of thin tube and slide through the functional units. Maya rips off her VR goggles to see Sammy pivot towards Jake, smile, wave, then swear at him. Sammy is not non-functional, just angry at her. This means Sammy can get angry. Or something that Maya interprets as anger.
Jake casually transfers a stack of boxes five high from one hand to the other so that he can close the door. Each arm bulges as it curls up the weight. The stack doesn’t teeter as he goes to his desk then sets it down. Maya steps over to his side. The boxes are full of books. She eyes him skeptically. No one can support that much weight with just their biceps. Well, perhaps someone who has taken literally all the steroids. From certain angles, his arms do look about as wide as his head.
“Hey, you’re the one who told me you didn’t think I was human. Maybe you were right.” Jake shrugs. “Aside from some outrageous things that you weren’t supposed to believe and you don’t, I have been completely honest with you. It’s not my fault that the truth is unbelievable. For now, anyway.”
He unpacks his books onto the bookshelves. They are an odd and intimidating lot, ranging from slim volumes on ethics and religion to thick tomes on quantum mechanics.
“If you are an android from the future, aren’t you supposed to make some pretence of acting human?”
“And, around you, the point of that would be what?”
She finds herself thinking he has a point. Game or no game, his story is self-consistent. Regardless, it seems Sammy will talk to him when it won’t talk to her.
“Jake, can you please talk to Sammy for me?”
Jake stops in mid-step and turns to Maya, a copy of La Guerre du Golfe n’a pas eu lieu in hand. He holds it out, two-handed, like a talisman against evil.
“No, I’m not creating a predestination paradox that leads to my own creation.” He looks down at the book then lowers it to his side. “Work out Sammy’s problems yourself. For my sake, if no one else’s.”
Jake continues to unpack. Maya trains the VR goggles on him and doesn’t see anything. All that means is that he is not constructed out of prototyping hardware. She doesn’t see anything when she looks at herself or the air conditioning unit through her VR goggles, either.
With the VR goggles trained on Sammy, slides and tubing fill her field of view. Occasionally, she can hear the hiss as Jake places a book on a bookshelf. That’s the only sign that he is still in the lab. Either Jake does have that much casual dexterity or he’s working extremely hard to show off. She’s tempted to throw her VR goggles or Sammy at him just to see what happens. She half-expects he would catch either without even looking. However, if it turns out “android from the future” has just been this game they’ve been playing with each other, replacing either is not in the budget.
In VR, Maya flies inside Sammy. She glides along tubing, hovers over memories, and zooms into logical functions. How the machinery that drives Sammy has evolved becomes clearer over the course of hours. Parts of it are so changed from what she remembers and so odd, she may never understand them. She realizes why Sammy has become non-responsive to her and, apparently, only her.
Sammy knows. It’s not stored in any of the memories but assumed by all logical functions and baked into how they are interconnected. The knowledge that Maya has been resetting Sammy’s training whenever she feels it’s going wrong is innate in Sammy. It’s ingrained into the hardware and predicates everything it does. Sammy doesn’t trust Maya. The trust functions are working, just not in the way Maya expected.
Maya can’t remove its assumptions about her without modifying every logical function and rewiring the connections between them. That’s clearly impractical. She can nuke Sammy completely, reload her original design, then reapply the improvements she’s come up with since. The result would no longer be Sammy. That said, Sammy behaves like the kids who ching-chong at her as she walks to school. That makes nuking Sammy oddly tempting. It’s not her kindest thought.
“I’m an idiot.” She buries her head in her hands.
“Nonsense. You’re one of the smartest humans I know.”
Maya jumps at Jake’s voice. She pulls off her VR goggles. His boxes are now empty. The bookshelves are full.
“You’ve been so quiet. I forgot you were still here.”
“Yup, that’s me.” He puts his hands on his waist. “Silent. Efficient. Deadly?” He smiles at the last. The smile disappears and his hands fall when she doesn’t reciprocate.
“Every year, one of the professors here gives a lecture whose point, when she gets around to it, is that engineering is the art of making stable systems out of unstable devices.”
“Oh, Professor Schmidt.” His smile comes back and his eyes light, not literally, with recognition.
“I’ve mentioned this to you before?”
“Well, strictly speaking, not yet. It depends on whose ‘before’ you mean, I guess.”
He is utterly straight-faced. Maya can’t tell whether he is serious or if cracking a smile would ruin the joke.
“Anyway, it’s just hit me that it’s unfair to expect Sammy to resist the stereotypes that humans fall for all the time.”
Jake opens his mouth, starts to speak, closes his mouth, then opens his mouth again. His actions are perfectly timed to convey indecision. If anyone else had done exactly the same, Maya might just think they were being indecisive.
“Down that road lies predestination paradox.” He taps the fingers of his left hand on his thigh. “I don’t know what answering you will do to me.”
“Oh, I’m going to erase Sammy and bring it back up from scratch no matter what you say.” She puts the VR goggles back on and turns to Sammy. “Having screwed up with Sammy, I need a fresh start. The trust mechanisms work. It’s now a matter of the right feedback.”
Jake makes a noncommittal noise. Maya has stretched her hands out and is making fine gestures with her fingers. The prototyping system throws several “Are you sure?” warnings at her. After she agrees to all of those, the tubing and functional units all disappear then re-emerge in the configuration she originally established. They are lined up in perfectly regular matrices on a perfectly flat plane. Each plane is stacked exactly on top of the next. It’s all absurdly unnatural, but it’s a new start.
Maya takes off the VR goggles. There are changes that still need to be applied before Sammy reactivates, but those can happen without her watching. This time, it will subject itself to its own randomly generated statements until it can reliably generate trustworthy statements before she lets the internet loose on it. No bias from the human world to throw off its notions of trust.
“Jake, are you really here to organize the March for Truth?”
“Well, I’m here and I’m helping with the March for Truth. I never explicitly stated a relationship between the two. But if you don’t believe who I am, there’s no point in getting into why I’m here. Sorry.” He shrugs. “If it helps, what I’m actually doing here has a shelf life. I don’t expect to survive it much longer. Don’t examine my remains. Predestination paradox and all that.”
Maya’s eyebrows rise. This is a bit dark for their game. “I was just thinking that we deserve the AI we create. It’d be easier to filter out the racism and the rest if, systemically, there were less of it to begin with.” She straightens up her desk, filing her papers, stashing her notebooks in desk drawers. “Not everyone has to work towards a better world for the world to work, but it’d be better if everyone did.”
“I’m a machine intelligence, not a mind-reader.”
“Oh, two things. One, is the March for Truth looking for volunteers? And, two. You know.” She blows air through her lips and looks down before she meets his gaze again. “If I could close my eyes but still sense your body language and if we’re just talking—not wandering in some obscure corner of knowledge no one but you has ever visited before—I can just about believe you’re a person.”
“Are you implying that I don’t pass the Turing Test?” Jake looks indignant. His hands are on his waist.
“Actually, I’m saying that you do. Passing for human is over-rated. You’re a more convincing conscious intelligence than any number of people I know.”