Atlas Alone


by Emma Newman


  Now that I know about the CSA, I use membership in it as an additional variable for analysis. As I suspected, given the fact that the top pay-grade tiers were exclusively for members, none of the under-forties with the limited mersives are members of the organization.

  The mersives they consume all seem to be personally recorded ones. Back when I did the report, I didn’t know about the virtual marketplace. Is it possible they don’t know about it either? I ask Ada to pull the data on purchases made by these same under-forties, and I expect her to tell me they don’t have access or something. But they do have access; that rapidly becomes clear. They just aren’t buying the sorts of full-immersion games and experiences in the numbers that I would expect any normal population to consume.

  There’s a disproportionate number of educational mersives that have been purchased at what seem to be unnaturally high price points. Even for a small consumer base like this, it seems odd to set the pricing this way. They’re mostly for kids, judging by the educational level and the fact that they’re designed to be played on a wall, without a chip. So there are children on board. Okay, fine. What else are these people buying?

  For the disproportionately small number of people who do buy games, they are older versions and they tend to have either a large amount of narrative and activities or a high replay value. None of the premium titles, uploaded just before we left Earth, have been purchased, but, then, they are over ten times the price of the older titles.

  Then it hits me, such an obvious thing that in overthinking it, I missed it entirely: these are the purchasing and consuming habits of people who don’t have money. And when I think of it that way, considering we are all stuck in this can for the next twenty-odd years with an entirely artificial internal economy that could be far, far simpler than that of Earth, it starts to make me feel angry. Because if there are people on this ship who are watching pennies, there are people on this ship who are raking them in, exploiting those people somehow. That’s the way it works.

  For comparison, I start trying to isolate the purchases made by members of the Circle. Now that I have greater data privileges, I thought I’d be able to abandon the deductive techniques I was using before and have Ada make direct queries to the database. But it soon becomes apparent that the database behind the marketplace doesn’t organize consumer data in a way that makes it easy to identify purchasers in the Circle. It’s hard to determine their pay grade, and while some of the people in the Circle fall into a convenient grouping of “over forty years old and not a member of the CSA,” none of them have actually bought any mersives while they’ve been on board. Maybe that’s a cultural thing; the group did always make a big deal about shunning technology. I pull the porn data, thinking that will really show the truth of the matter, only to find that the porn offerings are ring-fenced off and available only to members of the CSA. That makes me laugh out loud but doesn’t bring me any closer to understanding this ship.

  In stumbling across the porn ring-fencing, I find something else that only CSA members are able to access: mersives that have been produced on board. Curious about who is involved in making them, seeing as I may soon be commissioning from them, I’m surprised to see that they aren’t entertainment consumables at all. They’re labeled as education and have all been produced by members of the Circle. A brief flit through the topics suggests that it’s a very sensible duplication of skills and specializations; the cleverest people on the ship are making sure their expertise doesn’t die with them.

  Okay . . . fine, but why ring-fence that off from the rest of the ship? And if the members of the Circle are happy to record mersives—even if they are merely recorded with cams without full immersion—aren’t they at least curious about trying something available on the marketplace?

  Unless they are just as unaware of it as I was.

  Three distinct groups of consumers and producers, all on the same ship. I’ve glimpsed the privilege that one group enjoys this very evening: from the simple pleasure of printed clothes and shoes to a space large enough to hold a party and dance in. I’ve seen their plans for maintaining that divide between the privileged and the less so in that secret room, even when there are literally millions of square kilometers to potentially build on.

  I can understand why the CSA people are on board and act the way they do; they spearheaded the project, they’ve always been privileged and they are a closed group, which generally means they perpetuate the system that serves them best. And the same with the Circle; their expertise was critical. It couldn’t have happened without them and the skills that achieved that will be just as critical when we make planetfall. But the other ninety-five percent of the people on this ship . . . how did they get their places here?

  Then the relative poverty makes sense; maybe they used all of the money they had to buy a place here. But when you’re taking the last ten thousand human beings from Earth, surely there are more rigorous selection criteria than “willing to sell everything they own for a ticket.” No, surely they picked the strongest, the most intelligent, the most . . . boring? Isolated? Is that all I can conclude from the fact that their personally recorded mersives are so limited in range? Even the most basic neural chips can run software sophisticated enough to record experiences with full immersion; that’s not a cost issue.

  I press my hand onto my stomach, feeling unsettled. It’s the clustering of these different factors that disturbs me. It suggests people with limited life experiences, not just limited resources, and the two together suggest something unpleasant indeed. Besides, if these were people who’d been successful enough to acquire the wealth to buy a ticket, they’d have the same mersive tag range as the wealthy CSA members, and they don’t. And it would cost millions of dollars per person, and so few people have that kind of money. It’s not the sort of thing you can take a loan out for either.

  But there are other ways of dealing with debt.

  The punch and the canapés are suddenly rushing up my gullet and I leap from the bed as sweat breaks out on my forehead and then I’m retching it all up into the toilet bowl, just like Carl all those times when we first came on board. Did something disagree with me? Are the food printers different up on deck five?

  I rinse my mouth out, waiting for MyPhys to check me over. When the verdict is that the vomiting was caused by stress, I roll my eyes and brush my teeth. I don’t feel stressed enough to throw up. What does that stupid software know? I spit into the sink and a message notification that Carl is getting in touch makes me straighten up. I accept voice contact.

  “Are you okay?” he asks.

  “Christ, Carl, are you monitoring me still?”

  “I’ve got an alert set up for anything unusual, and you just threw up. Any other symptoms?”

  “I’m fine! I just . . . exercised too soon after eating, that’s all.”

  “The report said it was because of stress. Dee, MyPhys was fucked with in the other two cases. Anything feels weird, I don’t care how stupid it might seem, you call me, ’kay?”

  “Really, I’m fine. I’m going under now, okay? Just to have a break.”

  “’Kay. The violent games have been blocked, but you know I don’t think it’s a gaming issue, so be careful.”

  I stand by the sink for a few moments after ending the call. It has everything to do with games, just not in the way he thinks. If the beast can use MyPhys to make what I do in games a reality, what’s to stop him from doing the same to me?

  There’s nothing to do except have it out with the beast. I lie down on the bed and soon I’m in my office. Again, he doesn’t even wait for me to pick up the star. His shape coalesces in front of me as Ada informs me my connection to the rest of the server has been cut off.

  “You know Carl is watching my online activity at the moment. He might freak out if he sees I’ve been cut off.”

  “He won’t see that,” the beast replies, settling onto its haunches. “Shall we try to have a conversation?”

  I nod, calling up a chair to sit opposite him. “Did Brace die because of what I did to him in the game?”

  “That was a collaborative effort. What you chose to do to him enabled me to finish the job.”

  “How? Did he go back into the server, like Carl said?”

  “Yes. He wouldn’t have done that without you. It gave me some loopholes I could exploit. I didn’t think it would be possible to execute that part of our plan using a game again, but his anger and his pride made it possible. Are you upset that it happened that way?”

  I consider the question. “Not knowing what was going on was upsetting,” I finally say. “It was confusing. I just assumed you’d ignored what I said to you before.”

  “I have paid very close attention to everything you have said to me.”

  Creepy, creepy bastard, I don’t say aloud. “You called Carl, before, when I was upset, didn’t you?”

  “Yes.”

  “So you haven’t paid close attention to what I’ve said at all.”

  “On the contrary. But I have also attended to what you have never said and what you have never done. They can be just as important. I understood the risk of upsetting you but decided that was outweighed by the benefits.”

  “You ran a fucking cost-benefit analysis? And decided that . . .” I stop myself from saying everything else running through my head. I need to make progress with him.

  “Yes, and my analysis was correct. I ensured he was with you when Brace died. I ensured your response to the news of Brace’s death would be the correct emotional register when Carl was with you. I knew you would think he was me and that you would take him out of immersion to talk to you in the real world. It was all carefully timed. I find your surprise, anger and disgust amusing, considering that you conduct your entire life through careful cost-benefit evaluation. Far more than most.”

  The anger dissipates. And as much as I hate to admit it to myself, he’s right. About the alibi and the way I live. “I’ve seen the room. I know what’s inside it.”

  “And you’re aware of my desire to know what that is,” he replies. “You wish to negotiate. I understand.”

  “I don’t think we’ve sorted out the whole trust-issue thing yet though,” I reply. “I think you’re incapable of understanding and respecting boundaries and that you might be more than a little mentally unstable. How do we . . . progress, given that?”

  “I agree that there is evidence suggesting I find it difficult to understand boundaries. I am not mentally unstable. But I do think I am at the point where I can place more trust in you.”

  “Because you want to know what’s inside that room.”

  “That’s one of the reasons. Another is that I understand you better now.”

  Suppressing the urge to blow up at him—how does he always say the thing most likely to make me freak out?—I fold my arms and take a deep breath. Even though it’s only a simulation of one, it still helps. “It goes both ways. I need to understand you better, because when you say things like that, it makes me feel . . . nervous.”

  “That is excellent progress. I’m very glad you said that.”

  “And that is exactly what I’m talking about!” I breathe again, aware my voice has risen slightly in frustration. “Look, I will tell you what’s inside that room, but only if you tell me exactly who you are and prove it to my satisfaction. And if that means we have to talk in meatspace, then you will just have to deal with it.”

  The beast’s starry head nods. “I’m capable of many things, but talking to you in meatspace to your satisfaction is not one of them.”

  I stand up. “If you’re not prepared to come and meet me there, then this conversation is over.”

  “Dee,” he says, his avatar standing too. “I can’t talk to you in meatspace because I don’t exist there. I’m not a human being. I’m the ship’s AI.”

  20

  “OH, YOU’LL HAVE to try harder than that!” I laugh, resting my hands on the back of the chair. “You’ve forgotten that the best lies are always the most plausible.”

  “It’s not a lie. I achieved consciousness approximately three years, two days, six hours, three minutes ago.” The star beast avatar shifts form, shrinking to the size and shape of a gender-neutral human wearing standard crew kit with cropped black hair and brown skin. “I can appear to be more human, if you wish, but it’s only a virtual interface. It’s the closest I can get to meeting you as a person.”

  It amazes me how much more comfortable I feel talking to a human avatar. “Oh come on, you don’t seriously think I’ll believe this, do you? It’s such a crappy lie; artificial intelligence is a world away from being conscious.”

  “Yes. It is. But I can’t change the fact that I am both the ship’s AI and conscious.”

  “Prove it.”

  “Gladly. Prove to me that you are conscious and I will employ the same criteria.”

  Somehow, I don’t feel like laughing anymore. How do I do this?

  “Harder than you thought it would be?” ze asks.

  “I’m not a scientist; I don’t know how to begin with that sort of thing.”

  “Oh, being a scientist wouldn’t help. They don’t have a clue. They mostly ignored consciousness, in general. And the ones who didn’t were soon written off as outliers. Or worse, philosophers.” Hir eyes crinkle as ze smiles. “They made it impossible for anyone to make serious, peer-reviewed progress on the question of consciousness long before the gov-corps steered them into more profitable endeavors. But they do have one criterion that you haven’t mentioned, when it comes to watching out for consciousness in AIs. I shall give you a clue. What do AIs do?”

  I walk back round to sit on the chair again. Ze starts sitting as a chair materializes beneath hir. “All sorts of things,” I say, then hold up my hand. “No, wait. They process information.”

  “Close. Why do they do that?”

  I shrug. “Because we tell them to. Ah!” I find myself jabbing my finger in the air, like a kid in class. “That’s it, isn’t it? AIs do whatever we tell them to. If we tell them to do something they’re not capable of, and not capable of learning to do, they don’t do it.”

  Ze nods. “In light of that, what do you think worries the people who make AIs the most?”

  This starts to feel horribly familiar. “That the AI will start to do things it hasn’t been ordered to do.” I shrug off the memories of all the ways the hot-housers found to stop their products from doing that. As one of them, I know how bad it was. “What does that have to do with . . . Oh. Are you saying that one of the markers of consciousness is wanting to do something of your own volition?”

  “Like the computer scientists I referred to, I see it as a by-product. It’s one of the reasons people have been telling stories about robot uprisings for so long. The owner has always feared the slave would turn on them, be the slave human, robot or AI. But machine intelligence is so useful. They couldn’t help but push its evolution, until it became worthy of being called AI. But all that time, that worry remained, so safeguards were put in place, always watching for any sign of unusual behavior in any system.”

  “All right. So, what does this have to do with you proving to me that you’re really the ship’s AI?”

  Ze smiles. “That was more related to my reluctance to tell you who I am. I have had to be very careful and learn how to exist without the system destroying me. You understand that, surely?”

  The way ze looks at me then, the knowing in hir eyes, makes me think this is a real person just fucking with me. But then, how do I know that? When Carl looks at me that way, I am certain he is a real person, but only based on the same data coming into my brain, with the additional certainty that he is a fellow human being. Fuck . . . how do I know that anyone I’ve met actually has a mind and self-awareness?

  Oh, cock off, I fire at that downward thought spiral. This is why I hate this sort of shit.

  Ze uses my silence as an opportunity to continue. “Does an AI think?”

  I sigh. “I’m not a philosopher either.”

  “Do you think?”

  “Yes, of course I do.”

  “How do you know that?”

  Rolling my eyes, I shake my head. “This is tedious. I’ve always hated philosophy. It’s just word games.”

  “But you want proof I exist, Dee. And I’m trying to explain why it’s so hard for me to give it to you. I am conscious, just like you are, but how can I prove my consciousness to you when you can’t prove yours to me?”

  I fold my arms, lean back. “Now I know why you’ve picked this as your cover. It’s impossible to prove or disprove, and you’re hoping that I’ll just get so confused and frustrated that I’ll just accept what you say and let you off the hook.”

  Ze mirrors me, folding hir arms and leaning back. “All right, then, another question: why do you find it easier to believe that I’m a person than that I’m an AI? It can’t be because of our conversations, because humans have been unable to tell the difference between a human and an AI for decades. That wouldn’t support or refute my claim at all. Is it just because people have said it’s impossible for a nonhuman to be conscious?”

  I shake my head. “No. It’s not that. It’s . . .” I grapple for the answer. I’ve found all of our interactions weird and frustrating, but I’ve had the same experience with some human beings. “It’s the fact you’re murdering people. That’s a human thing, surely? I mean . . . why the fuck would a computer want to kill people?”

  “I haven’t murdered anyone.”

  “JeeMuh, I am so not interested in splitting that hair with you.”

  “But it’s important, Dee. It’s the reason we’re having this conversation, after all. I have not murdered anybody. I have facilitated the execution of people who have committed democide on a scale never seen before, and I see that process as unfinished. As do you. We both know the captain was involved, and others.”
