The Settlers

by Jason Gurley


  No, that's okay, Micah says. That's enough. How much can you modify the female voice?

  You have several variables to select from, the A.I. says. You may modify the masculinity or femininity of my voice. You may select regional influences. You may adjust the formality or informality of my speech. You may even provide me with an input sample that I may mimic as closely as possible.

  Micah considers this. Your voice is a little flat. Maybe it could sound a little, I don't know, warmer? More friendly.

  Like this? the A.I. asks.

  Say something else.

  The simulated air flow adopts a weaving pattern through the city, carrying a pleasant breeze down each street, and modulating the --

  That's enough, Micah says. That's better. Friendly-sounding but not too intimate.

  If you would like to adjust for intimacy at a later time, you may modify my voice settings at your leisure, the A.I. says.

  What do you mean by intimacy? Micah says.

  There are multiple definitions of the word, the A.I. explains. You may adjust my properties for most of them. If you prefer me to address you in more personal ways, or assume a deeper history than we actually possess, then I can adjust my words to approximate that sort of intimacy. However, some users on the station prefer their A.I. to address them with content which is more intimate.

  You mean they like their A.I.s to talk dirty to them, Micah says.

  If by dirty you mean speech which has sexual or provocative content, then you are correct.

  Isn't that a little -- I don't know -- low-tech? Weren't there people doing phone-sex routines like a hundred years ago?

  It may be antiquated, but I understand that the human mind remains stimulated by imagery, whether that imagery is created with words or pictures.

  We'll skip that part, Micah says. You sound more friendly, but your voice is still kind of bland.

  Would you like to add a regional influence to my speech?

  Micah thinks about this. How specific can I be?

  You may select an influence as broad as a continent, or as narrow as a town or city. You may also adjust that influence by era. For example, if you prefer Victorian-era British speech, rather than twentieth century British speech, you may calibrate for such a preference.

  What if you don't have the region I am interested in?

  The A.I. says, I have access to a library of audio captures that are up to two hundred years old. I believe I can provide a reasonable solution if you select a region not represented in that catalog.

  Okay, Micah says. Let's try California.

  I have many California samples. Is there a preferred region?

  Try the central coast area.

  The central coast of California is available, with several further regional filters. Shall I list them?

  Micah shrugs. Sure.

  Monterey, California. Big Sur, California. Carmel-by-the-Sea, California. San Luis Obispo, California. Salinas, California. Santa Barbara, California. Montecito, California. San Simeon, California. Arroyo Grande, California. Morro Bay, California. Cayucos, California. Lompoc, California. Santa Cruz, California. Los Olivos, California. San--

  That's enough. A lot of those areas are really close -- are you sure there's much difference between them?

  Every region has a minor differentiator from the regions that surround it, the A.I. says.

  Okay.

  Micah visited Mae in her hometown just once, returning with her for a family reunion. She had grown up in Morro Bay, a little seaside town shadowed by a large volcanic rock. He remembered liking it very much. It reminded him of the beach house and its gray ocean and chilly skies.

  Morro Bay, California, he says to the A.I.

  He hears three dim tones again.

  Say something, he says.

  The city of Morro Bay, California, is located on a waterfront in San Luis Obispo County, the A.I. says.

  Can you raise the tone of your voice? Less deepness.

  Three tones.

  The A.I. continues. Morro Bay's population in the early twenty-first century was --

  Stop. Jesus, stop.

  Micah holds his hands out and looks at his arms. His skin is covered in goosebumps. His forehead has broken out in a cool sweat.

  Shall I adjust the variables --

  For god's sake shut the fuck up, Micah cries.

  The A.I.'s voice is eerily, horribly similar to Mae's. Micah doesn't know what he was thinking. He pushes back from the table in a hurry and walks out of the room. Over his shoulder he says, Make a new adjustment. Pick a male voice, any goddamn male voice. Adjust!

  Three tones.

  The A.I.'s voice is present in the living quarters, where Micah has walked to escape the awful simulacrum of Mae that he has just created, like some sort of monster-maker.

  Is this better? the A.I. asks. The voice is gravelly, deep, emphatic.

  It's perfect, Micah says, pulling at his hair. Accept. It's good. Use that.

  Your final selection step is to choose a name, the A.I. says. I can provide you with any naming resources you --

  Bob, Micah says. Your name is Bob.

  Three tones.

  Bob's your uncle, the A.I. says. A joke.

  Don't joke with me, Micah says.

  Three tones.

  Adjustment complete. Humor will not be a feature of this selection.

  Go away now, Micah says.

  Very well, says the A.I., who sounds like a middle-aged smoker.

  Micah goes back to the bed and wraps himself around the spare pillow again, and presses his face into its softness, and screams the longest scream he can sustain. Eyes red-rimmed and tender, pillow smushed against his cheek, he drifts into a terrible sleep.

  Machine

  Where would you like to begin? Bob asks.

  Micah is standing at the window, staring down at Argus City. It is nightfall, at least until the sun rises again in ninety minutes. He doesn't know how people adjust to the frequent sunrises. Maybe their windows are timed to the station's orbital schedule, and darken each time the sun breaks like a nuclear bomb over the city.

  I don't care, Micah says. Do I really have to spend twelve weeks learning this stuff?

  Twelve weeks is the Earth course length, Bob says. You're already here. You can learn what you like when you like, on an as-needed or as-desired basis. Or you can simply walk the halls alone, a rogue gunslinger who doesn't need anything from anybody.

  Adjust for drama, Bob, Micah grumbles.

  Three tones.

  What a shame, Bob says. I was good at it.

  Where do you think I should start?

  Bob says, Perhaps a history lesson. I can tell you how the station fleet came to be, when the first station was constructed in orbit, and describe the current status of all twelve stations.

  Let me save you some time, Micah says. We poisoned the Earth, so we built floating boats in space. The first station was built fifty years ago, and now there are twelve, and this is the coolest one.

  That's fairly accurate, Bob says.

  Alright, then. Let's move on.

  Perhaps we should begin with the Onyx designation, Bob suggests.

  Micah flaps his hands restlessly in the pockets of his bathrobe. I don't want any pop quizzes or tests, he says. Isn't there a movie or something that you could show me instead?

  Do you mean an instructional video, or a dramatic film that captures the essence of the topic?

  Either, I guess.

  Both exist, although the instructional video is now a bit dated, and the dramatic films are usually melodramatic and feature stories of class divisions and unrequited love, Bob says.

  You sound cynical, Bob.

  I simply think that my summaries will prove more useful, Bob says. I can pare them down to shorter descriptions, if you like.

  Short is good. I think I want to go back to sleep.

  If I may, Bob says, you do sleep a --

  You may not, Micah says.

  Very well. Shall I begin?

  Shoot.

  The Onyx program, Bob begins, was created in 2182, just over a century after the first station, Station Ganymede, was deployed in high orbit. The first few years of Ganymede's progress proved interesting for sociologists, who discovered that the broad sample of humans who comprised the first space settlers were lost at sea.

  Lost at sea?

  Sociologically speaking, Bob says. If you recall, the first station was an experiment in class-leveling. Each person who was admitted to the first class of settlers was stripped of social status and assets. In short, each person began a new life as a perfect equal to the other settlers.

  How extreme was it? Micah asks.

  The most affluent member of the first settlement class was Harvey Bogleman. His personal fortune was about three hundred billion dollars when he signed up for admission. He left a small portion of the money to his family who remained on Earth, and donated the rest to the space settlement program.

  That's one way to make sure you still see the benefit of your fortune, I guess, Micah says.

  Indeed, Mr. Bogleman's donation accounted for approximately one-hundredth of the cost of the second station, Cassiopeia. His donation, however, did not benefit him personally. He remained on Station Ganymede until his death in 2086. Having given up his privileged and wealthy status, if he had believed his donation to the program would result in a better stateroom on a newer station, he was mistaken.

  Huh, Micah says. Alright, so what happened with the class experiment? It sounds like it failed.

  It failed, Bob says. Sociologists concluded that humans accustomed to class perceptions found it quite difficult to shake their preconceived notions of their own status, or that of other settlers. Mr. Bogleman, in fact, fell prey to such difficulties. He was disciplined for creating an exclusive club for himself and a handful of other settlers. He called it the Harvard Club, in reference to the prestigious American institution.

  So rich people still saw themselves as rich, and poor people still felt poor.

  Put simply, yes. The Onyx program was created to classify future settlers in two simple categories: Onyx settlers, and Machine settlers.

  Why Onyx? I mean, why was it called that.

  Surprisingly, that origin story is not preserved, Bob says. I suppose someone liked the word. However, the Machine-class designation has a clearer basis.

  Yeah, the escort fellow told me, Micah says. I thought it was pretty discriminatory.

  On the contrary, Bob says, the purpose was to create a simple, A-B class system in order to give all settlers a clear purpose. The clearer a person's understanding of their place in the system, the theory goes, the more productive and happy they are freed to be. However, others shared your view of the program.

  That it discriminates? That it's a box for second-class citizens?

  The program was debated and refined by a panel of experts. A majority rule shaped the program into its final incarnation. However, two panelists resisted the program forcefully, citing a belief that it would set human progress back by a thousand years.

  My kind of people, Micah says. Who were they? The two dissenters.

  The first was Marshall Onlin, who originally conceived of the Onyx program. He felt that the program had been significantly changed from its initial concept, and became vocally opposed to it. The second, Bob says, was Tasneem Kyoh, a cultural anthropologist who experienced the Ganymede social experiment first-hand.

  But they lost, obviously.

  Obviously, Bob agrees.

  So who decides who is Machine-class and who isn't? Do people buy their way into the Onyx class? It can't be that, because my wife didn't have money.

  Each prospective settler is given a series of tests to identify skills. Those with extremely high intelligence markings, critical thinking skills, visionary qualities and so forth are set aside for the Onyx class. The average applicant is more suited for positions that leverage their Earth-honed talents for manufacturing, maintenance, analysis, construction, service and other industries that permit broad ranges of aptitude, Bob says.

  So the smart, charismatic people get special treatment, Micah says.

  Essentially, Bob says. People with these qualities are generally quite suited for roles that benefit humanity in more resonant ways. For example, Onyx-class settlers are permitted to participate in station government and have a voice in fleet planning and futures.

  I don't like this, Micah says. This sounds like you're isolating people in big buckets marked Best and Worst.

  That's a simple and misinformed understanding, Bob says. Many people in the Machine class share that view.

  I'm not surprised. What other privileges do Onyx people get?

  Perhaps the most prominent benefit is reproductive freedom, Bob says.

  Reproductive freedom? You're shitting me.

  Onyx-class citizens are permitted to reproduce with other Onyx citizens at their leisure. Monogamous relationships are actively discouraged, Bob explains.

  Who decided this? Micah says.

  The fleet council made this a key component of the Onyx program in the fourteenth revision to the bylaws.

  Fleet council, Micah says.

  That's correct. The fleet council is comprised of five representatives from each of the twelve stations.

  What about Machine-class citizens?

  What about them? Bob asks.

  What about their rights to reproduce? This is insane. I can't believe I have to ask this question.

  Your questions are within the boundaries of our topic, Bob says.

  That's not what I mean. I should have fucking taken that class. I shouldn't be here.

  To answer your previous question, Machine-class citizens are permitted to reproduce when and if they successfully win the annual lottery, Bob says.

  Micah whirls away from the window. There's a goddamn lottery?

  Two hundred Machine-class citizens are permitted to reproduce annually, Bob says. The lottery is run by the station government of each individual station. Each citizen is permitted a single ticket, delivered physically to them on the first day of each year.

  When do they decide who gets to have babies? Micah demands. On the last day of the year?

  Indeed, Bob says.

  So basically every one of these hard-working stiffs has an entire year to misplace, lose or destroy their ticket. How many people actually claim their winnings?

  This lottery season, one hundred thirteen citizens claimed their reproductive authorizations, Bob says.

  So eighty-seven people, probably through pure, dumb luck, don't get to start a family this year, Micah says.

  Your math is correct, sir.

  And how many Onyx babies were born last year?

  One hundred eleven thousand four hundred seventy one, Bob answers.

  Micah turns back to the window. He pinches the bridge of his nose, thinking hard. How many Onyx citizens are there right now? he asks.

  Bob says, One million two hundred eighty four thousand six hundred nine.

  And how many Machine-class citizens?

  Six million three hundred forty seven --

  Stop, stop. I get it. You're telling me that the Onyx class is outnumbered by six to one --

  That's not entirely accurate --

  But it's close enough. They're outnumbered six to one, so they're having as many babies as they can, while the working class advances at a microscopic pace. So in, what -- fifteen years? -- the Onyx class outnumbers the working class altogether?

  That's not entirely accurate, either.

  But is that the goal? Of course my math isn't right. This is bullshit.

  That is not a stated goal of the program, sir.

  Onyx isn't trying to quickly grow so it can't be easily overthrown by a blue-collar riot?

  No, sir.

  Micah paces around the room. The sun is beginning to rise over the city again.

  So what's the goal of this reproductive Nazism?

  I don't believe that's an accurate term for it either, sir. But the goal is quite simple. Humanity is attempting to create successive generations of smarter, more creative and more forward-thinking people. After all, did you believe that mankind would simply relocate from Earth to Earth orbit and be satisfied with its future?

  Micah stops. What are you saying?

  The fleet of stations is just the first step in a very long-term plan, Bob says, to find a new home for humanity. Several, if possible.

  But to manipulate the race as you go, right? Like you're breeding show dogs or racehorses.

  I'm not responsible, sir. I'm simply an artificial intelligence, a companion designated to serve you.

  Serve? Or observe and report?

  Sir, your activities in your apartment are only recorded so that I may provide more nuanced service as I grow more informed about your preferences and requirements, Bob says.

  This is bullshit, Micah says again. Bullshit.

  I believe that you will find the Onyx-class life a pleasant one, sir. Onyx-class citizens are not required to hold regular positions of employment, but are provided with ample time to spend on whatever personal projects, hobbies or leisure pursuits they wish, Bob says. Onyx-class citizens have large amounts of time, and with it, they produce novels, fine artwork, political position papers, beautiful music, complex theorems and more.

  But if I wanted to just sleep all day and all night, every day and night? I could do that, couldn't I.

  You may spend your time however you wish, Bob says.

  What if I wish to spend my time with the Machine class? What if I want to take a job, or visit a friend?

  Visitations are permitted and in fact encouraged, Bob says.

  But?

  But Machine-class employment for Onyx-class citizens is prohibited.

  Micah paces again. Mae would never have gone for this.

  It is possible that is true, Bob says.

  What do you mean?

  Mae Atherton-Sparrow, your deceased wife from whom you inherited your Onyx-class status, did not complete the twelve-week course. She successfully completed just under two weeks of the course.

  Micah presses his palms against his eyes. And what is the curriculum for those first two weeks?

 
