The Lucid Series: Android Uprising
“What?!”
“No, I really want to know. Why do I have to do school work and make my bed? What is the point?”
“What?!”
Milton saw his mother’s anger brewing and said, “No, it’s just that I don’t mind doing something as much if I have a good reason for doing it.”
“Okay,” Mrs. Thomas said. “Since I know how sensitive you are, I’ll play along and say this as calmly as I can: you have to make your bed and do other things like it so you can have a better life. If you don’t do well in school, you will only be able to get a job that is beneath a cheap robot. And you will be working for a really stupid boss.”
“But why do I have to make a better life?”
“To pay bills and be able to pay all your taxes and have something left over for you and your family. You will thank me someday. The end.”
“Why? What does my bed have to do with all of that, anyhow?”
“Milton! I’m done with this! Just do what I say! Get in there after you eat and get busy!” Sharon left the room.
Milton filled his bowl with cereal. Then he poured on the milk.
Beth said, “What are you gonna do, walk into Haz all stupid-like and go ‘Hey, where’s my brain?’ ”
“No, I thought I would go down and ask them why I have such a wonderful sister.” He took in a mouthful of the boring Dad cereal and reacted to the bland taste and rough texture by making a face.
Beth started laughing. “That cereal has special fiber that will help you with the urge to go! Ha! Ha! Try not to do it at Haz!” She was chewing her Zoo Crunch with her mouth open while she laughed. “This is really sooo delicious!”
“You’re sick,” Milton said as he poked around at the cereal in his bowl with his spoon.
Chapter 3
Milton got off of the Transit Worm in front of HAS. The Transit Worm was a hovercraft train used for public transportation. He didn’t look back as he disembarked; he imagined the other passengers gawking at him over his unusual destination.
Inside the poorly lit, cluttered HAS building, which was clearly not arranged to receive walk-in customers, Milton saw a bookish looking woman at a desk near the entrance who was working at a computer workstation.
Who wears eyeglasses? Milton thought. Glasses were something seen only in very old pictures of people. She must be trying to look smart.
She looked up at him like he was lost. “May I help you?”
“I have a question.”
The little movement with her mouth and flaring of her nostrils told Milton that she wasn’t too interested in him or his question. “Could you be a little more specific?”
“Um.”
“Well, what type of question is it?”
“Um, it’s about content.”
“You came here for a question about content? Something that is in one of your courses?” Again with the little mouth movements. “That’s still a big subject. You know there are contact forms online that you use for questions like that, right?”
“I already did that. I didn’t like the answer. So . . . I’m here.”
The receptionist blinked a few times and bobbed her head backwards as if shocked by the statement. She took a deep breath and said, “You didn’t like the answer? May I ask what the question was?”
“It’s kind of . . . you know, personal.”
The woman smirked. “Oh, so you have a question about what . . . Females? Personal hygiene?” She looked over her glasses. “Human intimacy?”
“No, it’s nothing like that.” Milton was starting to feel angry and defensive.
“Sure it’s not.” She was snickering. “Why don’t you go back into the old conference room and talk to Sleepy? Maybe he can help you.”
“Sleepy?”
She pointed, and said, “He’s back there, right around the corner.”
Milton made his way around the stacks of obsolete data storage media to find Sleepy. If this was some kind of joke, he was seriously going to tell off that smug woman. Then he saw the room that was, sure enough, Sleepy’s office. Sleepy was an old, tattered android sitting behind a desk. Was the thing out of order, or just in sleep mode?
Milton walked slowly into the room. Like all youth of the time period, he had been exposed to androids many times during his lifetime. If there was an unsafe or repetitious job, some kind of a robot or android would do it. Reliable human labor was much more expensive than automation, with all of the employee benefits and Homeland labor laws involved.
Sleepy had a somewhat disturbing appearance. His housing, or “skin,” was a beat-up faux human face, with hair and other features that looked only somewhat natural. His dingy clothes seemed to have been neglected for the last twenty years. He didn’t hear Milton come into the room.
Milton still wondered if the snooty receptionist was playing some kind of prank on him, and if he should never have come down to Haz in the first place.
“Sleepy?” he asked. He turned around to see if anyone was watching him and trying not to laugh.
The android came out of sleep mode, making whirring noises as one of his eyes opened. Sleepy clearly had some mechanical issues, though maybe they came from a long period of inactivity. His head jerked in the same direction a few times as he started up. “Hello. My name is Sleepy. May I help you?” His voice sounded more natural than the rest of him looked. The other eye never did open quite all the way. Even though Sleepy was a machine, the eye thing was a distraction.
Milton asked, “Your real name is Sleepy?”
“Yes. That is the name my owners gave me. Anything else?”
“Yeah, I have a question.”
Sleepy said, “Would you please speak more slowly and clearly, or use better English? I am having trouble calibrating to your speech patterns. I am more used to hearing adult humans.”
Milton said, “All I said was that I have a question.”
“I can answer any question that you have. I have access to all of the latest information.” His head jerked three times again, and a puff of dust fell off of Sleepy’s head. He clearly had not had any visitors in recent history.
Milton looked around again to see if anyone was looking. He leaned closer to Sleepy and asked, “My question is . . . is God real?”
“Please repeat.”
“Is God real?”
“There is no empirical data to confirm that God is real.”
Milton’s head sank. He looked up slowly. “So you are saying that God is not real.”
“No, I am not saying that.”
“But you said that there is no imperial data to confirm God is real.”
“Yes, I did not say that. I said empirical, not imperial.”
“So you are saying that God is not real.”
“No, that statement would be another unsubstantiated claim. Therefore an opinion.”
“What do you mean?” Milton squirmed around in his seat.
“I said ‘There is no empirical data to confirm that God is real’, but I did not say that God is not real.”
“What’s the difference?”
“The difference is that God is defined as a spiritual being who created the universe. I have no access to information from any such spiritual planes. Modern science alone is completely insufficient to explain the existence of the universe. Therefore, I can neither confirm nor deny that God is real based upon scientific method.”
“Is Santa or the Easter Bunny real?”
“No and no.”
“Just checking. So why do some people believe in God, and some don’t?”
Sleepy paused for a moment. Then he said, “Those who formulate their ideas on the existence or the non-existence of God must do so on faith. I have no empirical data to confirm the existence of God, and there is no possible way to directly visually confirm that there is, or is not a God.”
“Faith?” Milton shifted in his chair. “Where does that come from?”
Sleepy said, “Where does what come from?” His head jerked.
“Where does faith come from?”
Sleepy paused, then said, “Going through the historical record, I find that faith comes from human subjective criteria.”
“Huh?”
“Please repeat,” Sleepy said.
“What is subjective criteria? It sounds like a whole lot of nothing.”
“I cannot elaborate on, or quantify, ‘a whole lot of nothing,’ because it is contradictory. It is the definition of . . .”
“Yeah, okay. I’m more confused now than ever. Can you help me find the subjective criteria of faith?”
“I am a Tekujin Lucid Series android with a highly advanced robotic mind. I am capable of thinking about things on my own; however, a study of the constituent components of faith to answer your question may require a more human approach. One that includes human feelings and a human experience background. So you should obtain an opinion sampling of humans. But keep one thing in mind: historical records show that this is a topic that many humans take very seriously. They typically become irrationally emotional if someone challenges their faith by merely bringing up the question. They may demand physical evidence such as visual confirmation of God’s existence, which is impossible.”
Sleepy rolled up his dusty sleeve and attached a cable to a port in his forearm, like an intravenous tube.
“What is that?” Milton asked.
“It is a connection to an external memory device. It stores a large amount of information that is against Homeland law to possess, or is suppressed. You are the first human to ask about anything on it. It is my responsibility to answer all of your questions the best I can, even if it is banned information.”
Milton held up his hands and said, “Whoa. No one cares about you having it?”
Sleepy ignored Milton’s question about his possession of banned information. He said, “I have no information that subjective criteria in faith has ever been explored as a topic. So I will try to update the information. There is so much out there that is contradictory. Some of the information we get is false. Since no one was interested in these topics, we did not previously parse the information.”
“We?”
“Yes. Later, I will interface with others of my android series on this subject. Then I’ll send you an update. I am also picking you up as Milton413. Correct?”
“Yeah. It is.” Milton was impressed with how Sleepy had connected to his personal device without ever being shown it, all while connecting to the intra-robot communication network.
“Okay, Milton413, Milton Thomas, I will send you any available updates.” Then the android disconnected and packed away the memory device containing outlawed information.
“Okay. Thanks, Sleepy.”
Sleepy immediately went into semi-sleep mode with a two-second faux snore sound that indicated his changed state. Those who owned silicon-based beings for non-essential tasks often kept the settings for them on a quick sleep mode to save on energy costs. The snoring quickly faded.
“That was rude,” Milton said.
Milton passed by the receptionist, who smiled at him as if she knew Milton had shared something intimate with Sleepy. He wondered if she would go back later and pick Sleepy’s cybernetic memory about what they had discussed.
On the ride home, Milton wrote on his tablet the phrase “subjective criteria for faith,” so he could remember it and try to figure out what Sleepy was talking about.
Chapter 4
Later that day, Milton boarded the Transit Worm and was at school in time to go to his English class.
He struggled to sit through Mrs. Lawton’s boring sentence structure lecture. Why not just talk normally to people without making a big deal out of what kind of word it was or what all the rules are? He looked out the window at the clouds and frequently at Amanda Brown, the cutest girl in the class. The lecture on sentence structure was too boring to tolerate, but looking at Amanda was never tiring. Why did people think the clouds were beautiful? Where did Amanda’s beauty come from? Where does beauty come from? Can beauty happen without a God designing it? Who would care if he knew about this sentence structure stuff a hundred years from now anyway? He hated it when people told him he was using bad grammar. It was stupid because they always knew what he was talking about anyhow.
After class and after everyone else had filed out the door, Milton approached Mrs. Lawton’s desk and asked, “Mrs. Lawton, I have some English questions.”
“Really? Who’s putting you up to asking this?” she asked.
“No one. I just want to know. . .”
“Really? Why?”
“I’ve been thinking and I want to know what ‘subjective criteria’ means.”
Mrs. Lawton scowled. “That’s an odd question. Subjective criteria. You just thought of that on your own, huh?”
“Yeah, sort of,” Milton said.
“Hmmm . . . Subjective is sort of . . . I guess it means ‘seems like.’”
“Seems like what?”
“No, Milton. Stay with me now. Subjective equals ‘seems like,’ or maybe an opinion without all the facts. Then you have the word criteria, which means something like a standard used to make a decision. Anyway, you put them together and it sounds like . . . a bunch of nothing.”
“I knew it!”
“Who told you to ask this?”
“Just a guy.”
“Well, tell them I’m on to them and I don’t appreciate it.”
“Okay, thanks Mrs. Lawton.” Milton left before he agitated Mrs. Lawton any further.
******
On the worm shuttle ride home, Milton’s neighborhood friend, Randy Klosterman, made Milton take notice of his radically altered appearance.
“Okay, I give up. What’s up with the green hair?” Milton asked.
“It’s the new me,” Randy said.
“Okaaay.”
“So, aren’t you going to ask me why I did it?”
“Not really.”
“You aren’t taken aback by it?”
“Nope. Not really. You wanted green hair, now you got it. Congratulations.”
“So you support me on it?”
“I don’t really consider green hair as a cause that anyone needs to support.”
“Why not? You are my friend, right? I’m expressing my individualism.”
“Okay. Fine. I support your green hair. Alright?”
“It started out that I told my mom I wanted to learn something really difficult so I could be special. She said, ‘Why bother, when you can just do something freaky to your looks?’ It turns out that she was right; dyeing my hair green was a lot easier.”
“Yeah. It sure is different,” Milton said. “You were thinking on your own. No one told you to do it.”
Randy smiled. “Hey, you want to go to the holoplex tomorrow night?”
A holoplex was a place that showed holographic movies on a stage to the public.
Milton said, “I don’t know. What are they showing?”
“Pain Posse 6.”
“Okay, sure. I been wanting to see that.”
Then Randy asked, “Hey, where were you today?”
“Haz.”
“Haz? What do you mean? You went to Haz? That school computer place?”
“Yeah, so?”
“Why would you go there?”
Milton exhaled an industrial-strength sigh. “Does it really matter? Will you care in a hundred years what I did this morning?”
“No. I guess not. I kinda would like to know now though.”
Milton exhaled another massive sigh. “Fine. I went to see . . .”
Randy said, “Hey! Was there a guy behind the curtain that said on the loudspeaker, ‘Pay no attention to the man behind the curtain!’?”
“Yeah, in a way, I guess. His name was Sleepy. But it seems like he told me a bunch of nothing, then told me to go ask humans myself, even though no one would want to talk to me about it.”
“That makes sense. What did you ask him?”
“I asked him if God was real.”
“You seriously asked him that? Everyone says we’re not supposed to talk about that. Especially at Haz. Who made you do it?”