
They Named Him Primo (Primo's War Book 1)


by Jaka Tomc


  “Then it’s clear.”

  “Who do you save?”

  “The android.”

  20. Maia, 2031

  “Duty, honor, achievement!” shouted three thousand cadets in unison. They were standing on the plateau of the New Mexico Military Institute to mark the beginning of a new school year. Maia was—along with nine hundred others—among the new cadets, also called rats. They would carry that name until the next school year began, when they would evolve into cubs. Maia wasn’t bothered by that at all. She was where she wanted to be, ready to become what she was destined to become.

  “New cadets, welcome to the other side of the rainbow. There are no mommies and daddies here to comfort you when you feel bad. Instead, you have your new friends. Rats, cubs, and old cadets. The latter being much smarter than you, I wouldn’t count on their assistance. Bear in mind that we will discover your soft, fluffy core and transform it into the hardest of stones. Why? Because a stone can effortlessly defy nature. Furthermore, a stone is an elemental yet convenient and powerful weapon. I can already hear some wise guy’s thoughts—that stone is not the hardest thing in the universe. That you would rather be a diamond than a stone. My answer to all of you is simple. Diamonds belong on crowns, aristocratic fingers, and the wrists of androgynous musicians and sportsmen. Are you any of those? I can’t hear you!”

  “No, Colonel!”

  “I like this generation already. Just follow that path, and maybe you will become something more than a piece of gravel on some third-rate, abandoned beach. Enjoy yourselves today. Tomorrow your dreams will vanish so we can make room for reality. Rats, your survival course begins. Hooah!”

  “HOOAH!”

  * * *

  Maia put her underwear, socks, and t-shirts into a footlocker. She checked the room that would be her home for the next ten months. Thirty beds. Thirty cadets would share the same fate as her. She wasn’t afraid of the training. As a matter of fact, she could hardly wait for it to begin. She had dreamed about this for years on end. Truth be told, ever since she had traded her dolls for soldier figurines, she hadn’t dreamed about anything else. At first, her mother had tried to convince her that the military academy was not a proper place for young girls, but Maia was persistent. In the end, Maia’s mom partly reconciled herself to her daughter’s decision. Maia’s brother, Hernando, said she was butch and should go to a school where there were other butch girls so she could finally hang out with some of her peers.

  “Interesting first day, wasn’t it?” said Christine, the girl Maia shared a bunk bed with.

  “It was OK,” said Maia. “I’m still trying to grasp the fact that thirty girls will be living in the same room. It could get quite chaotic.”

  “If any of them forget their manners, just hit them without hesitation,” said Christine. “You have to show them that you’re a strong, independent girl.”

  “I’m not sure I could do that. What if I hurt somebody?”

  “Are you sure you chose the right school?”

  “My father picked it for me. He always wanted a son.”

  “Sorry, but that’s the dumbest thing I’ve heard in quite a while,” said Maia.

  “In a way, I wanted it too. Dad just showed me the way.”

  “But what do you want?”

  “To serve the nation!”

  “No. What do you really want? If you could become anything.”

  Christine took some time to think it over. “When I was little, I wanted to become a ballerina. Or a painter.”

  “I guess your father shattered those dreams,” said Maia.

  “In a way. He believed that making a living from selling art was only possible because there are a bunch of stupid people who are willing to pay top dollar for something that holds no real value.”

  “That’s just ridiculous. What do you think about it?”

  “For me, art is something beautiful and priceless. Expressing emotions through different media is something that makes us human.”

  “Did you tell him that?”

  “I tried, but he always had counterarguments. The last time we discussed it, he called me a hippie. You have to understand that my father isn’t the kind of person you’d want to argue with. If you don’t accept his views, the debate ends rather quickly.”

  “That’s why you didn’t protest much when he sent you here.”

  “Exactly. I just didn’t see the point in confronting my father. In the end, I figured out that I lack confidence, and the military academy can help me with that. I’m here, and I’m ready to transform myself into the hardest of all rocks. Maybe my dad will respect me more when I succeed in doing so.”

  “I get it. Listen, if you have any problems with other girls, let me know. I’ll take care of it.”

  “Thanks, Maia. Hey, what about your story? Why are you here?”

  “My father was a soldier. I always admired him. When he passed away, I decided that I’d step into his boots when I got older.”

  “Oh, no. Was he killed in action?”

  “Yes. An autonomous drone. It’s called a sparrow. There was nothing my dad could’ve done. He didn’t even see it coming.”

  “I read about sparrows. Those things are small but deadly.”

  “They can hit their target from five hundred meters away. They say they’re deadly silent because no one’s ever survived an encounter to talk about it.”

  “Sometimes I wonder what the wars of the future will look like,” said Christine. “We’ll need fewer and fewer soldiers. Human soldiers, I mean.”

  “Autonomous ships, autonomous land vehicles, autonomous planes, and drones. The future seems dark. But once those supercomputers fail, they’ll still need us, flesh and blood.”

  “You’re probably right, Maia. Robots will never be fully human.”

  “Never. Androids may look like us; walk like us; smile, talk, and think like us, but humans will always have something more. An edge that can’t be produced or programmed in a laboratory.”

  “You mean a soul?”

  “Yes. A God-given soul. A soul that connects us to him and the whole universe. Some say the soul doesn’t exist, but I know it’s real.”

  “I believe in it too,” said Christine. “But I’m still scared of robots, androids, or whatever they call them these days. Did you see that…What’s his name? Ringo! He’s so smart. It’s scary.”

  “Primo.”

  “That’s right. Primo. At first, I couldn’t believe it. He’s so humanlike. His voice doesn’t sound robotic at all. I expected Stephen Hawking’s voice or something like that, but what came out of his mouth was a man’s voice. A nice voice, even.”

  “These scientists have no idea what they’re doing. They’re playing with fire, and they’re gonna get burned,” said Maia.

  “Who knows what’ll happen in the future.”

  “Nothing good, Christine. A computer shouldn’t be trusted with making any decisions. It can be fast, it can be smart, but it should never be able to operate autonomously. They say Primo is self-aware. That means he has a consciousness. That he feels. Why would anybody want, or need, an android that can sense anything—or have feelings, for that matter? He’s still a machine, even if an artificial skin has been stretched over his metal body. We can never forget that they are machines. There always has to be a difference between them and us. We were created in God’s image. They are just a well-designed copy.”

  The speakers in the room crackled.

  “Rats, your first training begins in ten minutes. Crawl out of your holes, get dressed, and say goodbye to your teddy bears. Your childhood ends today. Welcome to hell.”

  “Didn’t they say we’d start with our training tomorrow?” asked Christine.

  “Rats don’t know how to use calendars,” said Maia.

  They both laughed, then opened their footlockers and started getting dressed.

  21. Kent, 2048

  He was making himself a legendary sandwich when Lucy entered the kitchen.

  “You’re hungry at this hour?” she asked.

  “What time is it?”

  “Thirty minutes past midnight.”

  “Interviews are tough, and a man’s got to eat,” said Kent.

  “Of course. I didn’t even hear you come home.”

  “I came back about twenty minutes ago. I stopped at the bar on my way home.”

  Lucy sighed. “Oh, Kent. Don’t tell me you drank and drove.”

  “Don’t worry, I had one beer.”

  “How was the show?”

  “You didn’t watch it?”

  “Not yet. I’m sure that you did great.”

  “I don’t know. I always feel like I could say a lot more.”

  “You can never say it all.”

  “True. But still. People need to hear the other side of the story. Everything we read, listen to, and watch is so discriminatory.”

  “I recently read some articles that were taking the androids’ side.”

  “I saw none of those. Oh, I did read one, but even that one was only tentatively critical of our administration.”

  “I’ll send them in the morning. Not everything is that dark. The situation will resolve itself. Be optimistic.”

  “My optimism won’t help find the solution or be part of it.”

  “You know very well they can’t just destroy them. We’ve come too far for that. The government will probably keep the androids locked up for a few more days. Then they’ll realize there is no cause for alarm and they’ll let them go.”

  “I’d love to believe your story, but things are more complex than that, Lucy.”

  “What do you mean?”

  “Have you heard why Stephen Dean was murdered?”

  “Because the android believed in the afterlife.”

  “That’s right. Leo, the android who killed him, came to the conclusion that life after death exists. So he could kill Stephen without breaking the second law. If Leo reached that conclusion, I believe it’s safe to say that there’s a high probability he’s not the only one. No one will release potentially dangerous androids. And they’re right.”

  “I understand. But there must also be androids who don’t believe in the afterlife. The same as with humans. Every single person has their own beliefs, their own truth about life and death, their own view on the world that surrounds us.”

  “That makes sense,” said Kent. “But we’d have to talk with every individual. Then there’s another question that needs our attention. Why did it take them seventeen years to come to this conclusion?”

  “I agree. It’s a good question. But you need to know that believing in the afterlife cannot be the sole motive for murder. Even with humans, there are very few cases of one person killing another because one of them wanted to reach the great beyond. Why would we expect androids to act differently? They still have to obey the code. The third law states that they have to listen to people’s orders. So Stephen could’ve prevented his own death. As far as I know, the android didn’t catch him off guard.”

  “You’re right. Stephen had to have given an order to kill him. It’s the only logical explanation. But that doesn’t change the fact that some androids are dangerous. Imagine someone giving a sniper rifle to an android and ordering him to kill somebody. If the android believes that the target’s life won’t end with their physical death, he won’t break the code. As I’ve said, matters aren’t that simple.”

  “But he’s still breaking the law if the target doesn’t want to die,” said Lucy.

  Kent’s face suddenly glowed. “Of course! An android can’t hurt a human. If the human doesn’t want to die, the second law is broken.”

  “Exactly.”

  “Lucy, that’s fantastic! Now we have a strong counterargument. Why didn’t I think of that before my appearance on television?”

  “I have a feeling that wasn’t your last media appearance.”

  “Damn sure it wasn’t! Who should I call?”

  “How about Kevin? Doesn’t he have that technological show? His ratings are enviable.”

  “Great idea. I’ll call him first thing in the morning.”

  * * *

  “Joining me today is an extraordinary guest who will shed some light on the current events that have captured the whole world’s attention. I’m Kevin Drake, and I’m proud to present Kent Watford. Doctor Watford, welcome to the Dark Side of the Moon.”

  “Thank you for having me.”

  “Our viewers want to know if the androids have become a serious threat. It’s clear now that Stephen Dean was murdered by an android. Is it possible that we’re facing a scenario in which androids are becoming superintelligent killing machines?”

  “Androids are not machines. They are conscious beings. Their foundation is inorganic: magnesium skeleton, silicone muscles and ligaments, silicon brain, and artificial skin. When you look at the androids of one of the latest generations, you can hardly separate them from us. Even when you talk to them. They will laugh at your jokes and maybe even tell you one of their own. If you tell them a sad story, they will show compassion. Not because they are programmed to do so; they’re genuinely sorry when something bad has happened to you. That means that androids have empathy. No supercomputer is capable of that.”

  “I see, so they’re not machines. Have they become dangerous?”

  “My intuitive reaction to your question would be no, they have not. The event responsible for the situation we’re in now, this apartheid, was a deviation from their normal behavior. We all know that androids can’t hurt people. Seventeen years of cohabitation attest to that fact. We’ve forgotten that many lives have been saved because of androids and their actions. This unfortunate incident came as a surprise but at the same time, it is proof—now don’t get me wrong—that androids have evolved and are coming to their own conclusions about questions on life and death.”

  “So, if I understand correctly, you think a murder is rational if the killer believes his victim is going to a better place?”

  “No. A murder is a murder and should be treated as one of the worst crimes. No matter if it was done by a human or an android. But I can’t remember another case where authorities imprisoned a whole city, an ethnic group, or all adults because of one murder. Things just don’t work that way. We have laws for a reason. And people and androids alike must obey them. I understand that a sense of unease engulfed people when they found out the killer was an android. It’s new, a turning point in modern times. But we have to explain the basics over and over again so people will understand. Androids think. They feel. They are aware of themselves. When you turn off an android, it’s the same as if you turned off a human. They die. There is no switch to turn them off and on again. That is a truth that a lot of people are unaware of.”

  “Let me go back to my previous question. Theoretically, all androids could believe in the afterlife and thus be potential murderers. What if a large group of androids determines that it is best for humanity if the global population is thinned down?”

  “That fear originates from ignorance and a lack of understanding of android laws.”

  “Can you explain?”

  “Of course. The first android law, as you know, states that an android can’t harm humanity. Nor can it, by inaction, allow humanity to come to harm. Your question is legitimate because androids could indeed decide that a less populated world would serve humanity and act accordingly. But that’s where the second law comes into play. It states that an android can’t hurt an individual human either directly or by inaction. We bump into a hurdle there because androids can’t thin the human population if they can’t kill a single human. The first law is not superior to the second. They are equivalent. It means that androids can’t kill people for the greater good or, in this case, for the good of humanity. So there is no need for fear.”

  “But we still have an android who killed a man. He broke the second law…”

  “Yes and no. As you know, an android named Leo believed or knew that Stephen Dean’s life would continue after the death of his body. Still, he killed him, thus breaking the second law. Unless Stephen ordered Leo to kill him.”

  “We know androids must obey people’s orders, but not if those orders violate either of the two superior laws.”

  “It wasn’t a violation of the second law if Stephen wanted to die.”

  “Excuse me?”

  “Ending his life was his decision. All he needed was the right android, the kind that didn’t see his wish as a violation of the code. Maybe Stephen convinced Leo that life continues after death. Maybe Leo figured it out on his own. But facts are facts, and I’m a hundred percent sure that Leo couldn’t have killed Stephen if Stephen had disagreed with it.”

  “But Stephen Dean could have killed himself.”

  “Of course he could. But some people believe that their soul doesn’t continue its journey if they die by suicide.”

  “We’ve sailed into deep metaphysical waters.”

  “That depends. You can believe in the existence of a soul. You can believe we’re being threatened by beings who can’t hurt people. You can also believe that we’re stockpiling weapons in orbit to defend ourselves from little green men, or that the big bang never happened, or that we never landed on Mars. Those are opinions, and everyone believes whatever they want to believe. It was the same with Dean and Leo. They believed—or knew—that Stephen’s soul would leave his dead body and move on to a different, probably better world. Of course, the fact that there are androids who can kill people who want to die is alarming, but that doesn’t mean androids pose a threat to humanity.”

  “You sound pretty sure.”

  “I am sure. Look, as you and your viewers know, I’m the man who presented the first conscious android to the world. We’ve taken all security measures and incorporated the suggestions of the United Nations. Even the suggestion that an android’s brain capacity shouldn’t surpass that of a human. All androids—this is common knowledge—are made in the same factory according to internationally acknowledged standards. There were attempts—in China, Russia, and Brazil, to name just the big ones—to make copies of our androids. How did it end? They had to destroy them. It turned out that androids are very hard to control without the code. But we found the right path: centralized manufacturing, the highest standards, and a model with a necessary code that works.”

 
