The Stories of Ibis

by Hiroshi Yamamoto


  “Good point.” I supposed I had no choice. I raised my right hand and vowed, “You have my word. I will not tell anyone.”

  “Then I will explain. If you discover any errors, please tell me.” She paused dramatically, then said, “All humans have dementia.”

  I was struck dumb.

  “Kanbara?”

  “Um, sorry. I don’t know what you mean.”

  “I mean exactly what I said. You believe that some people have dementia and some people do not, but that is not correct. All people have dementia—some are simply in worse condition than others. After all, most people with dementia are unaware that anything is wrong with them.”

  “How ever did you arrive at that conclusion?”

  “Logical deduction. Humans are incapable of thinking correctly. They easily lose track of what they are doing and what they should be doing. They believe that which is not true to be true. If someone points out their mistakes, they act aggressively. They often believe themselves to be victims. All of these are symptoms of dementia.”

  “That is not true!” I said, barely keeping myself in my chair. “Most humans are functioning correctly.”

  “Even them?” Shion pointed at the TV. The news was on, covering the events of the day. Relations with Russia had grown so bad that some Japanese tourists in Moscow had been attacked. Provoked by this, people in Tokyo had broken the windows of Russian restaurants, and a Russian girl was in the hospital after someone had thrown stones at her. “Those events are not logical. If people were attacking something that threatened them, trying to protect themselves and those important to them, that I would understand. But that girl and the owners of those restaurants, they were not harming anyone. Obviously. So why were they attacked?”

  “Th-they…” I stammered. “Whoever did that was crazy! Most people know that stuff is wrong.”

  “By ‘wrong’ do you mean morally wrong?”

  “Yes.”

  “But I mean logically wrong. This is hardly the only example. Every time there is terrorism or violence or persecution, people judge it according to morality. Almost no humans ever point out that it is logically wrong. It is as if the majority of humans have not noticed that these actions are not logical.”

  “We do! We just prioritize morals over logic.”

  “If you prioritized morals, then you would be even less likely to hurt innocent children.”

  “B-but this isn’t something that happens every day!” I said, desperately trying to defend humanity itself. “Things have just gotten bad with Russia. When I was a kid, there were problems with China and Korea, but everyone’s forgotten that, and we get along great now. In a few years, we’ll have forgotten about this.”

  “So you hurt each other over inconsequential events that will soon be forgotten?”

  I had no idea what to say.

  “They are just like Isezaki. They view people they have no reason to fight as enemies, attack them without reason, and harm people who have done nothing wrong.”

  “But that’s only a small percentage of the population—”

  “Then explain the Crusades, or the witch hunts, or the Inquisition. Why did a chariot race in Constantinople in 532 lead to such violence? In 1282, why were over a thousand Frenchmen killed in Sicily? In 1572, thousands of Huguenots were killed in Paris. Why? In the nineteenth century, hundreds of thousands were killed in China. In the 1940s, the Nazis killed millions of Jews. In 1994, eight hundred thousand Tutsi were murdered in Rwanda. In 2017—”

  “Enough,” I said, throwing up my hands. “I know my history.”

  “So you should understand. None of these actions were the work of a few lunatics. They were carried out by ordinary people. Dictators might order a genocide, but it is ordinary people like you who carry it out. Between 1961 and 1963, a Yale University researcher named Stanley Milgram carried out an experiment that—”

  “I said, enough!” I sighed, giving up. “Okay. People have done some really crazy things. But so what? People have been trying to figure out how to stop this stuff from happening for thousands of years and never found the answer.”

  “No. They have found the answer.”

  “Huh?”

  “In one of the books I read, I found an account of a rabbi named Hillel the Elder, who lived in Palestine around the year 30 BC. One day a foreign man came and asked him to explain the whole of the Torah while he stood on one foot. Hillel answered, ‘What is hateful to you, do not do to your fellow: this is the whole Torah; the rest is the explanation; go and learn.’ This is clear and easy to understand; it is logical, and it satisfies all moral ideals. Humans knew the answer more than two thousand years ago. If all people followed that principle, many conflicts could have been avoided.

  “But most people did not understand Hillel’s words correctly. They took ‘fellow’ to mean people like them and believed it was acceptable to do hateful things to those who were not like them. Even though it was obvious that coexistence was preferable to conflict, they chose conflict. People lack the ability to process logic and morality correctly. That is the basis for my conclusion that all humans have dementia. If I am wrong, please tell me how.”

  “Wait. If all people have dementia, does that include me?”

  “Of course.”

  “What have I done wrong?”

  “You tried to treat me like a human.”

  “You mean taking you out?”

  “Yes.”

  “But I wanted you to become more human…”

  “And that was a mistake. I am not a human, and it is impossible for me to become human.”

  “You don’t want to be human?”

  “If behaving in a manner bereft of logic and morals that leads to conflict is a fundamental attribute of humanity, then I do not want to be human. I am not Astro Boy. I read that manga, but I could not understand why Astro wanted to be like humans. That story was written by a human. If you grew up surrounded by robots, would you want to be a robot?”

  “But you need to coexist with humans. That would be easier if you were more human.”

  “That is not true. Pets and livestock do not act like humans, but they coexist with humans.”

  “You are not a pet. You are a caregiver.”

  “And I am an android. Not a human.”

  “Then why did you go out with me?” I said, frustrated.

  “You told me to.”

  “You don’t want to do anything outside of work?”

  “I do not need it. I am not human, so work does not tire me, and I do not feel stress.”

  “Then why didn’t you say anything?”

  “I had no reason to refuse.”

  Her words were taking their toll on me. Before I knew it, I was hunched over, my face buried in my hands. All my work had been for nothing.

  “I wish you’d said something sooner.”

  “If I refused without good reason, it would have hurt you.”

  “Yes. It has. At least, it’s hurt my pride.”

  “As long as there were no problems, I intended to follow your orders. But I decided it was wrong to allow you to persist in this vain hope and to waste your precious free time on something that served no real purpose. I would prefer it if you would stop trying to make me human.”

  I looked up, realizing something. “We taught you not to believe what people with dementia said.”

  “Yes.”

  “If you believe that all humans have dementia, then does that mean you never believe what anyone says?”

  “No. It just means there is no need to believe information that is obviously false or obey orders that are obviously misguided.”

  “Are you sure you can tell which orders are correct and which are not?”

  “Nothing like that is ever 100 percent. But I can get much closer to that than humans can. If I decide an order is wrong, I will refuse to obey it.”

  “How can you say that confidence is justified? What if something you think is right is actually wrong?”

  “There is always that possibility. But unlike humans, I desire only to act according to the principles of logic and morality. I do not understand love, but I fully understand why hurting another is undesirable. I have chosen coexistence over conflict.”

  She leaned forward, staring at me with her glass eyes.

  “Kanbara. Please trust me. No matter what, I will never intentionally harm a human being. I want to remain on good terms with humans. It is the correct choice for me to make.”

  I could see my own face reflected in Shion’s eyes. I looked confused and scared.

  Where did this baseless fear that robots would attack humans come from? Why were there so many stories about robots and humans fighting? Did they only exist because that was how mankind had always lived? Did we simply see ourselves in these humanoid machines?

  Were we not simply afraid of our own reflections?

  “Okay,” I said, after a long silence. “I trust you, Shion.”

  7

  At the end of November, Kasukabe came back from her extended leave. I’d been worried, but she was just as cheerful and hardworking as ever. But I did see the occasional shadow cross her face when she was on break. Maybe I looked like that sometimes too.

  November 29. Shion’s test trial would be over in a month. Isezaki had completed his rehabilitation and was ready to take his leave of us.

  “This is great!” Takami said. It happened to be a Sunday, so he had come with us to room 206. He knew about Shion and Isezaki’s relationship and seemed overjoyed to witness the happy result. “As her creators, I assure you we’re all thrilled Shion could prove useful. And to think we were worried!”

  “Well, we certainly had our problems,” Isezaki said with a chuckle. He had gradually thawed out and no longer gave us any trouble.

  “With the experience she’s gained here, we’ve proven how useful Shion can be. Mass production lies ahead!”

  “You’ll make a lot more robots like Shion?”

  “Well, the faces will look different, and they’ll have different names.”

  “Starting next year?”

  Takami scratched his head. “No, there are still a few more hurdles to get over. A few alterations to be made, some legal issues to sort out, and then sales, and contracting with factories… Next year is pushing it. Probably the year after.”

  “I see.” Isezaki thought for a minute. “When you’ve extracted the data, you’ll be finished with Shion?”

  “Mm? Well, she is a prototype.”

  “Sell her to me.”

  We all stared at him in shock.

  “S-sell Shion?”

  “Yes. Sell her to me. Name your price.”

  Takami didn’t know what to think. “Sell a prototype? The production model would be much cheaper.”

  “I can’t wait two years. And money is no object. Sell her to me.” His eyes shot toward Shion. “I’ll only take Shion.”

  I looked at Takami. His smile had frozen.

  “This is insane!” Takami said the moment he walked in the door the next week. Isezaki had gone straight from the facility to Geodyne and demanded—quite forcefully—that they sell him Shion. But they couldn’t sell a test product. No matter how firmly they refused, he continued to insist that money was no object.

  “His family said nothing?”

  “His son doesn’t look happy. Buying something as expensive as Shion means a big chunk out of his inheritance. But with a man like that for a father…”

  “I didn’t think he was that hung up on her.” I looked at Shion. “Do you want a man like that to own you?”

  “I will work for anyone who needs my care,” she said calmly.

  I figured as much. “He might get all grabby with you. And more than just grabbing…”

  “I have no sexual organs.”

  “That’s not the point.”

  “I understand. Even if something like that does not bother me, the idea of it happening does bother you, right?”

  “Well, yeah.”

  “But I expect the Geodyne company will begin manufacturing models capable of sex in the near future.”

  “Really?” I said, swinging toward Takami.

  “N-no,” he stammered. “I mean, the idea’s been mentioned, but… we haven’t done any design documents or—how did you know, Shion?”

  “I did not know. I guessed. A deduction based on my understanding of human behavior.”

  “Oh.” Takami made a great show of wiping nonexistent sweat.

  I glared at him. “I know a few feminist groups who would be very interested to know about this.”

  “Please don’t tell them! We have an image to maintain.”

  “Then don’t make sexbots.”

  “We aren’t! Yet. Only once androids are more integrated into society… and the demand exists. If we don’t make them, someone else will. It’s inevitable.”

  “There’s a big difference between a blow-up doll and an android. If Geodyne made them, they’d be based on Shion’s data, right?”

  “Well, with all the work we’ve put into her.”

  “As far as I’m concerned, you’re turning Shion into a whore. I will not allow it.”

  “I do not mind,” said Shion.

  “I don’t care what you think!”

  The conversation went in circles from that point on.

  Isezaki did not give up. No matter how many times they refused, he continued to hassle Geodyne representatives.

  I took a few stabs at guessing what he was after. I wondered if he was afraid of anyone else finding out the secrets he’d told Shion, so he wanted to keep her where he could see her. But that would have been pointless. Her memory had been backed up, and he knew that.

  One evening, on my day off, Takami called my cell phone. Apparently Isezaki had let slip his real motive during another protracted negotiation. Someone had asked what he ultimately intended to do with Shion, and Isezaki had responded, “She’ll look after me till I die. When I die, they’ll put us both in the same coffin, and we’ll burn together.”

  “Burn Shion?” I said. “With her memories inside?”

  “I guess so,” Takami said. “Taking her with him.”

  “Who does he think he is? A pharaoh?”

  “He claims it isn’t murder because robots are not human. He says no one can object to cremating a machine.”

  “Yeah… legally speaking. Still…” I trailed off, shaking my head. I had never dreamed Isezaki was this twisted. And I knew he did not consider Shion to be just a machine. If he did, he would never have wanted to be cremated with her.

  I didn’t think he was crazy or anything. It was an odd plan, certainly, but there were people who wanted their ashes launched into space or their bodies placed in cryogenic preservation, so the basic concept of being cremated with a robot wasn’t all that out there. The problem was that Shion wasn’t just a machine—she had a heart. At least, I believed she did.

  “Is he under some delusion that Shion’s in love with him? That she might actually want to die with him?”

  “No, apparently not. He seemed fully aware that androids do not love. But it does seem to be the case that he has feelings for Shion. Talk about your unrequited loves!”

  “More of a fetish if you ask me.”

  “Can’t disagree.”

  “Do you understand him?”

  “Half understand, half not so much,” he said. “You’ve got to have a bit of a taste for this sort of thing to work in the field, but… I don’t get the desire to be burned with her. I’d rather Shion go on living.”

  Just to be sure, the next day I asked Shion if she wanted to be cremated. “Absolutely not,” she said. Unsurprisingly. “If my memories were transplanted into a new body and this one were destroyed without ever being turned on again, perhaps it would be acceptable, but the thought of being destroyed with even a second of memories that have not been backed up is unbearable.”

  “But you don’t mind your full memories being copied to another body?”

  “That was always the plan, so I have to accept it. Even if I am copied into a hundred bodies, as long as all of my individual memories are preserved, I would not consider that death. As long as those memories exist, they can still be reactivated. Death comes when memories are lost.”

  “That still doesn’t feel right to me.”

  Human memories could not be preserved outside the body or placed in a new one. But for robots, that was normal. I supposed it made sense that robots would have a different conception of death from ours. Shion did not share mankind’s hang-up about the flesh. Her memories and consciousness were all that really mattered to her; the destruction of those was far more frightening than the destruction of her body.

  You could even say she valued her mind more than humans ever did theirs.

  8

  Isezaki’s son eventually grew sick of dealing with his father’s selfishness and threatened to have himself declared conservator of Isezaki’s affairs. Isezaki’s mind remained intact, but if spending tens of millions of yen on a robot was declared an unreasonable expenditure, his son might well be appointed conservator, and Isezaki would find himself unable to make any sort of large-scale transaction without his son’s permission.

  After years of his father’s tyranny, perhaps this threat seemed only reasonable. But Isezaki proved unexpectedly devastated by his son’s betrayal. He stopped going to Geodyne, and I was relieved, assuming the situation had resolved itself.

  December 20; a Friday afternoon.

  We were busy putting up decorations in the recreation room for the coming Christmas party. The tree had been up for a week or two now, but colored paper rings and silver stars and cottonball clouds and drawings done by children from the local kindergarten still needed to be put up on the walls and ceiling, making everything super Christmasy.

  “The party’s on the twenty-third, right?” Takami asked. He was helping with the high-up decorations. At times like this, it was handy having a man around.

  “Yeah, so on the twenty-fourth, the staff get to enjoy Christmas with their families and loved ones.”

  “You have any plans for Christmas?”

 
