Human

Human Page 7

by Robert Berke

"Ach, my little working man. Just like your papa. I make you nice plate and bring it up. You make me proud."

  Sharky climbed the stairs to his bedroom feeling more and more exhausted with each step. He lay down on his bed and looked at the ceiling, his mind awander. He knew exactly how to get Smith on the Internet. He had worked out the details on his way home. No challenge in that. But he knew that was not what Bayron had wanted him to think about.

  Was it possible that Bayron hadn't considered this before? Was it possible that this issue had only just crossed his mind? He himself had thought about some of the wider implications of the project, but he had always felt comfortable in the knowledge that the ethics of the situation were Bayron's problem.

  Now Bayron was looking to him for ethical guidance. He didn't like it.

  Sharky opened the drawer on his nightstand and took out a small pipe. He patted down the marijuana in the bowl and decided there was enough there for one good hit. He lit the bowl with a disposable lighter, and as he did a small flame shot up, reminding him that he was reaching for heaven.

  No sooner had he exhaled than the door to his room opened. His mother came in with a plate of tacos and a glass of Coca-Cola. "This is what you call working?"

  "Just needed to relax, mom."

  "You work too hard already. Sit up and eat."

  He sat up in bed and took a bite out of one of the tacos. That made him feel better. But his mother could tell that her son's mind was unsettled.

  She sat on the edge of his bed and asked, "Is everything good at work?"

  "Yeah. Dr. Bayron's great, you know, but... mom, let me get your opinion on something."

  "The genius wants my opinion now."

  "If you had the power to create something that you couldn't control, would you do it?"

  "I created you." She gestured towards the remaining wisps of smoke that lingered below his ceiling light. She knew it was marijuana. She wished he didn't do it. "I don't do too good controlling you Sako, do I? But you always make me proud."

  "But you never really did try to control me, now did you?"

  "Control what, Sako? You was always a good boy." She pinched his chin as if he were an infant.

  "But you didn't know that. In fact, you really had no idea how many temptations there are out there, you didn't grow up here. How could you just let me do my own thing not knowing what I could get involved in?"

  "You think I was a bad mother?"

  "No," Sako replied with an apologetic tone, "you were the best. But you know you took a big chance bringing me up in this country where you didn't know the people or the culture. I could have gone wrong so easily."

  "I never worried, Sako. You come from good people and good people is good people. Should I have locked you away? Chained you to a radiator?"

  "A lot of my friends weren't allowed to leave their houses or go to the store alone, but you always let me go."

  "And you always came back. Why you bring all this up now? Does this have something to do with your artificial intelligence programs? Did you make something important?"

  Sharky was not surprised at his mother's guess. Until he joined Bayron's team he had been working on an artificial intelligence project for SmithCorp under a military contract. He had to obtain a security clearance for that work and wasn't allowed to tell his mother any of its details. When he joined Dr. Bayron's team, he was sworn to an even higher level of secrecy. His mother did not know that he was no longer working on artificial intelligence but on artificial life itself, and he couldn't tell her. He was glad she had used him as an example. It allowed him to get her opinion without having to tell her what he had really been a part of.

  "No, ma," he said, "Real artificial intelligence is still a long way off. But from a philosophical perspective, if we were to create an artificial intelligence, it would be at least a little like having a child."

  Sako's mother chuckled. "Oh, I suspect the process would be very different."

  "But seriously, ma," Sharky continued, "let's say I did create a kind of artificial intelligence. I think it would be very much like having a child. You know, you send it off into the world, you don't know what its going to learn, you don't know what its going to do. Yet people do that all the time. So I mean, look: here we are, it's a country none of us have ever been in before and you say, ‘go, do, learn. Do what you're going to do, learn what you're going to learn, be whatever your going to be.' That's incredibly dangerous really, if you think about it."

  "Dangerous? No, not dangerous. Not really. There's more dangerous then you give a smart boy a little freedom to roam around. There's lots of smart boys who sit home and study all day, locked in, no experience in life. Look at your papa. In Armenia he had good job. Good, good job. Every day he get up, go to job. He didn't have a choice. They said, "this is your job. This is where you go, this is what you do. When we first get here, you don't remember, you were a baby, he was very depressed. Drinking a lot, like you with your pots. He said to me ‘what do I do? I don't know what I can do. I don't speak the language, no one wants me for engineer. I don't got papers to work.' You know what I tell him?"

  "Yes, mom," Sharky said knowing she was going to insist on telling him again. "You told him, ‘this is America. You do what you want to do.'"

  Sharky's mother nodded her approval. "And look what he did. He became a big success, your papa. A big, big success."

  "I know you like telling me that story, mom, but it doesn't really have anything to do with anything. What does that have to do with sending a child out into the world without knowing the consequences. What if your child turned out to be... I don't know... Hitler?"

  "Hitler was always Hitler. Stalin was always Stalin. Your papa was always your papa. I keep you in, I let you out. You're the same person inside. I tell you why I tell you about your papa again. Because he wasn't a man -- wasn't allowed to be what I knew he could be until he came here. You trust your child with the world and the world with your child because otherwise is like death. In Armenia, it was like death. A dead flower."

  "I think it has to be different when you're talking about an artificial intelligence that learns on its own though. You just can't control what it's going to learn and how its going to use its knowledge"

  "Just like my baby, I didn't know. Look, when you first started this nonsense, you said that a true artificial intelligence would be indistinguishable from a man. Do you still believe that?"

  "Of course," Sharky replied, "that's the Turing test. Its the only reliable test for artificial intelligence that anyone has proposed so far."

  "And if its indistinguishable from a man, then it's a man. Like the quack and the duck. That's what you told me, Sako."

  Sako nodded. He was neither surprised that his mother remembered the duck metaphor nor that she had somehow managed to mangle it in the process of storing it in a brain that still thought in a foreign language.

  "I don't know nothing about artificial intelligence, but I know a little about men." She chuckled before becoming serious. "Here's what I know. You take a man and cut off his potential, you might as well kill him. He is worse than dead. In Armenia, many brilliant men like your papa couldn't do what they wanted to do. They don't let these men get success because they afraid. If they not afraid, then Armenia would be America and we wouldn't have to run away. You make this artificial intelligence and you afraid of it--you better as well just not make it. Its not fair to you, its not fair to him.

  "Your papa... If he hadn't gotten us out, they would have killed him, and if they hadn't killed him he probably would have killed himself. And look what he did, after we came here. I don't know from artificial shmaritifial, but I know that to separate a man from his potential is the same as to kill him. And I know that if you save one life, its like saving the whole world. That's all I know."

  She tapped his knee lightly, stood up, and walked toward the door to his bedroom, gathering scattered laundry from his floor as she did so. "Now I got things to do, Mr. Philosopher. Philosophy is for men of leisure. Me, I don't get no leisure. I got things to do." She left Sharky alone with his thoughts. He knew his mother was brilliant in her own way and in her own right; he just didn't always understand her. In her mind, everything came back to how bad things were in Armenia and how great they were in America. She could turn any conversation in a direction to make that point. He finished his tacos and fell asleep.

  Sharky awoke to the sound of his cell phone ringing. He answered, "Sharky."

  "Sharky, Bayron. Where are you? We've got work to do."

  "Yeah, I'm not coming in."

  "Are you feeling alright?"

  "Yeah, but... I... I don't know. I've really got some stuff I need to think through. I mean, you know, there are implications and I want to think about it a little. I really... well, you know... I really... you know, I just don't know." Sharky stammered and fumbled to frame his thoughts. He thought he might quit. His mother was right about the fear, though. He simply wasn't sure how he felt about continuing with the project, especially now that it seemed time to unleash it on the world.

  "Do you want to talk about it?" Bayron asked.

  "No. You've really been great to me and all, but...I just don't know if I feel good about this anymore. I guess, I'm having reservations. You've got to let me sort it out in my own head."

  "Sharky, you're a valuable member of this team. I'm not going to let you just up and walk away without a fight. We're making history here."

  "Or destroying the future, doc."

  "Take whatever time you need, you're always welcome here. My door is always open to you Sharky."

  "Thanks, doc. I really appreciate your understanding." Sharky said with genuine gratitude for the courtesy.

  Sharky felt like he was waiting a long time just to hear Dr. Bayron say ‘goodbye' or ‘see you soon' to signal the end of the phone call, but just as he decided that Dr. Bayron wasn't going to finish the conversation with one of those trite niceties, the doctor spoke again, very quietly and confidentially, as if he had dimmed the lights and pulled the shades for fear of being overheard. "Listen, Sharky," he said, "don't think that I haven't thought about it myself. I've lost a lot of sleep on some of the ethical issues here. I can't force my opinions on anyone."

  "What is your opinion, doc."

  "You want to know the truth?"

  "Yes, of course," Sharky said.

  "I think the risks may be unquantifiable."

  "So you want to keep the system closed."

  "Yes," Bayron said. "Unless we can quantify and control the risks, we need to keep the system closed."

  "Do we have the right?" Sharky asked.

  "No, but we have the power." Bayron said. "At least until we can come up with a paradigm for quantifying the risks. And that may not be possible. That's where I need your help."

  "I don't know," Sharky said. "I still need to think about it."

  After the cordial goodbyes Sharky had expected earlier in the conversation, he hung up the phone.

  Bayron didn't have time to think. Myra was paging him. Smith wanted to see him. He cleared his head and walked from the lab to the infirmary.

  "So, will you get me online so I can get back to running my company?" Smith asked.

  Bayron was struck by how tinny and mechanical the voice sounded. He hadn't noticed that before, probably because he was so thrilled that it even worked at all. He would assign the task of getting the voice to sound more human to one of the technicians when he got back to the lab.

  "Yes, it just might take longer than you want." Bayron said.

  "Why?" Smith asked.

  "Because our team is down by one very talented engineer."

  "Who?"

  "Sharky."

  "Oh, no. That's a shame. He's quite a bright young man. What happened?"

  "I asked him to engineer a way to put you on the Internet and, frankly, I think he was overwhelmed by the implications."

  There was a moment of silence before Smith spoke. That moment of silence, however, was full of activity in the artificial brain. Because he could "see" the electronic data being input through the microphone and because his memory was now perfect, Smith could compare Bayron's voice to his voice in other conversations. It was different. Despite all they had been through together, this was the first time Bayron sounded stressed and uncertain.

  The advantage of a mechanical voice, Smith thought, is that no one can know what you're really thinking. Smith tried to make his voice sound innocuous and trusting, but it was still just tinny and mechanical. "You know what this means, Doug?"

  "No." Dr. Bayron replied, "At least I don't know what you think it means."

  The speaker from which Elijah Smith's voice emanated crackled a little before the words came out. "It means he's a good man and the right one for the job. Let me worry about getting him back. You worry about getting me on the ‘Net."

  CHAPTER VI.

  Hermelinda came every day to tend to her patient, even though that was getting more and more difficult as she got bigger and bigger with the pregnancy. She never told anyone who the father was, though she couldn't keep the pregnancy itself a secret. The engineers and scientists in the lab remembered well enough seeing the orgasm on the monitor, and they simply accepted the fact that the baby was Smith's. Nonetheless, Dr. Bayron had instructed them to keep that information confidential until or unless Hermelinda herself was comfortable making it public.

  Bayron and Smith agreed that, until they could convince Sharky to come back to head up the Internet project, Smith could have direct access to an e-mail server. Bayron had one of the other engineers construct a simple interface that would allow Smith to compose, send, and receive plain-text e-mails via SMTP through a dedicated server. With a few simple modifications to an old-fashioned, text-only e-mail program, Smith could compose and send messages using only his verbal fingerprints. The process took only two days.

  The ability to send e-mails to the outside world was liberating to Smith. Because the mail program recognized the fingerprints of all the words he thought, he could literally type as fast as he could think.

  But even stranger than that, he realized that he could see what he spoke as well. Sometimes he felt like he was seeing what he hadn't even thought of yet, as if the machine knew what he was thinking before he did.

  He could also 'hear' the e-mails coming in far faster than he could read them. He told Dr. Bayron about this.

  "My guess is that your senses are converging because when you're in the e-mail program all the information you receive is bypassing your eyes and ears and going into your memory as pure data. Even what you think you see, you're not really seeing in a traditional sense. Your brain is not decoding reflected light off an object. It is decoding a digital representation of an object. So the digital representation gets written to the memory, and the mind sees it as an image instantly. Because this process happens instantaneously, it probably seems to you that your brain has developed the ability to recognize patterns in the digital 'pure' form before it is processed into pictures or words. That's really kind of cool if you think about it. It's kind of like having a new sense. Now, in addition to having a memory that can't forget, you can actually see, talk, and hear binary code. Over time, who knows what you'll be able to do. Keep me apprised, will you?"

  "Of course, doctor." Smith's computer voice droned.

  Smith was a prolific e-mail writer. He was reconnected with the world and was desirous of interacting with it as much as possible. Everyone he knew got e-mails from him every day, as if he were trying to spread his lifetime of wisdom to as many living people as possible.

  True to his promise, Smith embarked on a campaign to bring Sharky back to the lab. He sent Sharky frequent e-mails, and these often had a philosophical, even spiritual tone. Smith wanted Sharky's feedback on esoteric topics such as whether a machine can bear a soul. He was eager to understand Sharky's reservations, to help Sharky understand them himself, and to help him overcome them.

  "To the extent that a soul is a function of the mind's processes," Sharky reasoned in one of their exchanges, "then the replication of a mind on a machine would necessarily port the soul to that machine too. That assumes, of course, that the soul is simply a meta-intelligence capable of synthesizing all of the various intelligences of the mind into values and beliefs. That would make the soul a mere aggregator and organizer of collected information. Like a book written from all the information in the mind."

  Smith replied that "the opposite could also be true. That the soul may be a mediator of all information. After all, a fact is not a fact until the mind tests it and verifies it. Pure information thus does not become human intelligence until it has been parsed and filtered by the soul. If so, then all intelligence is a distorted reflection of truth, reflecting in equal parts the objective facts and the filter through which it has passed. The soul would not so much be the book, but more like the pen with which the book is written."

  "To extend the metaphor, then," Sharky wrote back, "the soul could be the paper on which the book is written. That perfect blank page awaiting the marks of life. In that case we would have to say that information gives color to the soul rather than the soul giving color to information. Then the soul would be more akin to the blank pages on which all that is seen and heard is written."

  "Which is simply another way of saying that the soul is perfect until it becomes corrupted by knowledge, rather than the other way around wherein knowledge is accepted as being perfect until corrupted by the soul. Looks like we've reasoned our way back from Einstein at his blackboard to Adam and Eve in the Garden." Smith reflected.

  "I can only deal with one metaphor at a time, Mr. Smith. I have to go."

  Sharky felt obligated to engage Mr. Smith with these e-mails since, even though he had not shown up for work for weeks, he was still getting a paycheck.

  Occasionally, though, the e-mails were not purely philosophical. Occasionally, they addressed pressing issues, genuine concerns, or tangible problems. One afternoon, for instance, Sharky received the following missive from Smith, which brought almost all of the farfetched concepts they had been corresponding about into the harsh light of reality.

 
