Eleven Graves

by Aman Gupta


  “What if the patient wants Procedure 1 because, for him, staying alive is better than any pain?” asked Jay.

  “If he’s adamant, then we go to Procedure 1,” said the girl.

  “So the patient says 1, doctors say 2, parents say 3. If you’re an AI, whose emotions should you take into account? How do you learn whose emotions won’t divert you from your ultimate goal, which is to deliver success?” asked Jay.

  “Parents are responsible and care most for their kid, so we go with them,” said another student.

  “What if the patient’s not a kid? Let’s say he’s your age. 20-21,” said Jay. “Remember we are long past the discovery stage. Your AI is not responsible for that. It’s responsible for using the available prognosis to find out the best possible treatment. If your AI is a Full Ethical Agent, like human beings, with free will, consciousness and morality, how does it behave? How do you want it to behave? If you don’t know how you want it to behave, how do you know what’s right and what’s wrong?”

  “I think it should show us all the possible answers and we can select the best one,” said a student.

  “Who’s ‘we’? The ultimate aim for a healthcare company investing in AI is to reduce the errors caused by humans. They want AI to succeed where humans are likely to fail, to stop doctors from making wrong decisions. If you aren’t using it for that, then why would you go for AI in the first place?” countered Jay.

  “I think it depends on the industry. Some are more important and critical than others,” said a student.

  “That’s true,” said Jay. “If AI’s handling weapons and defense, everything changes.”

  “I don’t think so. AI’s job is to be objective. It’s a machine, a program, and giving it human traits is a fool’s errand,” said Sam.

  “But then we could never use it for applications where lives are at stake,” said a student.

  “Maybe that’s a good thing. The last thing we should do is to relinquish control of our lives,” said Sam.

  “That wouldn’t stop a corrupt foreign power. And to overcome their advantage, other countries would also be forced to use AI for such applications,” said Christina.

  “What if we added a few rules that safeguard human interests and they always have to follow them?” asked a student.

  “You know what they say about rules,” said Jay.

  “Right. But if we made sure they couldn’t be broken?” said the student.

  “Like?” asked Jay.

  “We add a rule that AI cannot take a decision that costs human lives,” said the student.

  “That’d be useless for the military, as whatever they do will cost lives. Some of theirs, some of their enemies’. And in our healthcare instance, more often than not, the odds say it will cost lives. So the rule will stop the AI from functioning to its full potential. Either it will self-learn and discard or bend the rule, or it risks becoming useless,” said Jay.

  “Let me ask – how do we, as human beings, decide who lives or dies? What’s right or wrong? When did our choices become so binary?” asked Jay.

  “The greater good is the right choice,” said Christina.

  “Greater good is an extremely slippery slope. Trust me, you don’t want an AI to make greater good decisions,” said Jay. “What else?”

  “Past,” said a student. “The wisdom we gain from hindsight.”

  “Humans have been around for many millennia. Yet we find new ways to kill each other. You’d think we’d have learned by now,” said Jay.

  “Legal,” said one more.

  “What’s legal today might be illegal tomorrow. And vice-versa. Legality is a fragile state of affairs,” said Jay.

  “Morals,” said another. “If it’s morally and ethically right, then it’s the right choice.”

  “That’s the utilitarianism debate again. Also, as an example, assume you’re driving at high speed with a friend in the morning smog, and are about to run into a lethal road divider. On your left is a car with a family inside it. On your right is a pedestrian casually strolling on the sidewalk. What do you do? Change directions and kill a family or an innocent pedestrian? Or crash into the divider and kill your innocent passenger? Not every action can be measured on a moral and ethical scale,” said Jay.

  “Experience,” said Sam. “We learn from our mistakes and take steps to avoid their repetition. Sooner or later, we’ll run out of mistakes.”

  “It might be too little, too late,” said Jay.

  “How come your paper isn’t available for download?” asked a professor. “I tried. It said its access is restricted to proven researchers and engineers in the field.”

  “Yeah, their decision. They conveyed the same to me,” said Jay. “They also said I can’t tell anyone about some aspects that I covered in there. But I’ll discuss as much here as I’m allowed to.”

  “What did you think should be done?” asked Sam.

  “I suggested the elimination of binary thinking in AI. I would suggest the same to human thinking too, but I doubt anyone would follow it. It’s too easy to think yes or no, good or evil, left or right. More complex answers aren’t accepted by the crowd,” said Jay.

  “How do we do that?” asked Sam.

  “I suggested an approach called the Tethered Psychology Network. Here, we tether the emotional intelligence of an AI to a complex network made up of incorrigible guidelines; experiences, as one of you pointed out; a group of silent objectives, which aren’t part of the original problem but rather the new problems that the solution would create; a self-learning extrapolation algorithm; and a human element, hopefully an observer or even the creator, whose judgment the AI ranks above all others. This complex network can be the AI’s conscience. All our other remote AIs then have their EI tethered to this central system. As long as we control the conscience and maintain the accord, we control the central AI, which in turn controls the remote AIs. The paper went into much more detail, and also covered the applications of predictive analytics through the entire system,” said Jay.

  “Do you think it’s possible? Would you consider being a part of one if someone asked you?” asked Sam.

  “I hope not. Theory only takes you so far. Its existence would be a game where no one wins in the end,” said Jay. “That’s all I am allowed to say about the paper.”

  Everyone thanked Jay for the lecture and congratulated him.

  It was probably the last time in his life that he would get a standing ovation for his work, with smiles on their faces, from over a hundred people. If only he knew that.

  Jay had a certain élan which scrambled people’s minds into accepting faulty arguments as flawless masterpieces. It always came to him whenever he needed it.

  Jay and Sam became good friends over the next few months. Sam warmed up to O’Donnell eventually.

  They would usually hang out in Sam and Brianca’s room, playing LAN games and hacking into secure networks, with Jay sometimes intentionally triggering fights between Sam and Brianca.

  “What happened to you making new friends?” asked Jay.

  “I think two is enough for now,” said Sam.

  “Cute, you think we’re friends,” joked Jay as he lay down on the bed while Sam was working at her desk. She threw an empty coffee cup at Jay without looking.

  “My eyes, my eyes,” shouted Jay, in pain.

  She turned around in horror to look.

  “Ha..got you,” said Jay, juggling the coffee cup.

  Sam came over, grabbed a pillow and smacked it on Jay’s face. “Jerk.”

  Jay and Sam got into a pillow fight which lasted a few minutes before both of them got tired and lay on the bed, looking up at the ceiling with their legs pointed in opposite directions. Jay propped his feet on the wall behind the bed, while Sam used the table in front of the bed as a cushion for hers.

  Jay turned his head towards his right, facing Sam, and asked, “Last night, I saw you reading an email, and crying a little. What did it say?”

  “Nothing,” said Sam.

  “Come on,” said Jay. “Tell me.”

  “It was a letter from my mom. Apparently, I got caught sneaking into a company’s systems last year. They have officially filed charges. My mom’s scared,” said Sam.

  “What? How serious are the charges?” asked Jay.

  “They aren’t backing down. Want to make an example out of me,” said Sam.

  “Why didn’t you say anything?” asked Jay.

  “I didn’t want to bother you,” said Sam.

  Jay grabbed a pillow and smacked it on Sam’s face.

  “I know, I know..I should have said something,” said Sam.

  “Is that what you were doing just now?” asked Jay.

  “I was deleting all traces of my access logs via Daulton’s network. I forgot to use Tor this one time in a hurry. Now, I’m sure I’m in the system. If the company lawyers find any evidence that I’m still doing this stuff, they might be able to make their case a lot stronger,” said Sam.

  “Are they coming here?” asked Jay.

  “Maybe tomorrow, I think,” said Sam.

  “We’ll handle it, don’t worry. I’ll delete all the traces,” said Jay.

  “Some are on the campus server and I can’t get in remotely,” said Sam.

  “Then I’ll go in there and do it tonight, when no one’s around. You trust me, right?” said Jay.

  O’Donnell came into the room with the news that the Daultons were coming over for a seminar on Applications of Neurotechnology.

  Sam rolled to her right, kissed Jay on his forehead and whispered, “With my life.”

  She rolled out of bed, as did Jay.

  “This is perfect. The entire campus would be in the hall, including most of the security guards,” said Jay.

  “Wait, what is happening?” asked Brianca.

  “Sam and I have some work to take care of, in the server room,” said Jay.

  “Yeah no, they lock down the campus when this happens. There’s no way that you’re accessing the server building until the Daultons leave,” said Brianca.

  “We have to try,” said Sam.

  “Let’s go,” said Jay.

  They all left the dorm building and sneaked past the security guards at the front gate, who were directing all students to the hall.

  “What about CCTVs?” asked Sam. “They are on their own private network, and the server is offsite.”

  “New plan. We first need to go to the CCTV room and loop the cameras that cover the server room before we go there. Once we get out, we’ll reset the CCTV footage. They’ll think it was a glitch,” said Jay.

  “Okay,” said Sam.

  “I need to be in the hall though,” said Brianca. “I’m the new student president. I have to deliver the inaugural address.”

  “To go to the CCTV building, we’ll need to cross the corridor leading to the hall. Help us distract the guards,” said Sam.

  Brianca nodded.

  Evading cameras through carefully timed movements, they managed to get near the corridor. Brianca went forward to distract the security guard standing near the stairs.

  “Hi, I’m the new president here. Could you tell me where the hall is?” said Brianca as she kept walking and moved to the guard’s far left, forcing him to turn around.

  “You’re the president. You don’t know where the hall is?” said the guard.

  “You’re right. I should know,” said Brianca as she started crying to buy Jay and Sam some time.

  The guard consoled her, which allowed Jay to run up the stairs behind the guard’s back.

  Sam ran after Jay, but the guard turned around in time.

  “Oh hey, Brianca, I’ve been looking for you everywhere. Let’s go. Everyone’s waiting,” said Sam.

  “Oh yes. Let’s go,” said Brianca.

  “Did you just call me Brianca?” whispered Brianca.

  “Don’t get too attached. Just keep walking,” said Sam.

  “It’s just him, now,” said Brianca.

  “Hope nothing happens to him,” said Sam with a concerned face.

  Being late to the hall, Sam wasn’t able to grab a seat near the exit and was forced to sit in the middle rows. Brianca had gone around backstage, as she had been called to deliver the next address. The seminar had already started. The commotion Sam caused getting to her seat caught the attention of Anthony Arnold, Victor Daulton and Sarah Daulton.

  The security guards on the CCTV floor had gone off to cover certain spots on the ground floor, so there were only five guards left on the floor.

  Soon, Brianca had delivered the address, and it was Sarah Daulton’s turn to speak. Sam put her head down and called Jay to inform him of her situation, but Jay couldn’t answer as he was busy sneaking past the remaining guards. After a few close calls, he managed to get close to his destination.

  After successfully entering the main CCTV room, Jay saw the missed calls and called Sam back.

  Sam’s phone rang at full volume in the middle of Sarah’s speech, as she had failed to properly connect her phone to her earpiece. It took less than ten seconds for everyone to triangulate the source of the sound to Sam, who looked around nervously. Sam was already on Sarah’s bad side for the earlier commotion, and the call infuriated her further. Sam had already picked up the call by reconnecting her wireless earpiece, covered by her brown locks, but the damage was done.

  From the CCTV room, Jay was watching all this on the central monitor.

  “Please, what’s your name?” said Sarah on the mic.

  “Go ahead, tell her your name, Sam,” said Jay, on the phone.

  “It’s because of you,” murmured Sam.

  “What was that?” asked Sarah.

  Sam got up and said, “Sam… Samantha.”

  “Bye, Sam, it was nice knowing you,” said Jay on the phone.

  “Are you talking on the phone?” asked Sarah.

  “No, why, do you want me to?” said Sam.

  Everyone laughed. Victor Daulton chuckled.

  “Excuse me?” said Sarah.

  “Do you want me to get you out of this? Just nod. I’m looking at you via the cameras,” said Jay.

  Sam nodded.

  “Say: I was passionately listening to your speech, which doesn’t really address any potential damage that real-world applications of neurotechnology could cause,” said Jay.

  Sam repeated after him.

  “I’ll get to that,” said Sarah. “You can sit down.”

  “Don’t hang up,” said Jay.

  Sarah continued with her speech about Verati and how it was changing the lives of millions of people through its advanced neurotechnology programs, with a focus on its initiative of helping even its competitors by giving them access to Verati’s technology. She kept an eye on Sam most of the time.

  “She’s looking at you,” said Jay.

  “Have you created the loop?” muttered Sam.

  “Still working on it. The security isn’t as poor as I originally thought. The firewalls are hard to penetrate,” said Jay.

  Sam laughed.

  “What?” asked Jay. “Oh, shut up!” he continued.

  “What’s so funny?” asked Sarah.

  “Nothing, I..” said Sam.

  “Please share..” said Sarah.

  “What do I do now?” said Sam as she got up with her head down.

  “Repeat after me.. How come Verati is doing so much work in neurotechnology and yet files so few patents?” said Jay.

  As Sam repeated, Victor Daulton looked at Anthony Arnold as they smiled at each other.

  Sarah Daulton was confused.

  “We don’t believe in patents. We think they restrict growth,” said Sarah.

  “And yet Verati applied for 2,000 patents in the last year alone, with zero patents in neuroscience?” said Jay as Sam repeated everything he said.

  “Sam, right? I think you have your facts wrong,” said a flustered Sarah.

  “I haven’t. Maybe you don’t really know about it, as it doesn’t look like your field. So that’s okay,” said Jay.

  As Sam repeated, Sarah got offended and blurted, “I don’t need an idiot like you telling me what I know and what I don’t.”

  “Sorry, I didn’t mean to offend you. Perhaps I’ll get my answers when others speak,” said Sam, on her own.

  “Ouch, that’s a low blow,” said Jay. Sam smiled.

  Victor Daulton grabbed a mic and said, “No, you’re right. Your facts are accurate.”

  “Look at that! Daddy Daulton likes you,” said Jay.

  Sam could barely control her laughter.

  She put her head down and whispered, “Don’t make me laugh.”

  Anthony Arnold said, “No, we didn’t apply for patents in neuroscience because that’s my department, and I feel that I should help drive inclusive growth to bring about change in people’s lives.”

  “Ask, what about military applications?” said Jay.

  “No..” whispered Sam.

  “I cannot remove the traces from the server alone. And the only way you’re getting out of there is if they kick you out, politely. I’m giving them the opportunity to do so,” said Jay.

  “What about military applications?” said Sam, out loud.

  “Currently, Verati isn’t handling any military applications,” said Anthony.

  “Do you plan to?” said Jay, as Sam repeated everything that Jay said. It was like Jay and Anthony were talking.

  “Not in my lifetime,” said Anthony.

  “Then, I hope you live a long life,” said Sam.

  “Thank you. You’re too kind,” said Anthony.

  “Because my concern wasn’t ethical. It was more technical, but since you aren’t doing that, I’ll just shut up,” said Sam.

  “No, we aren’t. But I’m always open to learning. What technical concerns?” asked Anthony as Victor Daulton looked at Sam.

  “When your AI unlocks the gut-brain, your fail-safe is likely to collapse,” said Sam.

  “We wouldn’t be unlocking that. It’s still a mystery to the world,” said Anthony.

  “Oh okay. Clearly, you aren’t doing as well as Sarah Daulton, daughter of the legendary Victor Daulton, who has graced us with his presence despite his busy schedule, has claimed,” said Sam.

 
