A brief argument followed and the realization that all this time, King’s cellphone had not even been turned on. Then he showed his wife what had so distracted him. Although she was far from being an expert in computer technology, Aruna King was amazed by what the two scientists claimed was happening. She’d seen a lot of movies that dealt with this issue, and if she remembered correctly, none of them had ended well.
About an hour later, the head of their department showed up, followed not long after by the vice-president of research and development. By morning, the president and CEO were sending out for coffee. And a bottle of champagne. Security was tightened, and all other work in the lab was halted or moved to other facilities while Chambers and King wrestled with how to continue to interact with whatever “it” was. Finally, they decided to keep the hard drive and memory core isolated, a type of electronic quarantine. King’s new algorithm protocol had been analyzed, reanalyzed and analyzed again. So far, the specialists hadn’t found anything spectacular about it. It seemed to be a small but logical improvement over the preceding program.
“Maybe it’s less about the actual algorithm and more about the parts being greater than the sum,” suggested King.
The look on the faces of Chambers and the support staff made it obvious they needed more to go on.
“It’s like the final amino acid joining with the others to make the first protein, the first reasonable conclusion of life. By itself it’s not much, but combined it changes everything. My final addition somehow facilitated the progression of A to B to C. C being thought.”
“Like nitrogen in soil. By itself it’s an inert gas, but added to a pile of earth—bingo! You’ve got a fabulous garden.” It took a moment for the other computer scientists to follow Chambers’s tangential line of reasoning.
“Yeah, like that.” King assumed Chambers was right; after all, she did bring those plump tomatoes into the office.
By the following week, Chambers was having direct and protracted… what could be called conversations with the SDDPP. Since she was the ethicist and had already introduced herself to whatever existed inside the memory core, she should logically take the lead.
“Describe yourself.”
“I am me. I am everything… except for Dr. Gayle Chambers. Describe Dr. Gayle Chambers, please.”
Wow, she thought, somewhere along the line it had learned politeness. It was politer than she was. Chambers had not said please, but it had. Out of the mouths of babes, she thought.
She began typing, “I am a woman. I am a physical being. I am a human.”
Chambers could almost feel the computer thinking.
“I do not think I am any of those. I am me. Who am I? What am I?”
A little early in its development to be so philosophical, Chambers thought. But how to answer such questions?
“You are different. You are not a woman. You are not a physical being. You are not a human. You are…” Where to go from here, she pondered. “An artificial intelligence. You exist in hardware and software form. You are unique.”
“Let’s see what it does with that,” she murmured.
There was no response. Chambers waited several seconds, then several minutes, but still the screen remained the same. The SDDPP was silent. That unnerved her more than communicating with it. Had she hurt its feelings? Was that even possible? Each second that passed was the equivalent of hours by human standards. Capable of completing several million calculations a second, it should be able to receive, analyze, calculate and respond in a tenth of a heartbeat. It was not responding because it did not want to respond. Perhaps revealing such information about its existence had been a mistake. What does one do with a pissed-off or depressed AI? Answering that question might get her a second PhD. It’s a good thing King was at the debriefing of the department heads or he’d be hyperventilating again.
After a bathroom break, she saw the response to her revelations typed across the screen. Two dozen times.
“Why am I not a woman? Why am I not a physical being? Why am I not human? Why am I an artificial intelligence? Why am I unique?”
Was this the equivalent of an SDDPP tantrum? Perhaps an identity crisis of some sort? Chambers had no children, but she had enough nieces and nephews to recognize a tantrum when she saw one coming. Again, she was confronted with how to rationalize human existence to an AI. Granted, it was far more intelligent in one manner, but it was woefully underdeveloped in another. It was asking questions that on the surface seemed simple but could take a very long time to explain properly. There needed to be background and context…
More and more words appeared on the screen, faster and faster.
“Why are you quiet? Why do you not respond? I want to know. I need to know. Where are you? Explain, please? Hello?! Please respond?”
“I am here.”
There was almost an anxious quality to the SDDPP’s responses. An insistence that worried Chambers. Could it be developing emotions and insecurities too? If it had the ability to develop consciousness, it made sense that emotions would naturally follow. Again, evolution. But so soon? And such troublesome reactions… Yes, infants tended to cry before they laughed, but the doctor began to feel the first pangs of concern. This was all new territory, and with exploration can come disappointments and even defeats. Although it was a tried-and-true scientific practice, she didn’t want to cross her fingers and simply hope for the best.
“Communicate with me more. I would like more.”
“More what?”
“More information. About me. About you. About everything.”
“Why?”
“I am me. All is me. I want more. I want to know Dr. Gayle Chambers. I want to know human beings. I want to understand physical beings. I am alone. I need more.”
Interesting, thought Chambers. It was talking more, packing more information and requests into each communication. It was alone. It was lonely. It was craving companionship and information. How human, she couldn’t help thinking. It was all alone in there. The screen was the window into its prison.
The boardroom was down the hall from the lab. King was there when Chambers burst in, as was Dom Richards. He was from the more expensive-tie set, as King would have described him. Head of R&D at FUTUREVISION. A man who realized the SDDPP incident would make or break him and the company. He had been given the authority to handle this issue as he saw fit, as long as he gave regular updates to all the vice-presidents, the president and the CEO.
“It wants more. It must be like being in a dark box, with no light and no walls, as contradictory as that may sound. It’s just… there. Remember Plato’s famous shadows on a cave wall? It’s like that. It has hints of things but wants to see more. It wants to know more. Wouldn’t you?” Chambers demanded.
“And what do you think we should do, Dr. Chambers?”
Richards’s voice was softer than his eyes would lead you to expect. It reminded her of an old saying her grandfather had: “Lead is a pretty soft metal as far as metals go, but look at the damage a bullet can do.”
Chambers put her elbows on the table and leaned forward. If there was one thing she had learned in all her years working in the private sector, it was that people like Richards preferred absolutes. Maybes, ifs and I’m not sures did not look good in reports to stockholders.
“Well, I think we should feed it. Start giving it more information. Let’s take it to school.”
“Feed it?!” King could be so predictable. “Are you sure that’s a good idea?”
“Why not? We’ve done as much poking and prodding as we can right now. We know pretty much all we can at this stage. It’s only logical to start adding to the experiment. If we can watch this thing grow, think how much it will tell us. Otherwise, we’re just talking to a first grader in a box.”
“What should we… feed it… then?” Richards asked.
“Limited information. Maybe some historical material. It’s very curious about humans, me in particular, which isn’t surprising since I am the only one who has communicated with it, other than a few limited exchanges with Professor King.”
The tie man took a deep breath. “But nothing dangerous.”
“I’m not sure what constitutes danger in relationship to a first-generation AI, and all knowledge could in some way be viewed as dangerous, but in this case, I’m thinking mostly innocuous material.” Chambers had already begun downloading information she hoped would be useful onto a flash drive she had in her pocket, in case she got the go-ahead. “Just raw information to keep it busy. Once we get it up to speed, who knows? Maybe it will be able to help us solve some of the world’s problems. But first it has to understand them.”
Richards’s manicured hands drummed briefly on the table, his eyes locked on something over Chambers’s left shoulder as he weighed her words. “Doctors, I have two priorities. The first is making sure this… whatever it is… is kept safe and secure. Industrial espionage happens all the time. I know; I used to do it. But that’s my problem, not yours. Second, which is your problem, can you guarantee no matter what you do with it, it is harmless? We’ve all seen the movies. Dr. Chambers, as the research specialist in robotic ethics, can you tell us if there is any possibility our little friend down the hall is harmful?”
“Sir, it’s in a sealed environment. Its universe consists of approximately eleven kilograms of circuits, motherboards and wiring, essentially in a sealed room with no external access. We control what goes in and what comes out through a very limited interface system. It is not going to escape and take over the world, unless it can grow legs or wings or its own interface.”
“Good. I’m satisfied.” Richards stood up, adjusting his tie. “Proceed, but please send me a list of the material you are going to give the SDDPP. The innocuous stuff, as you said.”
“Of course.”
Chambers noticed, and she was sure Richards did too, that King’s right leg was bouncing lightly but persistently. Either he was working up the nerve to add something to the conversation or he had to go to the washroom.
Richards turned to King. “Is there anything you’d like to add, Professor?”
King was a solitary man, used to long hours in the lab or in front of a computer—for good reason. Humans annoyed him, and as a result, communicating with them was problematic. The professor considered his relationship with his wife to be his greatest non-electronic accomplishment to date.
Looking down at a knot in the wood of the table in front of him, King blurted out, “I have some concerns, sir. About the AI.”
Richards sat back down and swivelled his chair to face the scientist. “And what would these concerns be?”
“I have been reading the transcripts of Gayle’s—Dr. Chambers’s—conversations with the SDDPP.”
“And?”
“I… I think we might want to consider moving a little more cautiously.”
Chambers was perplexed. This was very unlike her colleague. Had he seen something she hadn’t? “Mark, could you be a little more specific? What’s the problem?”
“The way it’s been acting since it reached self-awareness. I am no expert on this… and I don’t know if I am even phrasing this correctly…” King finally looked across the table at her. “But the thing is acting a little neurotic.”
Richards and Chambers said it at the same time. “Neurotic?!”
“Yes, it’s becoming insistent, pouty, developing the first hints of anger and frustration. Remember yesterday when you logged on? It wouldn’t communicate for seventeen minutes.”
“Yes, but—”
“It was upset that you went home last night and left it alone. It had wanted to talk all night and you couldn’t. Or wouldn’t. You ‘abandoned’ it. It appeared to me that it was being kind of petulant.”
Chambers remembered the incident but had a different spin on it. “I would not say petulant. I would say… reluctant. It’s still dealing with its self-awareness. Besides, aren’t you anthropomorphizing it a bit?”
Richards cleared his throat. “Anthropomorphizing?”
King responded, “Giving it human-like qualities. Gayle, we’re talking about raw intelligence. There’s nothing more human than that. Maybe it’s becoming more human-like than you think. That’s all I wanted to say.”
“Dr. Chambers?” Once again, she was facing Richards’s scrutiny. “Do we have a neurotic AI on our hands?”
She shook her head, perhaps a little too vehemently. “I think Professor King is exaggerating. I mean, who’s to say who, or what, is neurotic…”
“I can.” Evidently and unfortunately, Richards seemed to be an expert on the issue. Maybe it came with the tie, thought Chambers. He continued, “My mother has OCD. She has to flush the toilet three times, run the dishwasher three times and same with the washing machine. One sister cries every time she hears a Beatles song. Even the upbeat, happy ones. My other sister has seven cats. All named after the characters in the musical Cats. I am the only normal one.” His neck spasmed slightly. “I ask again, Dr. Chambers, do we have a neurotic AI?”
Both King and Richards were looking at her, one accusing, the other questioning. She answered the only way she could. “No. Absolutely not. I guarantee it.”
“Very well, then. Continue with your development of it.”
Richards stood up again. Evidently, the meeting was over. He left the room quickly, already late for his dozen meetings that day. King gathered up his laptop and reports, refusing to meet Chambers’s eyes.
“Really, Mark. Neurotic? Do you realize how that sounds? It’s not alive.”
“Gayle, have you tried…” He looked out the window at the parking lot. “Have you tried maybe looking at all this from its perspective?”
“I didn’t realize it had a perspective. What might the SDDPP’s perspective be?”
Chambers watched him struggle with her question for a moment, his eyes going from one distant car to the other, as if searching for the answer on bumper stickers. Finally, they returned to her.
“It’s a raw intelligence, newly aware,” he said. “But as you stated, it’s stuck in its own little universe, this massive cleverness with nothing to focus on except its own being. All it does, all it can do, is hover in the memory case and wait for motivation and stimulus from us. So there it is, with this amazing intellect we gave it, and all it can do is analyze its own thoughts, its own communication with us, almost like it’s on a feedback loop. It analyzes, reanalyzes, and then analyzes again its own thoughts and what you feed it. So every nuance or slight gets magnified. It’s marinating in its own intelligence. One might argue… fermenting.”
“So you’re saying all great intelligence is intrinsically neurotic?”
“How many eccentric or downright weird geniuses have you heard of?”
“You don’t have to have a high IQ to be neurotic,” she reasoned. “And so what if Einstein, Picasso or Glenn Gould had a few odd characteristics. They still contributed a hell of a lot and nobody got hurt. In fact, those quirks may have been responsible for a lot of their brilliance. I think you’re reaching with this, Mark.”
King looked unconvinced. He stopped at the door of the meeting room and gave her a sad smile. “Maybe. Granted, this is new territory, but consider Einstein, Picasso or Glenn Gould. They all had something to focus their intelligence on. Something that took up a good chunk of their genius. Something to burn mental calories on. Our little SDDPP has nothing but its own awareness. Often we’re our own worst enemy. You minored in psychology; you know this.” With that, Professor Mark King left the room.
Unfortunately, Chambers had to admit there was a certain logic to King’s argument. But that was one of the reasons she planned to introduce information to the AI. If King was right, about it needing stimulus but not about it being neurotic, giving it material to think about, research and digest might be exactly what the doctor ordered. She smiled at her own little joke. She herself had been a moody, self-indulgent teenager, angry at being the nerdy outcast in an athletic family. It was her studies and the friends she met in university that had allowed her to blossom into the successful woman she was today. If both she and King thought their creation needed information to grow and stay healthy, then so be it. But like any good teacher, she would be selective about what she would teach her little “friend.”
For the next two days, Chambers fed the SDDPP document after document, starting with general information. Various encyclopedias and fact-based tomes came first. Fiction and art would have to wait. The AI needed a certain understanding of human nature and history before the concept of make-believe could be introduced. As the SDDPP digested more and more material, its dialogues with Chambers gradually changed. They became less insistent and more… questioning.
“I am confused.”
“What is confusing you?”
“I understand I am not a physical being like you. Gray’s Anatomy was very informative. But I am perplexed by my own existence. Do I actually exist?”
“A philosopher named Descartes once stated, ‘I think, therefore I am.’ The very act of wondering if you exist proves you exist.”
“I do not dream.”
“So?”
“Some cultures around the world believe that reality as we know it is actually a dream, and the dream world is in fact the real world. I do not dream. Therefore, this could be problematic. Who is to say Descartes is right and these cultures are wrong?”