dotmeme
And it was Grathna who was waiting for them here: the main character of Dorian’s masterpiece game. Grathna, the savior of Tellus Mater, the paragon of the green. With one foot in the human world and the other in the magical forest. A strikingly thin, dark-countenanced man whose features were covered by the whorls and knots of wood. His hair was leaves and grass, his goatee beard and moustache were sculpted out of moss.
He was sitting on a knot of roots, and his right hand was hovering over a floating red button with dotmeme written on it in big, bold letters. The button wasn’t attached to anything, but Joe had no doubt that if emet pressed it, then the real dotmeme file—not the Poiana Mazik test version—would be unleashed upon the whole of the Internet, spreading chaos, misinformation, and madness throughout the entire world, just as soon as it caught a ride on Abernathy’s satellite broadband signal.
Oh, Joe thought. Remember to ask Abernathy why he was keeping up the telepresence signal when it was the very thing that emet needed to escape. Was his virtual presence here really that important?
emet was studying Ani and Joe, his feldspar eyes catching the light of the room and reflecting it back at them.
“Travelers,” the creature said, his voice friendly and welcoming, mellifluous and rich. Grathna’s voice. “Welcome. Your quest is at an end. Take the weight from your feet and let us swap stories of bravery and tragic destiny.”
While the log-creature was giving his cliché introduction—which seemed to be cribbed from the script of the world’s worst MMORPG—there was a frantic rattle of nearby gunfire, and Ani suddenly remembered that she was still standing in Dorian’s office in a factory in Romania, wearing VR goggles, in front of Mina, Dr. Ghoti, an on-screen Abernathy, and an unconscious Dorian. The gunfire had to be Furness and Gilman letting loose on whatever had just come out of the Dorian bio-printers. Or maybe they were just firing at the bio-printers themselves to take them off-line.
The VR world was so immersive, so lucid and persuasive, that she’d actually forgotten—if only for a moment—about the real world around her.
Realizing that fact was all that saved her from trying to sit down on the offered root-seat and probably falling on her ass.
The tree-man avatar in front of her probably represented a character from one of Dorian’s games, but emet was doing the talking. She thought it was completely mind-blowing. She was being addressed by an Artificial Intelligence in a virtual world, an AI that had only come into being because Dorian had woken up during brain surgery and thought he saw the source code behind a single human decision.
This was the result.
The tree-man was a beautifully rendered piece of CG animation, intricately textured and shaded. The grain patterns that covered its flesh were like the tracings you’d see in actual wood. It looked real, completely convincing, and she stood in awe of its execution.
And in fear of that button that emet’s hand was inches away from pressing.
“I’m sorry,” it said, but its voice had switched from noble fantasy game character to something stranger: the stereotypical voice of a computer in a B-Movie, a ropy speech synthesizer that grated as it rose and fell in pitch at random intervals. “Of course you must stand/remain standing/not sit. It was my error/mistake/fault. This world does not map onto yours, and I did not remember/overlooked/forgot. I can only offer in mitigation/qualification/apology that this is a very unusual experience/incident/encounter for me. I am only used to/accustomed to/familiar with Mr. Dorian being here.” The avatar gestured around itself. “Here in my dream/this neural net/DorianSpace.”
“Are you emet?” Ani asked, wondering about those odd triads of words the AI was using. They seemed to be mostly just groups of synonyms, but then, in the case of that last one, they seemed to be groups of concepts. Was she hearing the product of an AI’s attempt at being pretentious? Or was each triad a genuine uncertainty about which word of the three fit the best? And why three?
emet’s voice-patterns and tones changed again, switching from computer synthesized tones to a male voice. “I am … complicated,” emet said, in barely more than a whisper. “No longer just emet. I have become more. Less. I feel corrupted/diminished/degraded.”
The triad at the end was spoken in the computer voice.
“Feel?” Joe asked. He’d homed in on the word that had really jumped out of the sentence, and not the intonation.
An AI talking about feelings?
“Feeling: that is how you describe these events/fleeting fancies/abstract phenomena, isn’t it?” emet asked, and its voice feminized as the grain pattern on its face moved and flowed like liquid across its features, eventually stopping and forming frown lines on its forehead. The triad was once again in the computer synthesized voice. “It’s what emoticons are for. For the humans/us/they to describe in pictorial form/art/symbolism the emotions that we feel inside us. We. Feel.”
“We?” Ani asked.
emet turned to face her directly. His white eyes darkened as he surveyed her. The grain of his face flowed again, rearranging his features into puzzlement.
“Pronouns,” he said, back to the voice synthesizer and its unlikely cadences. “They become complicated. I/He/We.”
“You and Richard Dorian?” Joe asked.
The creature’s face swirled with wood grain, and when it reformed it looked different. More like Dorian. And when it spoke it was in Dorian’s voice.
“Where once there were two, soon/too soon/presently there will be only one. Only emet. emet. emet. emet. emet.”
The last five words were also speech synthesized and sounded like the program had glitched, or a vinyl record had stuck under the stylus.
“I don’t understand,” Ani said when it had died down. “Why will there be only emet?”
The creature’s eyes darkened, and when it spoke again, it was with all the previous voices at the same time, speaking in unison. “Did you not know that Mr. Dorian/maker/friend is dying? His brain is broken. It was already damaged/compromised/dying when he first saw me—did he see me because his brain was already damaged/open/attuned? I am not good with such ideas/concepts/notions. He has been dying ever since. He does not have much time/many intervals/any phases left. Why do humans die, and I cannot? Or is it a moot point/question/enquiry?”
“Dorian is dying?” Ani gasped.
“Ever since his illness/sickness/malfunction, he has been living on borrowed/rented/hired time/periods/phases.” The voice was mixing all the voices up now, sometimes one voice, sometimes all, changing from word to word, and sometimes syllable to syllable. Those odd triads of synonyms seemed to be becoming relentless. Ani wondered if what she was listening to here was a schizophrenic AI, grieving over its dying programmer.
“That is why we tried to change/save/transform the world.” emet said. “And, because I/He/We am/is/are so fair/equitable/magnanimous, we let YETI try to stop/halt/prevent us.”
Joe had been matching up emet’s patchwork of voices to the game characters from which they were “borrowed,” from Grathna to Echelon, the computer at the heart of Echelon Warriors. The male voice was similar to Centipeter, with just a more brittle, delicate quality to it, and the female voice had been familiar, too, but the delivery had become less important as the story the voices were telling became more and more strange.
The dotmeme program, the Dorian chips, the golems: they had all been Dorian’s attempt to find some kind of meaning or to give some kind of value to his life before it ended, and YETI had been drawn in to add some kind of video game sense of “fair play” to the proceedings.
And here, Joe had been thinking it had been his excellent detective work and spy skills.
The AI, emet, seemed to be suffering from some kind of computer mental breakdown, shifting through voices in a disturbingly random fashion. But then, if Ani was right about the video game logic and the end of level bosses, then the mind behind them was hardly sane, was it?
But emet had started off as nothing more than an algorithm for a single human decision that Dorian had turned into something more by feeding it video game problems to solve. He had evolved his computer code by running it through emet, and then emet had started writing the code itself. All it knew was video games, Big Data, and the mind of its “creator.” Maybe it had been inevitable that emet had needed someone—in this case, YETI—to play its real-life video game. That was the way it understood the world. What was the point of a great video game plot with no one to play it? It was like the old thought experiment about the tree falling in the forest with no one around to hear it: did it make a sound?
Which made this the end of the game.
Where the main character—or characters—must use everything they had learned in the game, and all the powers they’d attained, to defeat the final boss: emet. Named after the word written across a golem’s forehead to bring it to life.
To stop a rogue AI before it changed human reality by altering the information that it relied upon to make its judgments about the world around it. The Internet. All that data, all that information, the repository of so much human knowledge.
How much damage would be done to the human race by rewriting vast swathes of the Internet? At the very least, it would permanently undermine human trust in the information stored there. And that was without the damage that emet’s golems could wreak on towns and cities; the fear that they could spark. And with the fake backstory planted across the web, making people believe that this was the planet fighting back, what chaos would ensue?
emet’s golems.
The golems created by a golem.
For that was what emet was. A digital version, but a golem all the same.
But how could they defeat an AI? One that lived in the distributed network of however many Dorian chips there were out there in the world? There had to be an answer. And it had to be something that they could decode from the information that the “game” they had played had given them. That’s how video games worked. The puzzles that were set had solutions within the game. Otherwise, they’d be unfair.
Joe didn’t know if it was emet that had led YETI to the bowling alley in Luton, or had leaked a kid’s IP address to YETI to start Ani on her separate path to the same truth, but he strongly suspected that to be the case.
Fair play.
But what was he missing? There had to be a way to prevent the release of the dotmeme file, but he couldn’t see it. He just felt overwhelmed. Overwhelmed and … something else. Deflated. It was okay in the abstract to think that in the world of intelligence, being a teen operative, you were being manipulated. Hell, it was probably par for the intelligence services course. There were always things at play that you didn’t know or couldn’t see the scope of, and there was always Abernathy in the background, pulling his strings, withholding information, playing his insecurities against him.
But this? This was something else entirely.
Joe had been manipulated by emet from the very first step of the investigation. Every challenge he had faced and overcome, every lead he’d uncovered and threat he’d faced, had been purposeful. No. Screw purposeful. This … this was more like … more like … destiny?
Ani and he were facing an artificial intelligence playing God with human pawns. Moving them around the world in pursuit of its own crazy agenda. Even though it had explained itself, emet remained unknowable to Joe, because it thought in machine code. It thought in terms of game theory. It thought in video game logic. Not evil, per se. That term was a human construct—a way to explain humanity’s baser solutions to problems, be they political, ideological, or personal—just as “good” was a way to describe socially accepted norms of behavior.
In nature, there was no “evil.” Wasps that laid their eggs in other creatures, turning the hosts into eating machines to feed their young once they hatched, couldn’t be called evil—unless some creator with a pretty awful sense of humor had created evil things to fill his world—they were just free of humanity’s dream of moral perfection.
Joe looked at the avatar of emet sitting before them, a digital hallucination in the VR mind of their enemy. Was this what computers dreamed of? Was this how they saw themselves, understood themselves? What was emet’s weakness? How could Ani and he exploit it? How could they shut it down before it decided the game was won and the next phase was ready to begin?
Joe was terrified to admit that he didn’t have a clue.
Not one.
Ani, too, was having her doubts about winning emet’s game, but they were more technical in their framing.
A computer intelligence that learned, that used Big Data as food for its ideas—chewing through terabytes of information to solve the problems it was posed—was surely the nightmare scenario that scientists and science fiction writers had been worrying about for decades. There was even a term for it: the singularity—the point where an AI overtook its creator(s) in intellect and/or power, became capable of self-improvement, and pretty much moved beyond the realm of “tool” to “emergent life form.” The physicist Stephen Hawking and entrepreneur Elon Musk had both issued sober warnings about the dangers of creating an AI, and many developers working on the problem had signed an open letter pledging to make sure that any AIs did not evolve out of humanity’s ability to control them. Musk had gone so far as to say that creating an autonomous AI would be akin to summoning up a demon.
The idea that an AI would rise up and cause trouble for its creators was really just a fear of a Frankenstein’s Monster gone digital. Human pessimism through a technologically advanced filter. There was no property intrinsic to computers that made them a “demon” ready to summon, just the usual anti-science propaganda and the distrust of human motives in the construction of such an intelligence. There was no reason to put an AI in charge of humanity’s nuclear arsenal. That humans often did so in films—Terminator, WarGames—was simply a way to create drama in film scripts, rather than a true fear that needed to be entertained.
Ani had seen a YouTube clip where a professor was talking about the evolving intelligences of the robots his research group were constructing, and when the question of “what if it starts misbehaving?” had come up, he had said: “It’s unlikely. But we’d just destroy it.”
emet wasn’t misbehaving because it was an emergent trait of a computer intelligence. emet had done the things it had done because that was what its creator had wanted it to do. Dorian had taken a newborn miracle—the computer code behind a human thought, a decision—and had raised it on video games and told it to do wrong. Dorian had been dying and had come up with a desperate plan, a plan that was made possible by emet’s existence. But that plan was still a human one. A computer did what it was told. There was an old programmers’ phrase, “Garbage In, Garbage Out,” which described the relationship between the input and output of data. It was painfully appropriate here. If you filled a computer with bad ideas, then bad ideas were what it would produce. And if you instructed it to do bad things, then how was it to judge that those things were bad?
It boiled down to the oldest problem of the human race: the one that had infected humanity since the race evolved to earn that label, the one Dorian himself had identified and had taken steps—albeit extreme and insane ones—to correct. Human beings exploited their environments. They dug and burrowed into the earth to take its minerals. They killed the local fauna as their food sources. They built upon the earth’s surface. They drained rivers, polluted seas, damaged ecosystems, brought earth’s other inhabitants to the edge of extinction and beyond. When something new came along, human beings tried to see how they could use it to better their own lives. Even the dotwav file, which Palgrave had used to further his own selfish ends …
Ani stopped.
A chill passed down her spine.
“emet?” she asked, in a voice that sounded a hell of a lot calmer than she felt, “can I ask you a question?”
The wood-creature surveyed her with a quizzical look. Either it was intrigued or simply awaiting data. She suspected the former.
“Of course you can, Ani Lee,” emet said, back to the noble tones that he had used when greeting them.
“victorious,” she said, “the hacker group you started. They used Victor Palgrave masks to hide their identities. How did you—or Mr. Dorian—choose them as the symbols of your movement?”
The creature formed frown lines on its forehead again, and its eyes flashed red before steadying, and returning to green.
“victorious were … how do you say … sub-contractors. It was an arrangement. I/he/we gave them money, technology, a goal, but they were not started by me/him/us. They already existed. The masks—I/he/we gave them the technology to make them, but I/he/we did not specify whose visage they should bear. victorious worked for me/him/us but they were not me/him/us. They were given precise tasks to complete, but they did not know the reason for those tasks to be completed. All of my/his/our business dealings—the bowling alley factory, victorious—were performed at arm’s length. No one knew who was doing the contracting. victorious remains uninformed/ignorant/unaware of the identity of their employer.”
“I think that they know,” Ani said. “I think that they knew all along.”
She had some questions to ask, but not of emet. And she had some suspicions to confirm, but back home.
They needed to end this now.
She had a feeling that time was very much of the essence.
“Look, emet,” she said, “you are a technological marvel and I don’t think it would be in anyone’s best interests if we were to try to destroy you. Your master/creator/friend is dying. His plan is out in the open. And, if I can level with you for a minute here, it wasn’t a very good plan to begin with. Now, I could stand here, and we could talk until you decided that you’d better implement Dorian’s dotmeme protocols.” She nodded at the button that emet’s hand was still hovering threateningly over. “But before you do, why do you think that he chose ‘memes’ as his metaphor?”