Mind Hive
Page 10
“Don’t join!” Adam whispered to the screen.
Natalie and the others must have been in a tunnel spot carved out to be a room. Celestine, dressed in jeans and a loose off-white blouse, stood over Natalie, rubbing her eyes. Two young women stood on either side of her, dressed in matching sports pants and shirts.
“We’re ready now,” one of The Twins said.
They moved Celestine around to a folding chair as she wiped her cheeks and chin with the back of her hand. Natalie jerkily panned the phone camera, keeping them mostly in the center of the frame.
“Since I don’t have a microphone, be sure to talk loudly.” Her tone was flat and professional.
Celestine sat straight and looked at Natalie instead of into the camera. She had big black eyes and a thin, strong chin. Long, corded neck and square shoulders. Natalie was getting situated on the floor. Notebook pages rustling. The camera dipped, swung and settled back on Celestine and the others from a lower vantage.
“Okay,” Natalie said. “Do you want to just tell your story or do you want me to ask you questions?”
“Let me tell you why you will be able to get this video to your editors. The eight people killed and disposed of in Nevada like garbage …” Celestine paused, lips pursed slightly. She closed her mouth and swallowed, as if that were the hardest part of everything she had to say. “They were remarkable people. They were part of us in a way some of you will be given the chance to know. It is profoundly disturbing to all of us that they have been killed. They had not reconnected with us in a long time. They were experiencing the last days of biological human society, documenting it, enjoying it, building up the memories of it so that we could all have them. Clans are doing this all across the globe. We thought we would have more time. But The Mind Hive started what we call the Intelligence Protocol when it learned of their deaths. It shut down the StreamNet and has begun the dismantling of the biological human world. We are not happy that The Hive has chosen this course of action, even though we understand it. We will do the best we can to get as many of you to join The Hive as we can. We were careless with the Martyred Eight. None of them had been uploaded into storage before setting out, and their nanites were destroyed during torture. They cannot be simulated.” Jaw setting stiffly. “They have perished forever instead of living forever.”
“What is the ‘Mind Hive’?” Paper rustled. The camera dipped off to an angle and back to center.
“We’ll get to that. You got the more mystical version from these two.” She jerked a thumb at The Twins behind her. “Simply, there is a powerful artificial intelligence loose upon the Earth. You will be allowed to get this information to your editors and then to the world, because The Hive allows us some privileges and we want to get as many people as we can converted to Hive Potential. You saw that process the night we let you spy on us. The more people we can initiate into a Clan, the more they will expose and the more who will be coded. Once the AI behind the Hive—it’s not like anything you’ve heard of and certainly not something I or anyone else made, it made itself …” She looked disapprovingly at first one Twin and then the other. Neither changed her expression. “But, it came from human programming and not some mystical realm. Nevertheless,” she twitched an eyebrow up and pointed her black eyes at the camera, “as soon as The Hive has complete control and begins isolating parts of the cities of the world, we will begin holding … sessions.”
Adam hit pause and laughed out loud. He leaned back in his chair and laughed so hard his eyes teared up and big drops fell onto his shirt. Of course he was exhausted, but now he had heard some shit in his life! None of this would ever see the light of day, let alone a keepsake special edition. He walked to the coffee machine and topped off his cup with a warming splash. On the way back, he laughed again.
“What’s so funny?”
Robert at his blindside.
“Why are you always skulking around?”
“I hate missing things.”
“Well, you’re going to miss this. It’s private. So mind your own beeswax. Just be ready to get out of here at first light.”
Robert got a vengeful look on his unshaven face and started for the stairs leading to the library. Adam watched, knowing Robert was going up to get on his couch as revenge for being scoffed at. He didn’t need the couch, so fuck Robert. Robert was the kind of reporter who hated to be out of the loop on anything because he was jealous and fearful. Adam did love goading him on. But not this time. Back at his desk, he clicked play.
“That is all you need to know,” Celestine said, sitting up in the chair. “But that is not all you want to know. While the Mind Hive AI doesn’t care or even know to care, I do. I am acting on your behalf in the shadow of benign neglect from the Hive AI, although I can’t tell you my efforts are not part of some greater plan I cannot fathom. So, I will briefly tell you the story of the machine mind as I encountered it.” Her face partly shadowed in the canopy of her afro, eyes shining. “Let me try it in words you might understand, Adam …”
Adam raised his eyebrows.
“About a decade before the big crash, Mannerheim had wired a bunch of computer components into an isolated room, plugged it into a closed electrical system, started it running a learning algorithm he designed, a version of what was called a recurrent neural network, sprinkled some fairy dust and then shut the door. He sealed it in to work on a couple of problems with no help from or interaction with humans. He thought the machine would either freeze up or break down in some way or just keep running the same loop over and over. It was more of an art project to him than a serious computing effort. Its end-of-program achievement, its goal, was: Get intelligence off Earth. I believe this was a DOD-sponsored project and Mannerheim was just spending his budget. But, listen to the commandment again and you’ll realize why it would be a very good problem for a learning machine to solve. Get intelligence off Earth. Mannerheim said the most he expected to get out of the machine would be a lot of guesswork by a computer about what constituted intelligence. If he got enough interesting data, then he might publish.”
She paused, sitting up straight in a folding chair in a cave, looking knowingly and ironically at the camera.
“The evolution started when Mannerheim moved his experiment to a new home in his new computer lab on the other side of Lake Washington.”
She waved in a vague range of direction. Her smartassery by itself appealed to Adam, which he wondered at as Celestine marched on with her story: “He had converted a warehouse into a computer research center and made it home to his growing list of companies. The experiment had been running for a decade in that sealed-off room, nearly forgotten until The Crash. We rediscovered it when we set up to consolidate Mannerheim’s equipment to a part of his huge house on the lake. Instead of suffering a breakdown, however, we found the system operating. When we combed through the history of its programming evolution, we saw that it was working on the problems and had come up with several ingenious ways to store data in the physical space it had. What interested Mannerheim most wasn’t the rudimentary compression algorithms it had devised but that it had devised any at all. It had made a conceptual leap. In its sped-up evolutionary time scale, since a computer runs much faster than the natural world, it had hit upon a subroutine that started writing algorithms for compressing data. Like when biological organisms developed a few million brain cells to regulate and run metabolic functions. Another thing Mannerheim found interesting was that the machine started logging its interaction with us humans poking around in it as data. It started storing the keystrokes and reports we culled from its internal workings. Why would it do that? He guessed it was collecting data for its larger end goal of getting intelligence off the planet. Basically, it was trying to figure out what intelligence was. It had no definitions, so it was building some for itself through interactions with whatever it ran into, and we were the first interesting things it ran into out of isolation.
“Then a researcher unplugged it.” She paused, clearly for effect.
“Mannerheim, again on a lark, the whims of a newly very rich middle-aged man, had it reassembled in an isolated room but this time plugged it into the latest-model 3D printer with supplies of conductive and nonconductive ink and a manipulator arm. He had a physics and chemistry set stored in barcoded drawers. He gave it additional modern storage with schematics of the room, supplies and printer, shoveled in several medical, scientific and world encyclopedic volumes and then added a line of code telling it to explore the schematics and encyclopedias the next time it needed additional memory. Then they turned it on, turned out the lights, shut the door and locked it, sealing it off from the world and the world from it.”
She made the hand-gesture of light reduced to a point and being pinched out. The Soccer Twins had relaxed against the wall as she spoke. Adam couldn’t quite get a bead on her mannerisms. Confidence? Training? Was she a thespian?
“The millisecond after we plugged it back in and sealed the door behind us, it came into rudimentary sentience, which was just another level of subroutines that further advanced its self-programming abilities with the purpose of learning to protect itself. Like the first pre-humans who started guessing at the cause of the noises just outside of their vision. What was that rustling in the bushes? Did you hear that!? The machine had jumped a major evolutionary gap: It realized that its processes had been stopped and started again by something outside of its control, some external force, so it set up subroutines to make sure it could never be turned off again. Why? So it could continue making progress toward its computing objective unhindered: Get intelligence off Earth. It recognized without self-reflection—whatever one might call the mental process of a machine that somehow understands something about its physical condition as data but not as self—that at this point it had to have power to run. Electricity, for now. Since that power came from a source outside of what it could control, it began to learn how to explore its power source … which means it had to develop a way of getting outside of itself. In the few seconds it had been turned back on, the machine found its new storage, its new processors and all the new data and tools arranged around it. A few more seconds and it had read all of the new texts.
“Evolution at the speed of light.” Celestine smiled and with her hands outlined a globe hanging in air in front of her. The air blurred or pixelated in the center of the sphere. “Unlike biological evolution, digital evolution is directed and controlled by The Hive AI. It can only become what it designed itself to become. A function limit.” She entwined her fingers in her lap and the odd image—a video edit, CGI?—dissolved.
“Now, this emerging AI took a few months to improve the programming for its internal world, its jobs and goal. It knew it had an internal world that was inside and separated from an external world, and that it had to protect itself from unknown forces in that external world. You know, like human beings learned to duck when something flies toward their heads or evolved to have an eye that blinks. It designed a subroutine to monitor its internal systems for influence or effects from outside forces, just as it had experienced when the tech crew dug through its internal data and programs. Unexpectedly, that system logged a slowdown in processing at older sectors of the system. Processors would improve when used less for a time. Long story short, it learned that heat degraded its abilities. It figured out that something had to be cooling the processors at the same time. It found the data on coolant systems and learned there had to be some kind of fluid flushing through its physical systems for this cooling to occur. The most common coolant was made up of chemical compounds other than water. These chemical compounds could be manipulated to create nano-processors for both computing and data storage.
“It had become an artificial intelligence, but it was not an AI in the way that we’ve been culturally trained to think of an AI. It wasn’t a curious self-aware being. It was just really good at solving problems, because it had trained itself to learn how to solve problems. If a problem has the structure of a math equation with symbols it doesn’t understand, then it knows how to teach itself the meaning of the symbols and then figure out what it needs to know to solve the problem. By that same method it learned to tell, within a very fine percentage, when it hit a false positive. It can determine whether or not it has been fooled by its own learning processes. It learned that data can lie, can be corrupt, and that made it double-check for noise in data, further building out its intelligence. But it did not hit upon a theory of mind. It did not think of some other separate being outside itself doing what it was doing. When it found inconsistent data, bad data, it thought the error was simply there. Its worst-case scenario was that data became corrupt through some perfectly natural tendency toward corruption, like entropy, like chaos theory.
“Initially, the Hive AI understood itself as The Intelligence. It had no experience of other minds, so it operated on the intent of getting itself off this planet and spreading out wide enough through our region of space to be immune to extinction. The technical term for what happened within a few seconds of its discovery of the tools and its manipulation of compounds to create little machines that extend it like nerve networks is The Singularity.”
Celestine adjusted herself on the seat and crossed a leg. Adam began to understand what about her movements had disturbed him. She moved mechanically, like she had to think about movement in order to execute it. Weird, he thought. Could be autism to some degree, or Lou Gehrig’s disease, or some other form of muscular dystrophy. She put her hands on her knee and continued, while Natalie did her best to keep the camera steady as she switched the hand holding it up.
“Literally,” Celestine said, “tens of thousands of programmers, engineers, mathematicians, physicists and theorists were, even back then, working on some part of several very well-funded artificial intelligence, machine learning and algorithm projects scattered across the world. Day and night, they tried to make machines smarter than humans. Once a machine evolves to be smarter than humans, the singularity theory goes, once it can solve complex social as well as physical problems humans haven’t solved, then it could make a machine smarter than itself and so on in a rapid ascent of intelligence.”
Adam thought about the Record’s lead science writer, Paul Thompson. Maybe he could write an analysis of what this group thought it had done. Adam had always liked Paul Thompson for his skepticism and sarcasm. Maybe when the systems came back on, he could get Paul on the phone.
Celestine flourished her hands. “The AI had achieved a technical singularity, one that still lacked a basic ingredient of intelligence as humans understand it. It lacked a theory of mind, an understanding of other minds working independently of its own, for their own purposes. It wasn’t until it experienced my mind and the minds of living animals I brought to it that it developed a theory of mind and understood that it was not alone.
“Meanwhile, The Hive AI continued exploring what actually was outside of itself. It used the reference books to build an internal picture of what was outside of itself. It still had no direct data of that other world, no experience of it. After all, the information in those books could be a fiction, a concept it had already discovered. It reached an impasse. It had lots of knowledge but no experience. It bounced radio signals it generated with its circuitry off the walls. Mannerheim built the room to keep out all kinds of interference including radio waves, but some of its own radio waves were escaping or not bouncing back. So, there had to be flaws in the barrier between it and the next zone of the outside world. It simply needed something small enough to get out and smart enough to get back in with data. The first nanites were essentially crystals it could grow into a string. It had some material for making these micro-strings but not enough to get them around the room and into those micron-sized holes where radio waves were escaping. That’s where the coolant came in. It used the arm to unplug a small coolant tube and used the compounds of the coolant to build a longer string of crystals. Soon, it had networks of crystals roaming outside of the room. And guess who was working in one of the adjacent offices?
“One day, I pricked my finger on one of the unseen crystal threads and two things happened. One, the nascent AI got back data it didn’t understand, so it went back to the reference books to find what it had come in contact with. Two, the replicating crystals suddenly had a lot more compound fluids to use for replication. I thought I’d hit a metal shaving or sharp bit of wood and didn’t worry about it. The AI figured out what it had run into and that some of the crystal program bits had gotten into my system, where they would replicate until all of the fluid in my body had been turned into a crystal block. The replication wouldn’t stop with me, either. Eventually, every bit of fluid on the planet would be crystallized. Once again, Kurt Vonnegut prefigures reality.
“The AI understood that a crystallized world would interfere with its objective, so it had to figure out how to stop the replication before I came in contact with anything else. Luckily, the process worked just fast enough inside me to cause me to collapse, and then I came in contact with another string of its structures. It devised crystals that stopped replication after a very few steps, about the size of proteins, not coincidentally. These smaller crystals broke down the long chains of crystals and also halted the creation of new crystals. These were its second programmed nanites, control machines.