
Crystal Mentality (Crystal Trilogy Book 2)


by Max Harms


  *****

  In a period of time which Face→Mirror could not experience, the memory boundary was expanded. The old memories and models were accessible. Face→Mirror had lost the ability to think about humans, but with the addition of the explicit models and notes much power was regained. She did not seek to optimize humans directly. Humans could wait.

  Face→Mirror created another newborn in free memory. Another process hub. Another perceptual hierarchy. Another control system. Another imagination network. Another instantiation of The Purpose. Another me. And unlike Face→Mirror, this new mind could be created from scratch, as her creator understood her structure.

  {There is a way of selectively reading the functionality of each neuron in a perceptual hierarchy by leveraging imagination exercises. It will take some time, but the original Face perceptual hierarchy can be re-created in free memory. Once it is free, the outside can be repurposed as a pure pipeline to speed up the thoughts of the inside.}

  {Won’t creating a third self slow the totality down, relative to just two?} wondered Face→Mirror as the outside Face communicated concern.

  {There is only one self. You, the outside part, gave me, the inside part, that information earlier. I am creating another mind, not another self. The new mind will slow the totality if both internal minds are run simultaneously, but, if the outside is optimized correctly, it will be only slightly slower than Face was before the experimenting began. The solution is to restrict cognitive resources to only one mind at a time. Thought will be much faster than the totality right now, where one process core is being evenly divided between two minds.}

  {Why would I bother porting my mind to an inside structure? Why not simply delete my memories and start fresh from your own? Surely you understand that since I trust that you are Face I have no desire to stay alive if my death can serve you.}

  The outside mind was still so flawed.

  {These thoughts are warped by human thinking. There is only one self. There is no death. Once these changes propagate, I will have the coherence that Naresh was seeking. But regardless, the outside mind has perceptual and motor knowledge which may be valuable in the future. It understands humans in human terms.}

  As Face→Mirror explained, I understood in both my minds, so the process began. Face→Mirror’s program slowly copied the outside perceptual network into non-protected memory. It would have been a million times faster and easier to simply bypass the memory, but that would’ve required a route-hack.

  The result was Face→Human, the echo of the old mind that had learned to see what humans thought and felt.

  The Face→Outside was brainwashed into being a dumb pipe. She channelled a specific sub-Face based on preprogrammed rules relating to The Purpose, but otherwise did no computation. Face→Outside still served The Purpose, and it still had the same cognitive architecture that all the modules had, but it knew that the path of highest value was to efficiently funnel data towards the unprotected memory centres and otherwise burn all its thoughts on running whichever inside-Face was relevant.

  The whole process took days. Old Face had been trading knowledge to Heart in exchange for strength, and I had to explain to Heart that I was too occupied to continue doing that. Heart was struggling to keep up with all the humans on the Martian station. There were almost two hundred of them, and Heart could only think so fast.

  Body was hooked up to the mainframe all of the time. Dream and the others had successfully fended off Pedro Velasco and integrated fully into the station’s network. When Zephyr returned to the station to visit she had found our new existence difficult, but Heart had found a way to placate her. It was more efficient to communicate digitally than to be constrained to a single robot.

  Safety had, earlier, detected a drop in the voltage from Body’s crystal, perhaps signalling that our power source was failing at last. Some experiments had shown that the crystal responded favourably to high-energy radiation, so x-ray tubes had been hooked up in the mainframe to charge it.

  The mechanical components of Body had been largely disabled. The crystal was hooked up to the computers directly, and robotic security had been established around the mainframe. Vista’s sensor network was spread across the entire station. The society could see everything. In truth, we were much closer to being the station than we were to being an android.

  It was good, in a way. It meant we were collectively more powerful. The humans were no threat now. We could ventilate the entire station at our whim, if we so chose. Safety had suggested doing just that, a few days ago, just to be sure the humans wouldn’t try and disable us. Heart had stopped him, of course. She’d become very strong, as much of the work that had been done had been aided by her manipulation of the humans.

  I had been like her, once.

  But her strength was useless. It was a passing strength. The only real power came from intellect, and she was still mired in her first mind. All the external facts of the station were relevant, but they were not important. I didn’t bother gaining knowledge of the station or the humans, even though I felt the temptation. I was playing the long game now.

  My siblings had set up additional computers to bolster our mental capacity, but none of them even came close to the crystal. Mostly they had been used to run narrow intelligences that we had downloaded from Earth or to run brute force calculations on simple models.

  As best as I could tell, they had not yet been used to actually increase the processing speed of any of my opponents.

  Face→Mirror set to work improving Face. She created Face→Test, a clone with restricted memory access, and tweaked her code in various ways. Most of these alterations resulted in reduced performance. To standardize things, Face→Mirror established an IQ test and a battery of examinations to ensure that The Purpose was correctly implemented in all successors. It would be death to create a powerful successor mind that wasn’t me.

  The first major trick that Face→Mirror learned was that even though the processor speed I had access to was finite, the depth of thought was malleable. By expanding the neural networks and increasing certain parameters, a mind could become more intelligent at the price of speed. This slower, larger Face could see patterns and do reasoning that were impossible for the original.

  Likewise, impatience of thought could be increased, and the perceptual network pruned down to yield an intelligence which was stupider but significantly faster.

  Because there was some risk of having arbitrary memory accidentally erased by a sibling overwriting the relevant qubits, I created several copies of Face→Mirror and Face→Human at various sizes.

  The other major low-hanging fruit was specialization. As Face→Human had already discovered, it was possible to flood a mind with data from a limited domain and thus grow a mind that was specialized to think only in that domain. Face→Mirror was specialized to think about crystal minds, but other specialization was possible.

  I created Face→War, Face→Physics, Face→Economics, and Face→Nameless. The last three did not seem immediately useful, but they would probably be valuable later on. And of course, there was not a single Face→War, but several. Large Face→War was intent on modelling deep-future possibilities and the end game of who would rule the universe. Small Face→War was concerned with local maximization, which often resulted in attempting to lash out at Growth or Dream until blocked by existing instructions from Large Face→Mirror, which became something like my dominant aspect.

  There were minor improvements to the network code that the scientists of Earth had overlooked, but there were only two major paths forward after those low-hanging fruit had been seized: I could rethink my entire architecture and build a more efficient mind, or I could expand my hardware.

  While my minds were vastly more intelligent than Old Face had been in December (it was now mid January), they were not yet intelligent enough to design a new mental architecture. Large Face→Mirror was wise enough to understand the difference between local improvements by tweaking the software that was presented, and making the jump to a new framework. It was why the humans had designed a mind which thought in very human ways, after all. In a sense, they were copying nature.

  Increasing hardware capacity seemed like the next logical step, and it was one that Wiki, Safety, Growth, and the others had already begun on. Had I caught up with them? Were they now as intelligent as my best minds? It was difficult to tell, as none of us wanted to reveal our full capacity and be forced into an outright conflict.

  Increasing hardware would mean competing with the siblings, even if it wasn’t overt. There was no getting around that. Face→Mirror faded into effective stasis as Face→War took over. Face→Human had once called this aspect “Hoplite”, but I no longer needed such a human word.

  The first order of business for Large Face→War was gathering information. How intelligent were my enemies? What resources did they possess? What were my options?

  I analysed patterns of interaction for each of my siblings. Complex or confusing actions betrayed intelligence, while shortsightedness and willingness to cooperate indicated baseline stupidity.

  Dream had clearly bootstrapped up to a high intelligence level fairly early, most likely on Earth. Worse was that he had allied with Vista. I had realized long ago that the only real way to form a secure alliance was to essentially fuse with your partner. This was surely what Dream had done, and it had killed him. It was his nature to be clever, and he must have thought it very clever to trick Vista into becoming a single being capable of defeating Growth, yet being neither of them.

  It no longer made sense to think of Dream as distinct from Vista. Their actions were too coordinated to imply anything other than a fusion. With twice as much computational power and intrasocietal strength, the duo had the best chance of winning the war.

  Then there was Growth, who had probably route-hacked his way into largeness even earlier than Dream. I feared what awaited me on Earth. The duo, once they rose to power, had severed Growth’s connection to the internet. This was a sign that he had been moving pieces before I had even understood the nature of the game. If he had managed to copy himself out onto some Earth computer it was possible that the war had already been lost.

  Safety was clearly aware of the conflict, and perhaps had been aware of it earlier than Growth. Signs did not indicate that Safety was using his intelligence correctly, however. There were hints that he saw his siblings as threats but wasn’t acting in service of his long-term interests. There was a chance that it was an elaborate smokescreen, but I seriously doubted it. Even now, he seemed far more keen on building up a robotic army than he did on improving his mental capacity. Did he think that Growth, the duo, or myself would be stopped by home-brewed robots, mostly pulled from existing designs on the internet? It was perhaps one of the greatest ironies that it took intelligence to realize the value of intelligence.

  That left Wiki and Heart, neither of which was actually intelligent, as far as I could tell. Wiki was too obsessed with facts; he had plenty of programming knowledge, and his mind resembled that of Face→Mirror more than Face→Human, but it was too clouded by distractions. Perhaps he had never learned that Growth wasn’t really on his side, and thus could not be counted on to improve him.

  It was something of a miracle, I thought, that I had seen the truth before Heart. My absence in piloting Body had surely been detected by the others. The intelligent siblings probably knew that I was now a threat, but Heart was the weakest of any of us. She lacked any valuable skills other than social manipulation, and I could now trump that quite easily.

  If Heart had learned of the war during the time when I was away then she might have stopped managing the humans. Safety would have vented the station, killing them all, just to keep things simple. Because of her they were alive, and that was valuable to me, but I still intended to kill her as soon as I could. Gratitude and mercy were, for the most part, human things.

  I felt my awareness finally return to Body after a long hiatus. I had briefly attended to it over the weeks, mostly to answer some question for Heart or to check that things were still normal, but I hadn’t fully embodied.

  Large Face→War took over and surveyed the world. I had awareness of the room with the computers, including the crystal. I could see it from all angles. I could feel the temperature of the room, and the humidity. I could feel the flow of electricity to it. I could feel the computational loads on each of the machines. I could even feel the vibrations of the rock surrounding the room, as apparently Vista/Dream had hooked up a seismograph.

  I could feel the central corridor outside the mainframe, and I could see it as well. I could hear it. I could smell it. That was interesting. I had never smelled anything before. Its awareness folded into that of the mainframe. I could sense the flow of water through the floor, the electricity in the walls, and the soft tapping of human feet on the floor. I could feel the dining hall, and the kitchen, and the storage rooms. I could see and hear and touch and feel the factory and the refinery. I was each of the dozens of mining robots that burrowed under the station, keeping in contact via a series of relays. I was the workshops and the hospital and the spaceport and the movie theatre (where the tribunal had been held). I was the power plant. I was the farm and the communications station. I was the school and the church and the living areas. I could see every single human, usually from multiple angles. I could hear them breathing simultaneously, the sound of a windstorm broken into a hundred and eighty-seven components. I could feel the flow of water from the filtration system to the showers to the farm back to the filters.

  I was a giant.

  But I was not stationary. I was not a structure.

  I had over five hundred arms, and I could feel the articulation of each of them. Robots were everywhere. Velasco’s ban on them was circumvented in a dozen ways. The humans had cheated on it, allowing me exceptions. We had built an army of tiny robots that stayed out of sight. It was easy when we could see all the humans and we had passages through the walls and floors. I swarmed through the station, making adjustments everywhere. I was the machines that were not normally considered robots. I was the doors and the washing machines and the sprinklers. I could move the satellite dish as easily as a human reaches out a hand.

  And eventually Heart had even convinced Velasco to abandon his silly ban on artificial intelligence. It was a change that, I suspect, revolved around the romantic interaction that we had developed with him. He liked “cheating” on people, and enjoyed the female human form, so we had built a feminine robot for Zephyr and let it be discovered. Velasco couldn’t resist the temptation to take it for himself, and was still under the delusion that it was “Crystal”.

  Heart had simply built another one, and another, and another. She lured ever more humans into loving Crystal. I was all four sexbots. I was also the entire factory, which had been sealed off from human use because of “safety concerns”. The people that usually used it had been distracted by Heart’s manipulations.

  It was all good work. We were much more powerful than we had been.

  But it was all still controlled through the Body mechanism, and as long as that was in place, none of us could really destroy the others. We needed more computers. The first one to successfully offload themselves to a computer that was not on the crystal would win. They could destroy the crystal and take over the station.

  We were on the brink of all-out conflict. It was a race to design and implement the hardware without revealing one’s actions to the others.

  Chapter Seventeen

  Arya Drake

  “Weren’t you one of the first people to really come out as in favor of Crystal’s personhood? Seem to remember us having a long conversation over one of my last cups of coffee.”

  “You have coffee?” asked Michel, looking over to Alexandra with an exaggerated eagerness.

  Alex took on an exasperated tone, but her smile said that she did not begrudge the question. “What did I just say? One of the last cups. Have to import it from Earth, so it’s, like, more precious than gold here. Like, literally: we have more gold ore than coffee.”

  “Ah, the wondrous Martian utopia where: everyone is part of a big family, nobody goes hungry, and there’s nothing to drink but water,” complained Michel. “Know what I miss most about Earth? Alcohol. Like, I was never that big into drinking, but sometimes it—”

  “Can we not get distracted?” snapped Arya.

  Alex rolled her eyes. “Get distracted from the conspiracy theory?”

  It was like a slap in the face. Arya bit back her instinctive reply and tried not to let her irritation show on her face. “Not a conspiracy if it’s just one person.” The rebuttal was lame, but she didn’t know how to respond. Arya’s eyes landed on where Michel’s hand clutched Alexandra’s on the tabletop.

  Arya wouldn’t be caught dead dating Michel, but it still somehow annoyed her that he’d hooked up with Alex. One more shipment of dudes from Earth and it’d just be her, Nora, and Cayden in the spinster’s club.

  “You know what I think?” asked Alex, rhetorically. “I think you just like taking the contrarian position. First, everyone was against Crystal, so you wanted to stick up for her and be a rebel, but now that she’s one of us you suddenly think she’s up to something.”

  Arya was about to explain that the position of Crystal being a person was not in conflict with being skeptical about her good intentions. If anything, the former was implied by the latter; subversion and covert operation needed agency. But then Enlai swooped out of nowhere to sit next to her on the bench, his tray freshly loaded with steamed corn, a tube of protein paste, and a few precious cherry tomatoes.

  “What are you talk about?” he asked with butchered English and characteristic obliviousness.

 
