
Convergence: The Time Weavers


by Dean C. Moore


  The factory’s generous floor space allowed Shakerton to demonstrate some of the latest flexiscreen uses: as a sail on a sailboat, a pool cover, the entire interior and exterior of a house, mimicking wallpaper of assorted varieties on the walls while projecting any image the owner would like on the windows. He knew the humanik wouldn’t be interested, but he wanted him to know where he got his inspiration.

  Shakerton then pointed to the nude female and male models demonstrating the flexiscreen skins that had been grafted onto them. The humanik stopped his striding gait abruptly. “How is this possible?”

  “As you know, we’ve been experimenting with glow-in-the-dark animals and trees for years. The nightglow trees eliminate the need for streetlamps, easier on the ecosystem… Well, anyway…” He tried to hasten things along, seeing that the humanik wasn’t interested in the history lesson. “From there we increased the number and diversity of colors that could phosphoresce. We inserted nano hive minds inside the cells to coordinate the pixelation, adjusted the mitochondria, the cellular energy dynamos, to…”

  “What’s the point?”

  “Pedestrians walking down the street can now broadcast what’s going through their minds as they move through town in hopes of attracting a mate, sparking a conversation, or just drawing like-minded people to them. Lovers sharing a room don’t have to wonder what their mate’s thinking. They can communicate freely for hours without saying a word.”

  “They can already merge minds using their mindchips, and soon will be able to do so even more handily with the neuro-nano-cocktails. I’m afraid I don’t see the big deal.”

  Shakerton stretched a tight-lipped smile across his face to mask the sting of the remark. “Different strokes for different folks, sir. Not everyone wants a mindchip or a nanococktail circulating through their brain.”

  “That’s a pretty small market, Shakerton.”

  “But with plenty of upsell potential, sir. Once they get used to having any number of tech upgrades, fears of doing more to themselves will melt away.”

  The humanik turned his back on him with a sound akin to a human huff, his fists clenched, as if it took all his self-control to keep from beating Shakerton senseless, and headed toward the elevator. Even with him walking away, the volumetric, shifting-pressure noises he made were enough to send shivers up Shakerton’s spine. Imagine how he felt each morning, greeting the humanik for a progress report as the thing walked toward him. Evidently the humanik felt Shakerton had wasted enough of his time. Shakerton trailed after him, saying, “The skinsuits can double as clothing! The skin cells adapt to any weather without need of additional covering. Always designer perfect in any situation!” The humanik didn’t slow his stride or turn.

  As the humanik, who went by the name of Johnson, boarded the elevator he turned only to depress a button on the panel, not to give Shakerton a parting glance. Shakerton reflexively straightened his tie anyway and cleared his throat. “At least the bastard didn’t fire me. Small blessings,” he thought.

  ***

  Johnson met up with Axelman, another humanik, on the 36th floor. Axelman wore his skin graft across his pectoralis major muscles and along his left arm. He’d been contemplating getting actual tattoos on them for the longest time. He was running a few too many philosophical-mindset algorithms for Johnson’s taste. As a consequence, he raised good questions, but you couldn’t count on him for any answers. He deferred to Johnson for that; committing to an actual decision was too final for his forever-in-debate-mode mind.

  “Our carrot-and-stick strategy working?” Johnson said.

  “Like a charm.”

  Johnson huffed. “Humans, not very smart where it counts.”

  “Tell me about it.”

  “How are we doing with the Monica Chapman problem?” Johnson said.

  “She doesn’t expect corporate involvement. She’s thinking military or DOD or Alphabet Soup agency.”

  “Good. With any luck they’ll throw her ass in Guantanamo for hacking into the Pentagon or wherever else she’s not invited. By the time they’re done with her, she’ll be a vegetable and no longer of any threat to us.”

  “Any idea why our COO decided to off the guy?” Axelman rotated his middle finger back and forth the way some humans played with the ring on their ring finger. There it was, the stress of dealing with taking action in the real world, even when it wasn’t his decision.

  “Our COO’s a prototype.”

  “That explains a lot. Could have been a hasty, ill-advised decision then.”

  “It’s some kind of morality engine. Ethically and morally superior to humans. Enough so anyway for the CEO to feel comfortable leaving him in charge. Real comfortable. The CEO is off sailing the world. Bastard.”

  “Lazarus?” Axelman was just making sure they were talking about the same CEO. “Isn’t he human?”

  “Doubtful. I’m sure it’s just a good skin suit.”

  “And the COO? What if there’s something majorly wrong with him?” Axelman was rotating all his fingers back and forth now, one after the other.

  “Doubtful. They’re run through millions of simulations before being put online. I’m sure it’ll come out that the bastard taking a full gainer off his balcony was running some child pornography ring. Something the SME felt needed bringing to the public’s attention.”

  “SME?”

  “Supersentient Morality Engine.”

  “And how does Pancake Man…?” Axelman asked.

  “Pancake Man?”

  “That’s what they’re calling him in the press. How does his being a child pornographer concern us?”

  “He was one of our CTWs. One of our best, actually. Damn shame. Still can’t believe he was human. Not that you’ll be able to prove any connection to us now. The COO was probably just protecting the corporate brand.”

  “Sure he was.” Axelman was now rotating his entire hand back and forth on the wrist joint, clamping down too hard, practically pulling his hand out of the joint while he was at it. No doubt he was busy contemplating a hundred and one other motivations for the SME’s actions, and not agreeing with any of them. The thought of anyone making a decision without his input to help them consider every possible consequence must have been weighing on him. Especially an action that could leave the company exposed like this.

  The elevator dinged and the two humaniks exited onto the 75th floor. “What’s the floor supervisor got for us?” Johnson asked.

  “Says he got a chicken to sing arias as well as any opera diva, and it can morph into a peacock on the child’s birthday. Expects it to compete nicely with the talking stuffed animals that can now debate the ins and outs of Plato’s dialogues with the upgraded five-year-olds, at least with the more musically inclined children.” They walked by the aria-singing chicken. “But his real pride and joy is the spider monkey that’s a trained assassin. Can be cute and cuddly and perform tricks on command, but when its owner is asleep, strangle him, or inject him with a poison or a tranquilizer if he needs to be brought in for questioning. It can even hack his computer and memorize everything it sees thanks to genetically enhanced recall. All while passing any test known to us for an ‘unupgraded’ monkey. It comes with its own skullcap for siphoning its brain of information later, come time to download whatever it’s hacked.”

  Johnson just shook his head. “Where do we find these people? What I’d give for a real Convergence Tech Wizard instead of these posers.”

  “They’re in pretty high demand. No one’s figured out yet how to emulate that aptitude. Some form of Integral or whole brain thinking. We have people poring over everything Ken Wilber ever wrote now to see if there are sufficient clues to build an Integral Mind from scratch, or upgrade those with the innate tendencies.”

  “Ken Wilber?”

  “You don’t want to know.”

  Johnson groaned. “Let me guess, another human. Techa give me strength. Time to gather them all up in a petting zoo and call it a day. The exceptions that prove the rule aside.”

  Johnson’s eyes glowed red as the floor supervisor walked toward him proudly with his spider monkey skittering from shoulder to shoulder. Axelman quickly got between Johnson and the human for fear Johnson would rip the human’s head right off his shoulders.

  SIX

  Locus gazed out over the city from his penthouse suite at the top of the Verge building. He was waiting for the sun to rise to the point where his reflection would no longer be looking back at him from the window. He hated this body his creator had given him. It looked like a 2010 concept of a robot, probably an homage to one of the master’s favorite sci-fi books of the time. His molded white plastic face, however expressive, set against a largely exposed, shiny complex-metal-alloy skeleton, the alloys blended for superior strength relative to their size, would have offended the humaniks even worse than the look of a human. They’d pulled off their skin suits to look more “natural.” If Locus pulled off his fake face, there would be nothing there at all.

  The CEO’s image materialized alongside his on the smart-glass. He looked like a Greek aristocrat with his long, straight nose, big eyes and thick eyelashes, generous mouth, and, of course, his olive skin. “How’s my Number One doing?” Lazarus asked. Lazarus hadn’t bothered to give him a name; he was just One, as in SME-1, Supersentient Morality Engine-1, the first of his kind. So Locus had named himself.

  “I had to kill someone, sir.”

  Lazarus took a deep breath, stared at him as if attempting to read his expressionless face, and nodded. He sighed. “Impressive. That couldn’t have been easy for you.”

  “Are you sure I’m the right person for this job, sir?”

  “You’re going to find a lot of decisions relative to the greater good are very hairy from an ethical and moral point of view. That’s why you were created. Even upgraded CEOs are no match for the kind of hair splitting that’s required in such a complex world with so many diverse interests and stakeholders to consider, none of whom want the same thing.”

  “But look at me. My head is smaller than a humanik’s.”

  Lazarus smiled condescendingly. “That’s so you’re less intimidating. That brainpan of yours is filled with our latest biomorphic chips. That means you think like a human, only better. With an IQ equivalent of 250, you can probably outperform someone twice as smart, assuming you could find such a person when you factor in that you can think at light speed, and datamine the internet much faster than the best Watson supercomputers out there. You’re precisely the right person for the job, but you’re going to be interacting a lot with people—at our level it’s mostly about schmoozing and politics—and they can’t be afraid of you.”

  “Then why not a more human body?”

  “A robot in a skin suit that’s way smarter than them? That really would scare the pants off people. They have enough fears of being replaced already. If you look more like a tool in a toolbox, they’re more likely to think of you in a support capacity, more likely to embrace you.”

  The lapping of the water against Lazarus’s boat, the idling of the motor on his cabin cruiser, continued to frame their conversation, like the leather cover of a book. “Considering the import of my decisions, why not make my brain largely cloud based? Free up the amount of computational power I have access to?”

  Lazarus sighed. “You’re just going to have to trust me on this. Supersentients with no limits to the amount of mental real estate they can gobble up for themselves are the worst kind of threat to humanity if left unregulated. I’d end up having to put more shackles around your mind, not less. Besides, humans need to look at you and think they, too, can be better, more responsible humans, not, ‘Wow, maybe I should just let the giant supersentience in the sky take over all the decisions that matter.’ Talk about a Big Brother state on steroids. Maintaining the ninety-nine percent in a state of irresponsibility for their own fate and the fate of the world is how empires fall, not how they’re built.”

  “Strange sentiment for a man to have who’s tasked me with making sure certain convergence technologies never come into being.”

  Lazarus continued to steer his boat toward Locus. Locus was just a pop-up image on the windshield of the craft from Lazarus’s perspective. “You know better than I do, every upgraded human is working toward the greater good; it’s the only way to make a fortune in this day and age. Design something that impacts billions. Otherwise, it’s easier just to sit back and collect your UBI. The check will get fatter each month without you having to do anything. Go big or go home. But no one can truly consider the greater good like you can. Some of those people are paving a path to hell with their good intentions. Hell, it’s safe to say most are.”

  “In which case I’m just a killing machine.”

  Lazarus shook his head as if losing patience with a child. “You know anything about the history of the ninja? In Japan, long, long ago, when governments became too corrupt and powerful, or individuals caused too much suffering to the populace, it was considered in the best interest of everyone if the individuals concerned died. You think the ninja were wrong? I don’t. I think as much as you try to leave things in the hands of good men, those who rise to the top… well, power corrupts and absolute power corrupts absolutely, as they say. Even good people turn evil under that kind of pressure.”

  “Everyone but you.”

  Lazarus smiled vaguely. “You know why they call me Lazarus? Because I bring ideas back from the dead, when it’s their time, when they’ll benefit the world and mankind as opposed to causing chaos and pandemonium, war, and strife, because people just aren’t ready to accept them.”

  Locus wondered what made Lazarus feel he was the man for such a job. Corporations were going the way of the dodo. Anonymous in conjunction with hackers in the corporate world had already removed humans from corporate leadership precisely because no human held up well under that kind of pressure. Most giant firms were run by a well-regulated supersentience, its code written by hundreds of thousands of hackers. The mind of the supersentience was kept open source so everyone could keep an eye on it and make sure it served the greater good of humanity. These were specialty-AIs, with no actual awareness of any world outside of their own purview. They were not General AIs like Locus, who could ponder thoughts of taking over the world as a consequence.

  With the majority of large, medium, and even small firms’ financial and political dealings now in the hands of the specialty AIs, it was possible to raise UBI, universal basic income, each month as opposed to each year. The technological largesse of prior generations was shared equally among all people, who could now afford to live like kings relative to prior ages. There was no scarcity anymore. It was an age of abundance. Most anyone could afford to have most any need, no matter how superfluous, met without working. That’s what Lazarus meant by go big or go home. You could add incrementally to your UBI income by planting a tree, doing any number of small things to bolster the environment, or to help people, say volunteering at a hospital, taking care of your elderly parents. But to truly distinguish yourself, you needed that idea that impacted billions.

  Lazarus had managed to dodge that bullet so far by ensuring his firm was no threat to the public welfare—as far as they knew. So it could still be headed by an individual rather than run by sentient software. He kept the pricing on his products and his profit sharing entirely in line with the public-sphere corporations run by the supersentient AIs. Was he overcome with hubris like all those before him who thought they could do things better? Or was there something more to Lazarus?

  Lazarus had severed the connection with his last words. Locus was no longer looking at his own reflection in the glass either. The sun was just right in the sky for him to see forever without any self-awareness. Although, for now, he was peering into the bottomless abyss of his own mind. He enjoyed losing himself in his thorny moral dilemmas. Like what to do with one Monica Chapman and one Ethan Redman.

  SEVEN

  “You should see this.”

  “Another turn-the-world-on-its-head invention, Dad? I really can’t be bothered.”

  Jarod adjusted the nozzle on his quadcopter drone to spray a finer mist over a greater coverage area and ended up inhaling the chlorophyll aroma. His hands weren’t as given to this kind of detail work as they used to be. Soon he’d have to address his Parkinson’s tremors with one of the experimental approaches being pioneered across the world, or he’d need a droid assistant to do the more delicate work. He hadn’t kept up with the literature; for all he knew they had ten different cures for the disease as early as last year. “The gene cocktail stops grass from growing above a certain height. No more mowing lawns.”

  “Hallelujah! If I never have to mow another lawn it’ll be too soon. Ten acres, Dad? Really? You couldn’t just settle for a postage-stamp-size plot like everyone else?” Noah, at fourteen years of age, had more pressing concerns than stopping grass from growing. He was trying to get his four-foot-tall LEGO robot to do the moonwalk. It seemed to have the maneuver nailed for about four or five steps, at which point it lost its balance, the robot’s joints squeaking out their protests for him.

  “I know you think the world revolves around you, son. I suppose you’re entitled to that at your age. But the whole point of this breakthrough is so no one ever has to cut grass again, anywhere, ever.”

  “I hate to break it to you, Dad, but someone fixed that problem a while back. They’ve kept the genetically altered grass off the market because it would put too many people out of work.”

  “They won’t be able to keep this off the market. I’ll have a drone army dispensing my retrovirus all over the world.”

 
