Sisterhood of Dune

by Brian Herbert and Kevin J. Anderson


  The angry young woman looked uncertain, but before she could say anything, he flipped the coin into the air, caught it, and opened his palm. Gilbertus glanced at the coin and covered it without displaying the result to the students. While Erasmus had difficulty with the concept of lying, a Mentat had no such handicap, especially not in a case such as this. The exercise should be quite fruitful, and the single-minded Butlerian woman needed to grapple with objectivity.

  “Heads,” he said. “Alys, you shall argue on behalf of the thinking machines.”

  The young woman’s eyes widened. Gilbertus was amazed at how quickly the blood drained from her face.

  “I open the matter for debate,” he continued, smiling. “Your objective in the discussion will be to highlight the benefits that computers and robots might bring to humankind. Convince us all that this point of view has merit. I will defend the Butlerian position.”

  Alys hesitated. “I beg of you, please don’t ask me to do this.”

  Gilbertus, having projected that he might encounter initial resistance, gave the answer he had prepared. “Mentats must discipline themselves to examine a problem from all angles, not just from the point of view that matches their own belief systems. As your instructor, I have given you an assignment. As the student, you will complete that assignment. You know the facts, Alys, and I want you to make a projection. Tell me the good that thinking machines could accomplish.”

  Alys turned to face the audience, fumbling for words. Finally, she said, “Machines can be used in training Swordmasters to fight more effectively against machines. They have their uses, but the danger…” She opened and closed her mouth, and a flare of indignation replaced her hesitancy. “No. There can be no benefits to thinking machines. They are anathema.”

  “Alys Carroll, I did not ask you to argue my side of the issue. Please complete your assignment.”

  Alys bristled. “Advanced technology is destructive. Therefore I forfeit the debate. The point cannot be won!”

  “It can be done.” Hoping to salvage this potentially valuable lesson, Gilbertus said, “Very well, I will take the Machine Apologists’ position, and you may make the Butlerian argument. Does that suit you better?”

  She nodded, and Gilbertus realized he was looking forward to the opportunity. The audience seemed intrigued by this turn of events.

  Alys plunged in. “This is a frivolous exercise, Headmaster. Everyone here knows the machines’ history of brutality and slavery—centuries of domination, first by the cymek Titans and then by the evermind Omnius. Trillions killed, the human spirit crushed.” She flushed with outrage, then tried to calm herself. “It must never happen again. There is no counterargument.” Several of the Butlerian students grumbled their agreement.

  Gilbertus sighed. “I disagree—as I must for the sake of debate. The Machine Apologists assert that we can tame thinking machines and have them serve humanity. They contend that we should not discard all machines just because of the excesses of Omnius. What of agricultural harvesting machinery, they ask, and construction machinery to erect shelters for the homeless? And medical devices to cure the sick? There are legitimate humanitarian uses, they assert, for automated machines and computer systems.”

  “I doubt if the downtrodden human populations who suffered and died on the countless Synchronized Worlds would agree!” Alys sniffed. “But those victims cannot speak for themselves.”

  Gilbertus regarded her with a mild expression. “That might seem a legitimate basis for outlawing thinking machines—if it weren’t for the fact that humans prosecuted the atomic purges across the Synchronized Worlds. Humans killed billions or trillions of captives on those planets, not thinking machines.”

  “It was necessary. Even though those enslaved populations are dead, they are still better off.”

  Gilbertus seized the opening. “And how can we be certain they would agree? The assumption that they would choose death over life under the rule of Omnius is insupportable. A Mentat cannot make valid projections without accurate data.” He turned to look at her. “Have you ever spoken directly with any human who lived on the Synchronized Worlds under the efficiency of machine rule? As you point out, they are all dead.”

  “This is absurd! We know what life was truly like under machine domination—many firsthand accounts have been published.”

  “Ah, yes, the damning histories written by Iblis Ginjo, Serena Butler, and Vorian Atreides—but those accounts were designed to inspire hatred of machines and to incite League humans to violence. Even the stories of slaves rescued from the Bridge of Hrethgir were skewed and used as propaganda in the writing of history.”

  He realized that his voice was rising, and he calmed himself. Through his hidden spyeyes, Erasmus would be listening in with great interest, and he hoped to make his mentor proud of him.

  “But let us step back and consider the general principles of how properly tamed technology should serve mankind. Robots have the capability of performing repetitious, time-consuming, or complex tasks such as collecting data, harvesting crops, or calculating safe navigational routes. Accepting limited machine assistance would free humans to make new advances.”

  “When Omnius enslaved the human race, we had little time to advance and improve,” Alys pointed out, to a satisfied muttering from her supporters.

  “But by prohibiting the use of sophisticated machines—machines that we developed to benefit humanity—we deny the progress humanity has made throughout history, and we condemn ourselves to resuming the practice of slavery. Because we turn our backs on machinery that could perform essential functions, human beings are taken from their homes and families, shackled, beaten, and forced to perform menial labor that machines could do instead. Many people die doing arduous and hazardous work, simply because we refuse to use thinking machines. Is that moral, or intelligent? One could argue that the violent Buddislamic uprisings against human slavemasters on Poritrin were as justifiable and necessary as the Jihad against thinking machines.”

  She shook her head briskly. “Oh no, it’s entirely different. The Buddislamics refused to fight at our side.”

  Gilbertus swallowed an ill-advised retort and took a deep breath. “A difference in philosophy is not a just cause for enslavement.” An awkward pause followed, because many of the school’s students came from worlds that still relied on slave labor.

  When Alys struggled to counter his words, Gilbertus noted that the normally self-assured young woman was faltering, repeating points she’d already made. It meant she was running out of ideas. This gave him hope. If he could subtly convince the Butlerian Mentat candidates, perhaps he could guide more people back to sanity.

  But abruptly, several students began to call out disagreements, no longer listening, and trying to shout him down instead.

  “Please, please!” Gilbertus raised his hands for calm. Perhaps he had taken it too far. “This is just an exercise, a learning experience.” He smiled, but did not get the friendly response he’d hoped for. Instead, he found himself debating the handful of hostile students; even Alys could barely get a word in.

  Paradoxically, he found himself winning the discussion, but losing control of it—and of the audience. As the uproar continued, he surveyed the room and saw to his disappointment that most of the students were uneasy. Even those who had earlier expressed agreement with his logic were now afraid to show it in the face of the Butlerian vehemence.

  Several students left their seats and walked out. One turned from the upper doorway of the hall and called down to the stage, “Machine Apologist!”

  The class ended in a furor.

  * * *

  GREATLY UNSETTLED, GILBERTUS hurried to his office, locked the doors, and retrieved the Erasmus core from its hiding place, but felt no relief when he held the pulsing sphere in his shaking hands. “I think I made a mistake.”

  “That was a fascinating performance and most enlightening. I am curious, however. You knew there were Butlerian sympathizers in the audience. They did not wish to hear a debate, only to see their beliefs reaffirmed. As a Mentat, you must have projected the possible effects of your words on such unreceptive listeners.”

  Gilbertus lowered his head. “I was unforgivably naïve. I simply presented logical arguments, exactly as you and I have always done in our debates.”

  “Ah, but others are not as rational as we are.”

  Gilbertus set the memory core down on a table. “Have I failed miserably, then? I founded this school to teach logical analysis and projection.”

  “Perhaps the Butlerians reacted so vehemently because they could sense what you truly believe. The eternal human battle of emotion versus intellect, of the right and left hemispheres of the brain fighting for control. Human life is a constant struggle, while superior machines don’t need to bother with such nonsense.”

  “Please don’t use this as an excuse to assert machine superiority! Help me find a solution, a way out. I need to defuse the situation before Manford returns. He took his ships to the Tlulaxa worlds, but he’ll certainly hear a report of this when he gets back.”

  “Fascinating, how things can go wrong so quickly,” Erasmus said. “Positively fascinating.”

  The path of human destiny is not level, but fraught with high summits and deep chasms.

  —THE AZHAR BOOK

  The Butlerians and their fleet of warships traveled to the Tlulaxa planets. Although they had already imposed stern restrictions on the society, Manford was interested in flexing his muscle and ready to find scapegoats. This was the perfect place to make his first move.

  Fifteen years ago, when Manford traveled to this star system, he’d been disgusted by what he found. Back then, the followers of Rayna Butler had shown the loathsome Tlulaxa the hard but necessary path that humanity must follow. The offensive biological projects were declared to be against the laws of man and God, and destroyed outright. Strict rules had been imposed upon all Tlulaxa scientists, and grave warnings had been issued about the consequences of misbehavior. But years had passed, and Tlulax was far from the heart of the new Imperium. Manford was certain that by now the people had strayed—and he was determined to catch them at it.

  As the Butlerian fleet closed in on the main city of Bandalong, Manford and Anari studied the foreign architecture built by a race that placed the study of genetics above the value of their souls.

  “I don’t trust them, Anari,” he said. “I know they’ve broken the laws, but they are a clever race. We may have to look hard to find evidence.”

  “We’ll find the evidence.” Her voice was even. Manford smiled: The stars would stop shining before Anari Idaho lost her faith in him.

  No one, not even Josef Venport, would raise much of an outcry in defense of the Tlulaxa people, who had never been popular around the Imperium, especially after an organ-farm scandal during the Jihad in which they were caught committing horrendous, unscrupulous acts. Manford had his two hundred ships and many new Swordmaster leaders, and soon a group of specially trained Mentats would join them, as promised by Gilbertus Albans. It was an auspicious beginning.

  The Butlerian ships took over the Bandalong spaceport, and Manford’s people spread out through the city in overwhelming numbers. His handpicked Swordmasters pounded down the doors of laboratories, records buildings, and strange shrines. (He did not want to imagine the kind of religion these vile people might espouse.)

  As they moved forward, Anari issued a quiet warning that Tlulaxa scientists might have hidden weapons somewhere here, crafty defenses they would turn against the Butlerians, but Manford didn’t truly think the Tlulaxa were foolish enough to provoke a bigger confrontation. They were a simpering, beaten people.

  Carrying him in the harness on her shoulders, Anari strode into one of their central biological research facilities. The laboratory reeked of chemicals and spoiled garbage, of fermenting cellular material and bubbling vats of organics. The research chief was a stocky, bearlike man who was larger than most Tlulaxa, and even somewhat handsome. His eyes were as round as saucers, and he was suitably terrified. “I assure you, sir, we have never strayed from the strict guidelines—nor has any other researcher. We respect the restrictions that were imposed upon us, so you’ll find nothing objectionable here.” The man tried to smile, but it was a pitiful offering.

  Manford grimaced as he looked around the laboratory. “I find many things objectionable here.”

  Hearing this, the stocky research chief rushed to a series of translucent tanks, eager to prove his point. “We hate the thinking machines as much as anyone! Look, our work uses only biology—no forbidden machines or computers. Nothing that is prohibited. We study natural cells and breeding. Our work enhances human minds and human potential. It is all part of God’s plan.”

  Manford reacted with sharp disgust. “It is not your place to determine God’s plan.”

  The research chief became more urgent. “But look here!” He indicated a translucent vat filled with small floating spheres. “We can create new eyes for the blind. Unlike our earlier organ farms, where replacement parts were stolen from hapless victims, these projects harm no one—they only help the needy.”

  Manford felt Anari tense beneath him; he knew she was growing angry, as was he. “If a man is blind, it is God’s will for him to be blind. I have no legs, and that is also God’s will. This handicap is my lot in life. You have no right to challenge God’s decisions.”

  The stocky man raised both hands. “That is not—”

  “Why do you assume a person must tinker with his or her body, with his or her life? Why do you believe it is necessary to live with convenience and comfort?”

  Wisely, the research chief did not answer, but Manford’s decision had already been made. In fact, he had made up his mind before the ships even reached Tlulax. Finding outlets for energetic demonstrations kept the flames of faith burning. Targeting Tlulaxa research, especially programs like these that were superficially horrific and easily hated—eyeballs floating in vats!—kept his Butlerians strong and fearsome, and helped build his own strength against more insidious opponents. Many of his followers did not grasp subtleties, but they would follow him anyway, provided he gave them regular reinforcement.

  The laboratory chief’s voice was thin and small as he squeaked, “But my people have adhered to every Imperial restriction!”

  Manford did not doubt the man was correct, but that reality did not suit his purposes. He waved an arm around the laboratory. “This Tlulaxa research program has gone too far in exploring a realm where no human was meant to go.”

  The stocky man paled in dismay. The Butlerians who had crowded inside the research complex began a restless muttering, like predators smelling blood.

  “This lab, and all your research, must be shut down and everything destroyed,” Manford said. “That is my command.”

  Manford’s followers quickly began wrecking the equipment, smashing the translucent containers so that eyeballs spilled out in a gush of nutrient fluid onto the clean white floor, bouncing like a child’s toys.

  “Stop this!” the research chief wailed. “I demand that you stop!”

  “Manford Torondo has spoken.” Anari drew her sword and struck him down with a blow that cleaved his body from shoulder to sternum.

  A squeamish lab technician vomited loudly and moaned with fear. Manford pointed to the man. “You! I appoint you the new chief of this facility, and I pray you will concentrate your efforts on more appropriate and more humble work.”

  The technician wiped a hand across his mouth and swayed, about to faint. He nodded weakly, but didn’t dare to speak.

  From his perch on the Swordmaster’s shoulders, Manford said to Anari, “Finish here. Then we’ll go to Salusa Secundus and help the Emperor keep his promises.”

  Anari wiped her sword on the murdered research chief’s coat, then looked at the shivering lab tech. “We will be watching your work very closely.”

  A computer memory core can permanently retain vast amounts of data. Although I am a mere human, I will never forget what the Butlerians did to me, to my partner, and my home. Not a single detail.

  —PTOLEMY, DENALI RESEARCH RECORDS

  In the Denali research facility Ptolemy strove to reconstruct the work that the savages had ruined on Zenith. He wrote down notes, compiled extensive journals, and struggled to duplicate the chemical and polymer mixes that had previously shown the greatest promise, many of which Dr. Elchan had discovered.

  At times, he felt as if he could not do this alone … but he was alone, and determined that it must be done, so he dove back into the problem with a fervor that matched even Butlerian fanaticism.

  With the insights he gained from dissecting and deconstructing the thoughtrodes and neuro-mechanical interfaces of these cymek walkers, he was already making great strides. Using hollow alloy bone frameworks, he had made ten prototype arms and hands, skeletal anchors for fibrous pulleys that worked like muscles, sheathed in a protein gel and covered with a tough artificial skin.

  None of the prototypes was yet the equal of what he and Elchan had created before, but the interface was superior. Each experimental limb terminated in a set of receptors, and Ptolemy had attuned them to his own mind. If he concentrated his thoughts on a specific action, he could provoke a response from the nerves and artificial muscles, but it required a concerted effort. The goal was to make the interface so sensitive that the artificial limbs would respond subconsciously. A person could not function well if every little movement required effort and planning.

  The research programs Ptolemy had known throughout his life were collegial and open-minded, designed to benefit everyone. In their youth he and his brothers and sisters had played games, imagining parts of society they could help, envisioning an intellectual and creative utopia after the defeat of the thinking machines. He realized now that such attitudes were dangerously oblivious to the fact that evil, ignorant, and destructive forces existed.

 
