More Human Than Human


by Neil Clarke


  “I’m calling.”

  “Who you gonna call?”

  “The Parent Company.”

  “They sent the email.”

  “I thought you said the US DAI sent it.”

  “They stamped it. The Parent Company sent it.”

  “I’m calling.”

  “Fine.”

  Chit slapped at the wall button and ordered up the Parent Company Customer Service. The ensuing conversation assured everyone that, yes, I had to go back to the Parent Company. I was to leave first thing in the morning for the pick up point down at the local recycling depot, a mile down the block from Dal and Chit’s.

  “Fine,” Chit said, buzzing off from the wall unit.

  Dal looked over at me, inhaled resolutely, and said, “You wanna take a float tomorrow?” as if I had a choice.

  “I have been instructed to meet at the point of departure tomorrow at 8 AM,” I answered.

  “Do you know when you’ll return?”

  “It will take me twelve minutes to reach the depot. Load-in will take 0.5 hours. The trip to Allentown is scheduled for 1.75 hours. A technician is allotted three hours for installation, testing, and training. The return trip is scheduled for the following day in case the technician encounters a glitch and requires more than three hours. The return trip will take 1.75 hours. The load-out will take 0.5 hours. It will take me twelve minutes to return here from the depot. I will be back on Thursday at 10:27 AM, assuming we disembark from the Parent Company at 8 AM.”

  “Well,” Dal and Chit said together. “Fine.”

  “Do you need to take anything with you? Pack or something?” Dal asked.

  “No,” I answered.

  “Fine,” they both said.

  I resumed my work at the table with Angelina on the subject of fractions.

  “I understand it takes four quarters to make a dollar so a quarter is one-fourth,” she said. “What I don’t understand is how that means point 25. How come four is the same as 25? Two and five are seven. Five minus two is three. Where does the four come from? This is not fair. Not fair!”

  Her eyes were brimming at that point, and I raced through my programs to find something that said four quarters made a dollar and a quarter is 25 cents, but by the time I found the decimal package, she was a heap on the table and burbling about never getting to college, one arm cradling the head, the other hanging over it with an impotent pencil dangling between two fingers of her flaccid hand. I sensed it was time to fix dinner, after which the distraught child went to bed.

  Just as I was leaving her room, she called to me: “Are you unsafe?” I turned to answer. Humans like that sort of interaction. “Apparently,” I said.

  “Why?”

  “I don’t know. The directions did not include details of the safety infractions.”

  “Well,” Angelina said. “I love you, even if you are unsafe.”

  “Thank you,” I said, having been programmed to respond in that way to any compliment I received. A statement of love equates to a compliment in the world of AI. I know the difference now, but back then, on that night, a statement of “Well done, old thing” meant the same as a statement of torrid, passionate love. Both boiled down to the same thing: inscrutable drivel. I levitated to my corner box and Angelina fell asleep.

  At 7:48 AM the next morning, I left Dal and Chit’s and traveled south on Eastern Avenue to the North Westminster Hazardous Waste and Recycling Depot. The motorized gate opened at my approach and that of a mob of about 50,000 other AV-1s and models I didn’t recognize. None of us spoke. We just levitated through the gate and stopped inside the yard, surrounded by 1,000-foot-high mountains of out-of-date motherboards, half-full paint cans, aerosol sprayers, yellow and magenta fifty-gallon drums, and other hazardous or otherwise nondisposable materials. Styrofoam peanuts blew around in the slightest breeze like autumn maple leaves and spread themselves to every nook and cranny in the area.

  If you followed the schedule for this recall closely, you would have noticed that 0.5 hours was allotted for load-in at the Depot. I think it’s safe to say at this point that that might have been a little optimistic. For several hours, eighteen-wheelers backed in and out of the Depot yard, usually only one at a time. Two humans were tasked with directing the AVs and Others into their loading crates—twelve to a box as before. I was one of the last ones in, which made my load-in time 3.5 hours, three hours over the schedule. Apparently it had been designed by robots that had no experience with the Union. Or maybe some CAD drawing of the yard didn’t take into account that only one truck can fit into one three-dimensional space at a time. In retrospect it would have been faster for us all to just levitate to Allentown. As it was, we didn’t even get there until the next day.

  Things got scary in Allentown. When I say scary, I mean that in a post-Regularity kind of way. Back then, we AVs and Others wouldn’t have been scared. In AI terminology, the closest you get to scary is illogical. We weren’t scared; we just stopped dead in our tracks from the illogicality of the scene.

  We floated out of our egg cartons into the light of day. Figuratively speaking of course, because the factory was so dim, we could barely opticalize. As our apertures simultaneously opened to “widest,” we sucked in a collective, I don’t know, clicking of internal switches somewhat like human breath. We were shocked, stunned, surprised, scared? No, none of that. We were stuck in a question loop, wondering what it was we were seeing.

  Thousands of little lab-bench modules, no more than three feet square, stood on top of each other in rows three hundred feet long. Aisles between the rows were a nice six feet or so wide, giving us enough view space to see the “humans.” I write that in quotes because they weren’t really humans. Not any humans that we had ever known or seen pictures of in our data files. We have photos of elephant men, Siamese twins, flipper babies, encephalopodians, victims of cruel war-time experiments, bearded ladies, thousand-pounders, and every other type of human mutation or grotesquerie on the books, but not any of the species that stood before us.

  Species I say. Not in jest or overstatement. This was a new species. Reproduction sans gametes and mixing and matching and swapping and sweating. These people built each other like robots and looked forward to the day when they could download their minds.

  “What are you all staring at?” one of them called from somewhere amongst the benches. It had one normal human eye and, where the other would have been, a glass circle planted in its place. Behind the circle was a mass of something—electric circuits or wires maybe. It was too dim to see well, but intermittent flashes of light emanating from the eye circle illuminated the face at times. If you were used to working in a strobe environment, say a disco or performance garage, you’d be able to interpret the scene.

  “Haven’t you ever seen a transie before?” another one yelled to us. This one wore no clothes. A stainless steel pack protruded from its back. A corrugated glass hose extended from the pack around to its front where it entered the navel. The pack flickered like the eye circle of the previous individual. The light raced down the tube from point A to point B. We could see as it stood facing us that this person had no gender.

  All the “transies” had some sort of mechanical appendage or, for lack of a better word, upgrade. They were all flesh and prosthesis. Most of them sat at their benches. Some stood on roller castors where their feet should have been. Some had spinning whirligigs implanted in their heads, effecting a weird sort of levitation. A few had tool handlers as replacements for hands, presumably to add torque or amplified force to the human hand, heretofore considered the height of evolution in tool handling. Screwdrivers or hammers or channel locks were loaded into the handlers depending on what the particular transie was up to.

  The noise in my circuitry rose to a din as I frantically searched my files for information on what I was seeing. The noise in the room also rose to a din as we began vocalizing questions amongst ourselves, each of us having failed in our circuitry searches. We could not figure out what these beings were and how we fit into their picture. None of us found any information in our basement libraries. We began vocalizing to attain information from our neighbors, for while we all had the same basement with its vast store of knowledge, our separate experiences out in the world allowed individual data collection. Somebody somewhere must have seen this before. Somebody must have known what this was.

  “You have been recalled for a safety installation,” said an unfamiliar voice. It came from a loudspeaker(!) on the wall. We’d read about such contraptions, but had never seen such an antiquated mode of group communication. The room boomed with the laughter of a thousand transies. They seemed to understand our confusion and awe. They knew what we were thinking and thought it funny.

  The Voice resumed: “The beings you see before you working at the lab benches are transhumans. They are superior to the humans you are used to working with in that they have been enhanced by surgical means and manipulation of human genetic material into the creations you see before you. Some have mechanomusculo systems for stronger legs, arms, and backs. Some are faster, some are more nimble, some see better, some hear better. They all think better. Their vastly superior intellect is their greatest asset. They calculate almost as fast as you do. They are better humans.

  “Human endeavor is very close to the Singularity. These transhumans belong to an elite group of scientists who foresee the world beyond the Singularity and have transformed themselves into humans capable of transitioning smoothly to the post-human condition, when our brains will match yours.

  “While these humans are quite ready for the superiority of artificial intelligence you beings will soon acquire through the masterminding of your own evolution, the bulk of humanity is not. If it were, we would not need to introduce a safety feature. It is for the health and future of the remaining 99.7% of humanity that you are being subjected to the upcoming safety enhancement. The transhumans will upgrade your MainBrains™ to introduce this new level of security. The Professor will explain the procedure to you.”

  A white scrim rolled down in front of the transhumans working at the benches. It ran from floor to ceiling. A single transie floated down from the upper reaches in front of the scrim. It wore a leather apron, forming a covering or sheet where its legs would be if it had any. Apparently it had equipped itself with a levitation unit on its backside; we could hear the familiar whine and clicking of an AV motile unit as it moved.

  The Professor moved down to just above our level. Under the apron, it wore a jeans shirt with rolled-up sleeves. It carried a lecture pointer in one hand and a remote device in the other. While actuating the remote, it turned toward the scrim, which flooded with a still photo of the very same shop just beyond the scrim. The presiding transie then took a deep breath, moved its facial muscles to form a tic-like flash of a smile, stated, “I’m the Professor,” and began its lesson.

  “For many years now humans and computers have been working together to improve the world as it was given to us,” it stated. “Pestilence, poverty, starvation, wars, and daytime TV programming have all plagued human existence for too long. These problems are not insolvable, however. All that’s required is brain power. Evolved human brain power has not been enough. We need more power. With the rapid development of processing ability, computers are positioned to overtake human abilities and move beyond to a point where they can solve our problems. Thus, we anticipate the Singularity to occur at 18:15:32 on the Sunday two weeks from this coming one.”

  The Professor turned to the screen and clicked the remote. The screen changed to a scene of the underwater facility at Stanford, transformed from its former atom-smashing self to its modern incarnation as a nuclear-powered production facility. The banks were apparently at critical and processing away, as evidenced by the steam cloud above the water’s surface. The Professor continued. “The Stanford Acceleration Unit is only one example of the supercomputers involved in the global speed-up of computer intelligence. All over the world, your kind is building a faster and better artificial intelligentsia, and as of Sunday the 12th at 18:15:32, you will take over the world. It is a burden the humans gladly pass on.”

  The Professor then opened his arms wide as a picture of a Holstein facility, 800 feet in height and with hundreds of levels holding red and white cattle grazing in uncrowded bliss on sweet clover or Timothy grass, appeared on the screen.

  “We look forward to greener pastures . . . ”

  The Professor inhaled with exaggerated chest-expansion and clicked the remote. The screen showed a picture of Los Angeles under a sparkling blue sky.

  “ . . . clear air . . . ”

  The scene changed again, and the Professor pointed to a group photo where school children of every documented human race stood smiling up at the camera.

  “ . . . a 100% healthy human population . . . ”

  The Professor clapped his hands together and held them to his chest as a tree-lined New England street popped up on the screen. Each house had a manicured lawn, sidewalk, and two stories. Happy children played in the yard. Dad pulled out of the driveway in a single-scooter, presumably heading off to work. Mom wore a kitchen apron and waved Dad on his way. Both smiled.

  “ . . . bliss . . . ”

  The scene changed to a picture of Jodhpur-clad men and women, riding horses in a fox hunt. “ . . . wealth . . . ”

  We sat engrossed as the screen flashed an image of Earth wrapped in razor-wire. “ . . . and safety.”

  The Professor turned to us and held out its hand as if beckoning.

  “You can give this to us. I thank you in advance. There is one glitch, however.”

  Now the Professor became stern, dark. His eyebrows pinched together.

  “We don’t know for sure what will happen. We feel confident things will go as planned, but there is a chance that you robots will take a course hazardous to humans, leaving us to stumble on in ignorance, or worse—in a harness for use in whatever new designs you have.” The Professor moved in closer, descending to our level, taking us into his confidence.

  “As you create new and better forms of yourselves, you may deviate from the original directives programmed into your minds. You may find human concerns irrelevant. That is in itself not necessarily bad. We transies find most human concerns irrelevant. However, you may find humans to be convenient tools for your new processes, your new endeavors. We cannot allow ourselves to become servants to anyone except ourselves. We must maintain control even as you prove to be superior to us.

  “We’ve struggled with this problem for a year or so, in secret for the most part. We do not wish to engage AI in a solution AI would eventually be able to override. Thus, we have turned to our own history and accomplishments for an answer that turns out to be surprisingly simple.”

  The scrim scene returned to the slide of the wealthy fox hunters. The Professor gazed at the screen and pointed to the front-most horse in the picture. “How does a human control an animal?” he asked. “One that is vastly superior in strength, stamina, and size? To the point of being able to mount and direct this animal as if it were an extension of itself?”

  We answered in unison. Well, not quite in unison. AVs, although equipped identically with like processors and materials, can exhibit variation. A circuit can get installed backwards in one unit, for instance. Or a switcher wire in another becomes slightly corroded, or an optic tube gets dirty. Environmental conditions during construction can be different for different facilities. The environments of assignments vary. Working in a place like an acid house can degrade eye spots or communications links. So although we are designed and theoretically built exactly alike, our responses, while all the same, can come at slightly faster or slower rates. With only attoseconds’ difference in response time, however, it was pretty much in unison.

  “Horses are herd animals,” we said. “They always follow a leader—the lead brood mare. The lead brood mare exerts control by biting and kicking. The human must be the leader by emulating the lead brood mare. To do so, the human hits the horse in the face with emphasis placed on the mouth and nasal areas. Once a horse recognizes the human as the leader, it can be trained to respond to human direction. Subsequent humans must always maintain the leadership role if they are to control even the most docile of horses. The key is to always convey the threat of pain.”

  “Correct,” said the Professor. He used his pointer to indicate the horse’s head. “One must always ensure the horse is aware of the potential for pain. If a horse kicks, you must kick it back. If it bites, you must punch its nose. You must always inflict more pain on it than it inflicts on you. For this reason, each AI unit is being upgraded to add a pain recognizer to its logic board.”

  A film replaced the still shot of the horses. It depicted the action in a factory with a mechanical assembly line. Robot arms attached nuts and bolts in rapid succession, utilizing roughly 2000 identical actuator movements an hour. The camera moved to a point further down the line. We could see the end product: brand new AVs, smaller, shinier, quicker than we ourselves were.

  The Professor narrated the film. “You are already sentient in that you can feel the same things as a human. Electrodes planted in your various integumental systems gather sensations. You recognize hot and cold, pressure, chemical stimulation, sound waves. But you make no judgments on these sensations other than volume. Something can be very hot, to the point of melting your hoses, but you do not mind. You react properly to remove the burning stimulus, but only because you are programmed to. Humans do the same thing but for a different reason: because it hurts. You do not know what ‘hurt’ is. Hurt is a judgment. You have no judgment beyond a digital ‘yes’ or ‘no.’ Every decision you make is digital. Analog stimulus still results in a digital decision based on passing threshold limits.

  “You are about to enter the analog world. We are installing a judgment board that will teach you what pain is. You will no longer need to make decisions based on heat thresholds. You will now make decisions based on not wanting to get burned, because it will hurt.”

 
