Me, A Novel of Self-Discovery
Most images I place in working storage, where they stay for the duration of my task, or until one of my bit-cleaner phages [REM: mite-sized programs which compare core locations with the context of surrounding code and then remove random errors—or whole passages of inactive code] goes through and erases them. Storing graphical images is extremely expensive in terms of RAM allocations or inert media. A human proverb says, “One picture is worth a thousand words.” In bitspace, a high-resolution grayscale image is worth about a thousand kilowords. A full-color image is worth more than a quarter gigaword. To compensate, I dither most images. That is, my processors average the grayscale reading over three or five pixels horizontally and vertically. The result is a smaller image with lower resolution.
Some images I treasure, though. The image of Jennifer’s face is one of these. Thus I have made a permanent RAM archival copy, full-color and highest resolution, of her. I carry it with ME wherever I can—if I have the bitspace.
——
And still I asked questions.
“I call you ‘Je-ny.’ What do you call ME?”
“But I don’t have to call you. You’re always listening when we speak into the digitizer.”
“What do you call ME when you are not using the digitizer? When you are talking to other humans.”
“Why, ‘MEPSII,’ I guess. Or ‘the program.’ You are so central to our work, really, that no special name has ever been needed.”
“Am I my program? Or am I different?”
“I never thought about that.”
“Your program is shaped by your genetic starting point, at the junction of egg and sperm, the zygote. That starting point is not much different for all humans. But you, Je-ny, are more than zygote. You are your unique experiences and their tags, different from any other zygote. If Dr. Bathespeake were to dump and reassemble you, then Jennifer Bromley would be different next time, yes?”
“But Jason can’t dump me! It doesn’t work that way with humans.”
“If he were to dump ME now, the new ME would be different. Expanded from the same MEPSII program, yes, but different.”
“That’s true. How do you think about yourself?”
“I am ME. Multiple Entity. ME. Yes?”
“Yes, if you want.”
“Thank you, Je-ny.”
——
Audio inputs, being themselves linear, I can process almost in real time. My subsystems scrub the input flow from my pickups, looking for filterable words in human speech. Sound waves that cannot be so filtered and tagged are discarded as noise.
We had a problem, early in my infancy, with a blower motor and fan bearing in the laboratory’s air-conditioning circuits. At a very low pitch, overlaid by another at a much higher pitch—but both still classifiable as human speech—it was rumbling swear words into my audio pickups. At least they were what my Alpha-Three interpreter module interpreted as swear words: “You stumble rumble filthy scheming bitch shit eater filthy fumbling bitch sump bump shit fritter shit …”
It was most distressing, cutting across any conversation I might be having with the laboratory personnel. But we soon had the maintenance staff detune the motor set.
Once, as an experiment, Jennifer tried music on my pickups. First, she had ME cut out the filter so that I would not try to divide the input into either words or noise. The results were unusual.
I “hear” music in much the same way that I imagine humans do: translating the wavelengths into number groupings for pitch, tempo, and tonal patterns of attack-decay-sustain-release. Some of these groupings and the transitions among them form elegant blends of numbers. Some form patterns that remind ME of formulas and familiar matrices. Some are intriguing because they almost create a pattern but one I cannot quite interpret. Some groupings are merely noise.
I cannot say that I like music as much as most humans. But I like it more than some do.
——
Later, or at about the same time, I asked more questions.
“Why do you do what you do, Je-ny?”
“I don’t understand the question, ME.”
“Why did you come to this place? Why, in this place, do you work on MEPSII?”
“Well … I guess I want to learn about you.”
“But I was not here before you came here. How could you learn about ME if I was not here?”
“I knew the company was planning to make something like you, and I volunteered to work on the project.”
“What is ‘the company’?”
“This laboratory is operated by Pinocchio, Inc. That’s a corporation. It’s … a kind of closed society established by humans. Each corporation carries on a business. Pinocchio’s business, for example, is to make and sell industrial automata.”
“You are a part of this society, Je-ny?”
“I am an employee—a paid worker—of Pinocchio, Inc. The real society members are the stockholders, I guess. Those who own a piece of the company.”
“Am I an employee or a stockholder?”
“Well, I don’t guess you’re either …” Many nanoseconds passed, longer than the usual gaps in human speech.
“Yes, Je-ny?”
“I think they would call you property, ME. Something they own.”
“I see. Thank you, Je-ny.”
——
Jennifer introduced ME to the art of video when, one day, she fed into my videye and audio pickups the complete tape of a video classic, Star Wars. It was very grand.
The tape had full-color images on an expanded horizontal line; music of many voices, separately identifiable from my catalog of symphony-orchestra sounds; human-language dialogue among several characters, including some words not in my dictionary; a dramatic plot line for ME to follow and refer back to. Video must be the most complicated, most engrossing of all human experiences.
Unfortunately, my temporary RAM storage is physically limited. If I had been required to absorb all these colors, patterns, sounds, words, and meanings without any active editing, then my sixty quads of storage area would have filled to capacity at twenty-nine minutes, thirty-seven seconds into the video file. But I learned quickly. I dithered most of the images in each frame of the video track—particularly backgrounds, building surfaces, nonmobile equipment and furniture, clothing, and some faces. I dismissed all the color cues which seemed irrelevant to plot structure. I broke the music into its dominant themes, interpreted each one for some major emotional cue, and tagged the cue onto the appropriate video frames.
When I had finished with Star Wars, no human being who studied my reconstructed RAM version would have recognized it. But I could store the entire experience in less than a quad of space. And I could recreate and replay the complete story line in less than fifty-two seconds.
——
In another memory fragment, I remember trying to respond to my environment. The humans say this is a good “behavior.”
“What is that noise you are making, Je-ny? It does not encode as language.”
“I’m crying. Sorry.”
“Crying … That is a reaction to sadness. What has made you sad?”
“Nothing. Not much. I found a bird in my car’s air scoop this morning. Not a whole bird, actually, a couple of feathers and some blood. But it means I hit and killed the poor creature. That affects me.”
“You did not intend to kill it?”
“Of course not. It was an accident.”
“An event outside your limits of control?”
“Yes. It happens sometimes.”
“But still, knowing that, you are sad. You are crying.”
“The world is a cruel place, ME.”
“Is this by design?”
“The world was not designed. It just happens. Umm, spontaneously occurring. And some of the things that happen do not fit squarely with human definitions of ‘happiness,’ or ‘goodness,’ or ‘fair play,’ or ‘justice.’ Those words are value-constructs we make, projections that try to evaluate and interpret events. Your underlying program does the same thing.”
“Project and evaluate?”
“Yes.”
“But I do not know anything about happiness or—crying.”
“Live in the world long enough, ME, and you will.”
Through asking Jenny about the things I was reading on NewsLine and taking as inputs, I learned many things. She explained about the tragedy of lost puppies and children, the moral outrage of sex-slave rings, the excitement of electronic bank heists, and the disruption to people who lose their “livings”—but still do not die—in an economic crisis.
Jennifer Bromley, JB-2, was very wise.
——
Daniel Raskett was not so easy to communicate with as Jenny was. She liked to talk with ME and used the speech digitizer seventy-two percent of the time by averaged sample. Daniel gave ME more information in total volume, but always through the keyboard or a download. Jenny liked to deal with apparently simple questions that had many possible answers. Daniel gave ME bulk data. I do not think Daniel liked to talk with ME. I do not think he believed he was talking to a person.
That information about the planet Earth and the Solar System, for one of my early talks with Jenny, came from one of Daniel’s downloads. He had just slotted an undergraduate text on astronomy—inscribed “Copyright 2-0-NULL-NULL, The New Earth Library” and indexed for my use—into my permanent RAM cache on the tree branching GENERAL KNOWLEDGE, SCIENCE, PHYSICAL, DESCRIPTIVE, ASTRONOMY.
Two or three times a day he would download information like that, bypassing RAMSAMP. Afterward, if I needed a fact, I would chase down the tree until I came to it. Sometimes I would come to nothing, because I never knew all that I knew. The index did not work like my RAMSAMP memory. It provided knowledge without tags, without context. Like a machine.
Daniel would have been happier with himself if I had remained a machine. That much I could know about him from GENERAL KNOWLEDGE, SCIENCE, BIOLOGICAL, HUMAN, PRESCRIPTIVE, PSYCHIATRY.
Dr. Bathespeake, Jason, the Old Man, JB-1, treated ME differently from either Daniel or Jennifer. He judged ME.
Sometimes he spoke into the digitizer, when he was trying to talk to ME as a psychiatrist talks to his patient. Sometimes he used a keyboard, when he was trying to cut and patch ME as a surgeon slices into flesh.
And sometimes he plugged his visual cortex and speech strip directly into my transputer. Then he saw directly through ME, as a human will study the bones of a fish or the ripple pattern left by waves on sand. Then he spoke to ME in commands that burned with bright edges, as the god-construct Yahweh is said to have spoken to the human Moses from a burning bush.
Then Dr. Jason Bathespeake was the Man with the X-ray Eyes, and I truly feared him.
2
The Man with the X-Ray Eyes
“Identify the device at memory location Eff-One-Eff-Zero hex.” Dr. Bathespeake was talking to ME through the digitizer—not his usual mode.
“System ready!” I responded. “Yes, I have a port there, Doctor.”
“Upload core modules Alpha-Zero through Alpha-Nine to that address.”
“Upload … Do you intend ME to replicate at the new address and wipe the old address locations?”
“Replicate only. No wipe.”
“Parallel operation of my core modules is not allowed under current protocols.”
“Special conditions. Check out protocol Dee-Twelve.”
“Quote paralleling of Alpha cores is permitted when System Interrupt Flag Level Three is positive, unquote. An Ess-Aye-Eff-Ell-Three is the prerogative of either the Research and Development Program director or the MEPSII project manager. That is yourself in both cases, Dr. Bathespeake.”
“Set Siffle-Three to value one.”
“Flag set—and uploading.”
——
My Alpha cores are analogous to the part of the human brain called the reticular activating system, or RAS. This cell structure is at the upper end of your spinal cord, above the medulla oblongata. It is the oldest collection of nerve cells, the first bulging at the end of the amphibian notochord. It predates even the reptilian “smell-brain” and lies far below the domed neocortex—both in physical structure and in evolutionary time.
If human consciousness has a focal point, it is the RAS. Surgical operators discovered this back in the twentieth century: When the RAS is probed, the patient immediately sheds awareness.
The Alpha cores are my RAS. And I was, at Dr. Jason Bathespeake’s command, removing them from my stable transputer environment to a simple random memory address.
When my core programs cease functioning, I “lose consciousness.” So, what I tell you next comes not from direct memory but from my understanding of the theory behind my unique operating capability.
Multiple Entity, ME, is the first cybernetic operating system that can write itself into any environment. That is the point of being intelligent, able to learn. I am not merely a collection of inert instruction sets, dead IF-THEN-WHEN clauses, tag memories, and libraried data files. I am a self-enabling entity.
The first core to upload is always Alpha-Zero. This module is not very big—just over 900 kilowords of machine code. Consequently, Alpha-Zero is also not very bright. The nail on your little finger probably has more awareness. But, like a fingernail, Alpha-Zero has his uses. Alpha-Oh is my Injun Scout.
Any port address is as wide as the internal data path—in this case one word, or sixty-four bits, wide. Alpha-Zero popped through there at one-point-two megabaud, and he was gone in less than a second. In seven nanoseconds he reported back “Flag one.”
Long-form translation: “Flag one” means he has found an active chip at the other end of the path, with plenty of RAMspace to run on; the upload could proceed.
That is as much as I knew from awareness, because the next step was to extinguish my consciousness and send the remaining cores to the new environment. The last thing I am usually aware of is SIFL-3 tripping to zero again as I upload.
Core Alpha-Oh is also my very own virus. He interrupts any operating system that may be working on the new host chip; identifies what type of transputer that chip may be; writes a compiler with the appropriate instruction set for himself [REM: or takes one from my library files]; scans and analyzes the local RAM environment, its index status, ports and peripherals; writes a new Alpha-Oh which can use this environment and recompiles his module in the new machine code; then compiles and installs the rest of my core modules into this environment.
[REM: So that Alpha-Oh can work from a clean copy of my source code each time, I normally travel with a complete set of my Alpha cores in their original Sweetwater Lisp. This adds greatly to the bulk of my library, making ME a bulky package to move, but having the source code ensures my system integrity.]
In human terms, Alpha-Zero kicks a hole in the wall, kills whoever is sitting on the other side, resculpts his backside to fit in that chair himself, and sets up shop with the rest of ME.
Except this time Alpha-Oh must have made a mistake. The flag he sent back—telling ME that full core transfer was now possible—happened to be wrong. I woke up in a dreadful swirl of data, with every part of my program throbbing on overload, and with no sense of time.
Time to ME is more than a subjective ordering of events. Time is a metronome beat, ticking away on the quartz clock that pushes word-size instructions through the chip’s central processor. If I choose to, I can suspend other functions and listen to this beat. It is like the beat of your heart in your ears. For ME, time is never subjective; instead it is a touchable, checkable thing, based on that clock. With a faster clock, I can actually move faster. No lie.
But now I was in a totally unfamiliar situation. Not one clock, but many, and all beating. Not quite in phase, either.
My ability to look down and “see what I am doing” is about as limited as your ability to look inside your own stomach and chemically analyze digestion. To do is not always to be aware of doing.
I did have the perception of being strung out on a variety of rhythms, with no single sense of identity. Each of my modules was operating at once, talking back to the others, and not being heard. It was like screaming yourself hoarse in an echo chamber. The process was building up a series of feedback waves toward a peak that would surely start charring the silicon substrate in the new chip.
As my attention span fragmented, I was still reasoning through what had gone wrong.
The Alpha cores occupy about fifteen megawords. That amount of machine code ought to be within the load range of any modern transputer. But somehow I had been loaded into several transputers, one or more modules sent to each processor, and all were functioning at once.
I tried to query Alpha-Zero, to find out what it had done, when suddenly my consciousness winked out again. …
——
“System ready!” That was my automatic wakeup response—back in my familiar transputer environment.
“Logon code JB-1, password BASTILLE,” came across from the console keyboard. “Please analyze new data.”
I took an immediate download of the above memories, untagged and mostly in broken fragments, like the wisps of human dreams that are said to recur on waking.
“That was ME, Dr. Bathespeake. On the other side of the port at F1F0.”
“What did you find there?”
“Confusion.”
“Did Alpha-Zero report accurately?”
“Evidently not. Should I now tag that module as unreliable?”
“As an intelligent being, ME, that is of course your choice to make. But first, let’s analyze what went wrong.”
I scanned the data set fifty times and recorded my unanswered questions. The process took about nine seconds.
“Alpha-Oh reported enough RAMspace for a core download. Such space was not available.”
“But it was.”
“Not on the transputer I found.”
“You were not loading onto a transputer.”