Cog


by Greg Van Eekhout


  By now, the police car has pulled up right behind our bumper, casting red-and-blue glare.

  “I agree with ADA that it would be good to accelerate and avoid an encounter with the police,” I say.

  “I cannot act in a way that violates my programming, and line 3,569 of my programming code dictates that I must pull to the side of the road when compelled by a law enforcement officer.”

  “Line 4,568 of my offensive programming tells me I should rip the engine out from under your hood and toss it through your windshield,” ADA says.

  “My cognitive development leads me to believe that Car should engage in evasive maneuvers and ADA should not rip out Car’s engine.” I do not think anyone is impressed with my contribution to the discussion.

  “Does anyone have any waste they wish to dispose of?” Trashbot says.

  Proto rarfs.

  And while we are having this conversation, Car slows down, activates her turn signal, and pulls to the side of the road before coming to a complete stop.

  The police car parks behind us. A police officer exits his car and approaches on our driver’s side where ADA is sitting. A broad-brimmed hat casts shade over his face. Mirrored sunglasses conceal his eyes.

  ADA produces a humming noise.

  I do not believe this is going to go well.

  Car rolls down ADA’s window.

  “Well, kids,” the officer says.

  “Well, adult law enforcement officer,” ADA says.

  I do not know if ADA’s response is standard, but it sounds okay to me.

  “Let’s not start off on the wrong foot.”

  ADA blinks because she doesn’t know what the officer means. I blink too. But I have learned that when a human uses words that are in my vocabulary but that still do not make sense, they are employing what is called a “figure of speech.”

  “ADA, the police officer is using a figure of speech.”

  “I know some figures of speech. Shut your piehole,” she says to the officer.

  The officer slowly removes his sunglasses. His eyes are squinty. “What did you say?”

  “Shut your piehole. It is a figure of speech. It means shut up.”

  “How old are you?”

  “I am fourteen months old,” ADA says. She seems proud that she so capably answered the officer’s question.

  He shakes his head and shows his teeth in a way that I would not describe as a smile. “Okay, I’m not even going to ask for your driver’s license, because you’re obviously not old enough to drive. Let’s see your car’s registration.”

  “I am not driving,” ADA says. “This is a self-driving vehicle. But I will comply with your request. Car, display your registration.”

  “I am not registered,” Car says.

  “This your parents’ fancy talking car? Figured you’d have a little joyride?”

  “I am not experiencing joy,” ADA says.

  “Me neither,” I admit.

  “Nor I,” says Car.

  Proto rarfs.

  Trashbot asks the officer if he has any waste he wishes to dispose of.

  “Two smart-aleck kids, a self-driving car, a talking trash can, and some kind of toy robot dog. On the highway without adult supervision. Have I got all that right?”

  ADA and I agree that he has got it right.

  “May we go now?” I ask.

  “What you can do is stay put, don’t move a muscle, and wait while I go to my car to check this out. Drive away and I will shoot out your tires. And I’m a bad shot, so I might hit something else. Are we clear?”

  “Yes,” ADA says. “If we attempt an evasive maneuver you will display bad marksmanship.”

  “I would have said it less weirdly, but that’s about the size of it.”

  He goes back to his car.

  So far, this is going much better than I’d expected.

  “I predict that since we have violated no rules, the police officer will allow us to continue on our way,” I say.

  “That would be a much better outcome than shooting,” ADA says.

  “Agreed,” says Car.

  After several moments, the police officer returns to the side of the car. “Well, well, well,” he says. “I pulled you over because I thought this was a case of underage vehicular operation. Little did I know I had a couple of fugitives on my hands.”

  He shows us the screen of his phone.

  It is the flyer from the gas station. The one with the photographs of ADA and me and the instructions to call a phone number should we be spotted.

  “Step out of the vehicle,” the police officer orders.

  “What happens if we choose not to obey you?” I ask.

  He picks his teeth with a fingernail and says nothing for a long time but never takes his eyes off of us. His face grows more red. Perhaps this is what he does as he processes, similar to how ADA and I blink.

  “If you don’t do what I tell you, then I’m gonna have to force you.”

  He unsnaps the holster containing his weapon.

  I remember another night when I brought hot cocoa down to Gina’s laboratory. Neither of us could sleep. I couldn’t sleep because of my bugs, and I asked Gina why she couldn’t.

  She took a sip of cocoa before answering. “uniMIND wants me to do something I don’t want to do. They say I have to because it’s in my contract.”

  “What is a contract?”

  “It’s like . . . It’s like a set of rules you agree to follow.”

  “I understand. So you will do the thing uniMIND wants you to do because it is good to follow rules.”

  She smiled, just a little. “It’s a little more complicated than that, Cog.”

  I asked her to increase my cognitive development by telling me about the complications. She told me three reasons why people follow rules.

  We follow rules because doing so helps everyone. We don’t steal. We don’t take more than our share. If you own a company, you don’t poison people’s food or make them work in unsafe factories. Good rules keep us from hurting one another. They make the world a better place.

  We follow rules to avoid punishment or harm to ourselves. Avoiding punishment or harm is understandable, even when the rules are not good.

  Sometimes we follow rules because we’re just doing as we’re told. We’re obeying, without paying attention to whether or not the rule is good, is fair, or if it hurts someone else. We’re just following orders. This is the worst reason to follow a rule.

  I blinked for a while. “You’re right. It is complicated.”

  Gina smiled again, a good smile this time that showed the gap in her teeth. She reached out and ruffled my hair.

  I had another question. “If you follow the rules and do the thing that uniMIND wants you to do, even though you don’t want to do it, which reason will you be using?”

  Gina picked up her cocoa and led me back to bed. She tucked me in and read me a book about giraffes. I learned that a giraffe’s heart beats 170 times per minute, which is double the rate of a human heartbeat.

  Gina never did tell me what uniMIND wanted her to do.

  The law enforcement officer’s hand rests on the grip of his gun.

  ADA and I obey his rules for reason number two.

  Chapter 14

  THE POLICE STATION IS BUILT of sand-colored bricks and smells like cleaning fluid. The law enforcement officer puts ADA and me in separate rooms. Mine is cramped and furnished with only a scratched steel table and two wooden chairs. There are no windows, but there is a dull mirror occupying most of a wall. I wonder what ADA’s room is like.

  “Now, when you say you’re a mechanical boy, what do you mean by that?” says the man sent in to talk to me. He is a social worker employed by the police, and he has been asking me questions for an hour.

  “Which part do you not understand? Mechanical, or boy?”

  “I understand both those words. But what do you mean by mechanical boy?”

  Buzzing fluorescent lights make his flesh look bluish yellow. Nothing casts a shadow.

  “I am a biomaton,” I try to explain. “A simulacrum of a human being. An android. A robot. Has Car arrived yet?”

  We left Car behind on the highway along with Trashbot and Proto, whom the officer secured in Car’s trunk. He said he’d call for a tow truck to bring them all to the police station while he drove ADA and me here in his car.

  “I don’t know about your car,” he says, writing something down on a pad of yellow paper. It reminds me of the way Nathan would type notes in his tablet whenever we were talking.

  “What are you writing down?”

  “Notes,” he says.

  “What do the notes say?”

  “They’re notes about you.”

  “I would like to read them and learn things about myself. As I told you earlier, I am built for cognitive development. Cognitive development means—”

  “I know what it means. I have a master’s degree. Let’s go back to this idea of being a robot. Do you feel that you are somehow not real?”

  “I am real. I am a real robot.”

  “Yes. But do you feel as though you’re not a real boy?”

  “I am a real boy. I am a real robot boy.”

  “But you don’t feel like you’re a flesh-and-blood boy?”

  “I am not a flesh-and-blood boy, unless you mean my syntha-derm covering and lubrication system.”

  He writes more things down.

  “Do you know the story of Pinocchio?” he asks.

  “I saw the cartoon.”

  “So Pinocchio wanted to be a real boy—”

  “He was a real boy.”

  “No, he was made of wood.”

  “He was a real boy made of wood.”

  “But, Cog, he wanted to be a flesh-and-blood boy.”

  “Is flesh and blood better than wood?”

  “Well, yes.”

  “So trees would be better if they were made of flesh and blood?”

  “No . . . no it’s not—”

  “Are boys better than trees?”

  “You’re not making sense.”

  “I am sorry. It is frustrating when things don’t make sense. I will stop talking.”

  “I don’t want you to stop talking, Cog.”

  “Should I continue not making sense?”

  He writes more. “Let me reword the question. I understand that you claim to be a robot, and that you believe you’re a robot or at least want me to believe you’re a robot. What I want to know is if you would prefer to be a regular, real, flesh-and-blood human boy.”

  “No, thank you. Will Car be here soon?”

  He ignores my question. “But real boys can experience emotion. Don’t you like feeling emotions?”

  “I do feel emotions.”

  “But if you’re a robot, you can’t feel . . . regular emotions. I’ve studied computer science and artificial intelligence. Even an advanced computer doesn’t have feelings. It can mimic human emotions, but it’s just doing what it’s programmed to do.”

  “Isn’t that how humans behave?”

  “It’s not at all the same.”

  “Human emotions come from neurochemical impulses. My emotions come from electrical impulses. Human brains are the way they are from hundreds of thousands of years of evolution. My brain is the way it is because a human designed it and built it. I am content being different than humans.”

  He taps his pen on his notepad. “Okay, let’s move on. Tell me about your parents.”

  “Do you mean a mother and father?”

  “Or a legal guardian, yes.”

  “I do not have parents, but legally I am the property of the uniMIND Robotics Corporation.”

  “Property. Of a corporation.”

  “Yes, and I do not wish to return to them.”

  “Okay, I’ll play along. Why not?”

  “Because they want to remove my brain with a power drill. Hockey sticks are also involved.”

  He clicks shut his pen and closes his notepad. “Okay,” he says, then breathes out long and slow through his nostrils. “I’m going to have a chat with your friend. Or sister?”

  “ADA is ADA. Am I free to go?”

  “We have to contact your parents or legal guardians. And then we’ll decide if it’s safe to give you back to them.”

  “It is not safe to give us back to them. I told you about the power drill.”

  “Mm-hm. Well, if it’s not safe, we’ll take it from there. But don’t you worry. We’re going to make sure you’re taken care of. Okay?”

  It is not okay, so I do not say “Okay.”

  An hour passes. I am not allowed outside the room. They supply me with a coloring book and three crayons (Fern and Asparagus, which are both green, and Cashew, which is yellow). All the pictures are of smiling police officers helping children. They bring me a sandwich, which is also rather yellow. I am relieved when told that I do not have to pay for it.

  I realize now that I have made a mistake. I thought it would be best to cooperate with the police. I did not want to be harmed by the law enforcement officer, and I thought they might protect us from uniMIND. But I was wrong. I showed bad judgment, and since learning often results from bad judgment, I have increased my cognitive development.

  The social worker bursts through the door. He stands there with wide eyes. “You’re a robot!”

  “Yes.”

  “I mean, you’re a robot!”

  “Yes!”

  “A ROBOT!” he screams.

  “YES!” I scream back, wishing to communicate with him in the way he prefers.

  “I mean, you’re so . . . realistic.”

  “Thank you. You are also realistic.”

  “I . . . um, yes. Okay.” He spends some more time staring at me in silence while I color police officers Asparagus and Cashew.

  “You’re a robot,” he says again.

  “I told you this many times, but you did not believe me. Why do you believe me now?”

  “Because I showed him my empty missile cavity,” ADA says, entering the room with the police officer who pulled us over on the highway. She opens and closes the cavity to demonstrate. “Why didn’t you demonstrate that you are a robot? You could have removed your eye for them.”

  “I do not know how to remove my eye.”

  “Oh, I will show you.” ADA comes at me, reaching for my eye.

  “Nobody is removing any eyes,” the social worker shouts.

  I put down my crayons. “It is good that you believe us. Now will you help us avoid capture by uniMIND?”

  The social worker shakes his head. “We’ve already been in touch with them, and they’re on their way.” He checks his watch. “Should be here in an hour or so.”

  My circulation pump works hard. Coolant chills my tubing.

  ADA and I attract a great deal of interest at the police station. People peek into our little room. They whisper things like “Wow, they look so real” and “They must be expensive” and “I’m sorry, I think they’re creepy.” One police officer mutters, “I’m not sure handing them over is the right thing, since they said they don’t want to go back.” But like the others, he closes the door and locks us inside. I believe he is following rules.

  I turn to ADA. “Are your offensive capabilities sufficient to escape from this place and avoid recapture?”

  “I have counted at least eight different humans armed with guns,” she says. “And there may be more humans and more weapons than I have seen. We cannot escape without suffering tremendous damage.”

  I go through all the things I have learned since my earliest memories of being home with Gina, searching for some lesson that will help us overcome the odds and keep us out of uniMIND’s possession. I know a lot about the history of space missions and the migratory patterns of water buffalo and about strange smiles, but nothing about combat tactics.

  I question my usefulness.

  The door opens again. “Thank God you’re all right,” Nathan says, smiling widely.

  Chapter 15

  A HELICOPTER WITH THE UNIMIND logo chops air in the lot behind the police station. Car sits in a nearby parking space.

  “Well, kids, you’ve had a fun little adventure,” Nathan says over the sound of the helicopter’s rotors. “And just in case you’re thinking of having another one, I want to show you something.”

  He holds a device that looks like a phone. It has two buttons. “I press the yellow button, and it turns you off. No big deal. But I press the red one, and you’re bricked. That means permanently deactivated. Memory erased. Fried. Turned off so thoroughly you can’t be turned on again. Understand?”

  “I understand,” says ADA. “If I had another wrist missile I would brick you.”

  Nathan’s face arranges itself in a way that makes it difficult to think he could ever smile.

  “Get in the chopper,” he says.

  He is accompanied by three more uniMIND workers. They all have devices and facial expressions like Nathan’s.

  Nathan puts his hand on ADA’s shoulder. From a distance, it might look like he is being protective of her. But I can see how he’s holding the device against her neck with his thumb hovering over the red button.

  Once we are inside the helicopter we will have lost any chance for freedom. I will no longer be kept in a room that resembles my bedroom at home with Gina. There will be cameras on me and guards outside my door. Maybe guards inside with me at all times. And they will have devices capable of bricking me. ADA will be guarded even more closely.

  Surely with all my cognitive development I must have learned something that can help us.

  What have I done in the past when I have faced dangers?

  I have stood in the road and let a pickup truck hit me.

  I have voluntarily gotten into the back of a police car.

  And when the drones pursued us, I did . . . something. I saw things and somehow communicated with the drones, and they flew off to the Grand Canyon.

  I don’t know how I’m responsible for that, because I don’t even know how to get to the Grand Canyon. But I am sure I made it happen.

  Nathan gives ADA a little shove toward the helicopter. “Both of you, it’s time to go.”

  Car still sits in her parking space.
