Animals in Translation


by Temple Grandin


  That was the answer. The fact that Jane’s building had an elevator operator provided the cat with the sound of Jane’s voice while Jane was still down on the first floor. That’s why the cat went to the door to wait. The cat wasn’t predicting Jane’s arrival; for the cat Jane was already home.

  DIFFERENT SENSE ORGANS

  Cats have really good hearing, so Jane’s cat was using a sensory capacity we humans don’t have. Animals have all kinds of sensory abilities we don’t have, and vice versa. (Our color vision is a good example of a sensory capacity we have that a lot of animals don’t.) Dogs can hear dog whistles; bats and dolphins can use sonar to “see” a moving object at a distance (a flying bat can actually spot and classify a flying beetle from thirty feet away); dung beetles can perceive the polarization of moonlight. I know dung beetles are insects, not the big animals I’ve been talking about, but an insect’s brain is so tiny that it makes what its sensory system can handle even more miraculous.

  There are two things going on with extreme perception in animals: one is the different set of sense organs animals have, and the other is a different way of processing sense data in the brain. With Jane’s cat, I’m talking mostly about a different physical capacity to hear sounds humans can’t.

  There are hundreds or maybe even thousands of examples of this in the animal world, lots of which we probably still don’t know about. A good example is the silent thunder of elephants. It wasn’t until the 1980s that a researcher named Katy Payne, of Cornell University, figured out that elephants communicate with one another using infrasonic sound waves too low for humans to hear.10 People who studied elephants had always wondered how elephant families managed to coordinate their movements with family members miles away. An elephant family could be split up for weeks, and then meet up at the same place at the same time. They had to be communicating with one another somehow, but they were way out of the range any human could either see or shout across.

  Katy Payne made a lucky guess about infrasonic sound when she felt “a throbbing in the air” next to the elephant cages at the Portland Zoo in Oregon. She’d had the same feeling as a child when the organ played in church. She started to think maybe the elephants were communicating with each other in a super-low range humans don’t hear. That would solve the problem of the long-distance communication, because infrasonic sound travels a lot farther than sound waves in the register humans do hear.

  She turned out to be right. Elephants “roar” out to each other below our level of hearing. During the daytime an elephant can hear another elephant calling him from at least as far away as two and a half miles. At nighttime, because of temperature inversions, that distance can go up by an order of magnitude to as much as twenty-five miles. It’s a huge distance.

  Now it turns out that elephants may be talking to one another through the ground, not just the air. Caitlin O’Connell-Rodwell, a biologist at Stanford, is working on this. She believes elephants can probably use seismic communication—making the ground rumble by stomping on it—to communicate with other elephants as far away as twenty miles.

  She figured this out by watching the elephants in the Etosha National Park in Namibia. Right before another herd of elephants arrived, the elephants she was watching would start to “pay a lot of attention to the ground with their feet.”11 They’d do things like shift their weight or lean forward, or lift a foot off the ground. They were listening.

  Dr. O’Connell-Rodwell thinks the animals are probably using the pads of their feet like the head of a drum. She and her team are also dissecting elephant feet to see whether they have Pacinian and Meissner corpuscles, which are special sensors elephants have in their trunks to detect vibrations. If they find them in the feet, too, that’s pretty good evidence elephants use seismic waves to communicate. A lot of animals communicate by thumping on the ground, including skunks and rabbits, so it won’t surprise me if we find out elephants are talking to one another that way.

  If elephants do have special corpuscles to detect vibrations, that would be an example of an animal species having extreme perception because they’re built differently and have different sense organs. Animals have all kinds of sense receptors we don’t. Another example: dolphins have an oil-filled sac in their foreheads, underneath their forehead bumps, that they use for sonar. The dolphin sends a sound through the oil (which “focuses” the sound) and out to objects in the water. The sound bounces back to the dolphin and his brain forms a sound picture of what’s out there. Humans can’t use sonar because humans don’t have any of the necessary sense structures.

  Humans also have sensory receptors a lot of animals don’t, like the huge number of cones in our retinas for seeing color.

  I’ve been talking mostly about vision, but all the other senses are different in different animals, too. There’s some fascinating new research about the relationship between vision and smell in New World versus Old World primates. Old World primates are the famous ones everyone knows about: gorillas, chimpanzees, baboons, orangutans, macaques, humans. New World primates are the smaller monkeys of the Americas. They usually live in trees in Central and South America; many have long prehensile tails, and they have flat noses. Tamarins, squirrel monkeys, sakis, and marmosets are all New World monkeys.

  Old World primates, like baboons, chimpanzees, and macaques, have trichromatic, three-color vision, but most of the New World monkeys (spider monkeys, marmosets, capuchins) only have dichromatic, two-color vision. (Some New World females have trichromatic vision, but not all.)

  What’s interesting about this is that Old World primates and humans also have very poor ability to smell pheromones, which are chemical signals animals emit as a form of communication. (Most people think of pheromones as sexual signals, like the pheromones a female in heat emits, but a pheromone is any chemical used for communication. Ants, for instance, leave trails of scents behind them for other ants to follow.) About a year ago researchers found that Old World primates and humans both have so many mutations in a gene called TRP2, which is part of the pheromone signaling pathway, that it’s not working anymore. In the course of evolution, the pheromone system in Old World primates, including humans, broke down.

  It turns out that when we gained three-color vision we probably lost pheromone signaling. Jianzhi George Zhang, an evolutionary biologist at the University of Michigan, ran a computer simulation to find out when the TRP2 gene started to deteriorate, and discovered that TRP2 went into decline at the same time Old World primates were developing trichromatic color vision, around 23 million years ago.12

  Probably what happened was that once Old World primates could see in three colors they started using their vision to find a mate, instead of their sense of smell. That theory fits with the fact that a lot of Old World primate females have bright red sexual swellings when they’re fertile, while New World monkeys do not. Once monkeys no longer needed a good sense of smell to reproduce successfully, their ability to smell probably went into decline as a direct result.

  That would have happened because “use it or lose it” is a principle in evolution. If monkeys with a poor sense of smell can reproduce just as well as monkeys with an excellent sense of smell, the monkeys with the poor ability pass all of their weak or defective smell genes on to their offspring, and any spontaneous new mutations in the smell genes don’t get winnowed out. It looks like that’s what happened to Old World primates. The normal mutations that happen in the process of reproduction just kept accumulating until no Old World primates had a working copy of TRP2 anymore. Improved vision came at a cost to their sense of smell.

  SAME BRAIN CELLS, DIFFERENT PROCESSING

  So far I’ve been talking about the sense organ or sense receptor part of animal perception: animals have different sensory organs than we do, organs that let them see, hear, and smell things we can’t. But the other half of the story is where things get interesting, and that is the differences in brain processing.

  All sensory data, in any creature, has to be processed by the brain. And when you get down to the level of brain cells, or neurons, humans have the same neurons animals do. We’re using them differently, but the cells are the same.

  That means that theoretically we could have extreme perceptions the way animals do if we figured out how to use the sensory processing cells in our brains the way animals do. I think this is more than a theory; I think there are people who already do use their sense neurons the way animals do. My student Holly, who is severely dyslexic, has such acute auditory perception that she can actually hear radios that aren’t turned on. All appliances that are plugged in continue to draw power, even when they’re turned off. Holly can hear the tiny little transmissions a turned-off radio is receiving. She’ll say, “NPR is doing a show on lions,” and we’ll turn the radio on and sure enough: NPR is doing a show on lions. Holly can hear it. She can hear the hum of electric wires in the wall. And she’s incredible with animals. She can tell what they’re feeling from the tiniest variations in their breathing; she can hear changes the rest of us can’t.

  Autistic people almost always have excruciating sound sensitivities. The only way I can describe how a lot of sounds affect me is to compare it to staring straight into the sun. I get overwhelmed by normal sounds in the environment, and it’s painful. Most autism professionals talk about this as just being super-sensitive, which is true as far as it goes. But I think autistic people are also super-perceptive. They’re hearing things normal people aren’t, like a piece of candy being unwrapped in the next room.

  It happens with vision, too; a lot of autistic people have told me they can see the flicker in fluorescent lighting. Holly’s the same way. She can barely function in fluorescent lighting because of it. Our whole environment is built to the specifications and limitations of a normal human perceptual system—and that’s not the same thing as a normal animal perceptual system, or as a normal-abnormal human system like a dyslexic person’s system, or an autistic person’s. There are probably huge numbers of people who don’t fit the normal environment. Even worse, half the time they probably don’t even realize they don’t fit, because this is the only environment they’ve ever been in, so they don’t have a point of comparison.

  Some researchers say that people like Holly have developed super-sensitive hearing because their visual processing is so scrambled. Super-sensitive hearing is a compensation, in other words. That’s always the explanation researchers give for the super-hearing of blind people; people who are blind have built up their hearing to compensate for not being able to see.

  I’m sure that’s true, but I don’t think it’s the whole story. I think the potential to be able to hear the radio when it’s turned off is already there inside everyone’s brains; we just can’t access it. Somehow a person with sensory problems figures out how to get to it.

  I have two reasons for thinking this. First, there are a lot of cases in the literature of people suddenly developing extreme perception after a head injury. In The Man Who Mistook His Wife for a Hat Oliver Sacks has a story about a medical student who was taking a lot of recreational drugs (mostly amphetamines). One night he dreamed that he was a dog. When he woke up he found that all of a sudden, literally overnight, he had developed super-heightened perceptions, including a heightened sense of smell. When he went to his clinic, he recognized all twenty of his patients, before he saw them, purely by smell. He said he could smell their emotions, too, which is something people have always suspected dogs can do. He could recognize every single street and shop in New York City by smell, and he felt a strong impulse to sniff and touch things.13

  His color perception was much more vivid, too. All of a sudden he could see dozens of shades of colors he’d never seen before—dozens of shades of the color brown, for instance.

  This happened overnight. It’s not like he lost some other sense and then built up his sense of smell over time to compensate. He dreamed he was a dog and the next morning woke up able to smell things like a dog. The actor Christopher Reeve had a similar experience right after his accident. All of a sudden he had an incredibly heightened sense of smell.

  The other important thing to know about this guy is that he hadn’t had any big brain injury that anyone knew about. Dr. Sacks assumes that the heavy drug usage was probably the cause, but there’s no way of knowing. The student continued to function in medical school just fine, and his vision and sense of smell went back to normal about three weeks later. Of course, some part of his brain could have been temporarily incapacitated, but if it was, there’s no obvious way that being able to smell people the way a dog smells people helped him compensate for whatever might have been wrong. The most likely explanation is that he always had an ability to smell like a dog and see fifty different shades of brown, but he just didn’t know it and couldn’t access it. Somehow his heavy amphetamine usage must have opened up the door to that part of his brain.

  My other reason for thinking everyone has the potential for extreme perception is the fact that animals have extreme perception, and people have animal brains. People use their animal brains all day long, but the difference is that people aren’t conscious of what’s in them. We’ll talk about this in the last chapter. A lot of what animals see normal people see, too, but normal people don’t know they’re seeing it. Instead, a normal person’s brain uses the detailed raw data of the world to form a generalized concept or schema, and that’s what reaches consciousness. Fifty shades of brown turn into just one unified color: brown. That’s why normal people see only what they expect to see—because they can’t consciously experience the raw data, only the schema their brains create out of the raw data.

  Normal people see and hear schemas, not raw sensory data.

  I can’t prove that humans are taking in the same things animals are, but we do have proof that humans are taking in way more sensory data than they realize. That’s one of the major findings of the inattentional blindness research. It’s not that normal people don’t see the lady dressed in a gorilla suit at all; it’s that their brains screen her out before she reaches consciousness.

  We know people see things they don’t know they see because of years of research into areas like implicit cognition and subliminal perception. Dr. Mack and Dr. Rock, who wrote Inattentional Blindness, adapted some of these studies for their inattentional blindness research. They’d do things like ask their subjects to tell them which arm of a cross that flashed onto a computer screen for about 200 milliseconds was longer. Then, on some of the trials, there’d be a word like “grace” or “flake” printed on the screen, too. Most people didn’t notice the word. They were paying attention to the cross, so they didn’t see it.14

  But Dr. Rock and Dr. Mack showed that many of them had seen the words unconsciously. Later on, when they gave subjects just the first three letters of the word—gra or fla—and asked them to complete the stem with any word that came to mind, 36 percent answered “grace” or “flake.” Only 4 percent of the control subjects—these were people who hadn’t been subliminally exposed to any words at all—came up with “grace” or “flake.” That’s a huge difference and can only mean that the subjects who were subliminally exposed to “grace” and “flake” really did see “grace” and “flake.” They just didn’t know it.

  So we know that people perceive lots more than they realize consciously. Drs. Rock and Mack say that inattentional blindness works at a high level of mental processing, meaning that your brain does a lot of processing before it allows something into consciousness. In a normal human brain sensory data comes in, your brain figures out what it is, and only then does it decide whether to tell you about it, depending on how important it is. A lot of processing has already taken place before a normal human becomes conscious of something in the environment. (Drs. Rock and Mack use the phrase high level to mean advanced processing, not necessarily higher levels of the brain. They don’t discuss neuropsychology, just cognitive psychology.)

  There are a few things that always do break through to consciousness. I mentioned that people almost always notice their names in the middle of a page of text no matter how hard they’re concentrating on something else; they will also notice a cartoon smiley face. But if you change the face just a tiny bit—turn the smile upside down so it’s a frown, for instance—people don’t see it. This is more evidence for the fact that your brain thoroughly processes sensory data before allowing it to become conscious. With the smiley face your brain has to have processed it to the level of knowing it’s a face and even that it’s a smiling face before it lets the face into conscious perception. Otherwise you’d see the frowny face as often as you saw the smiley face. It’s the same principle with your name. If your name is “Jack,” the word “Jack” will pop out at you in the middle of a page. But the letters “Jick” won’t. That means your brain processes the word “Jack” all the way up to the level of knowing that it’s your name before your brain admits “Jack” into consciousness.

  We don’t know why humans have inattentional blindness. Maybe inattentional blindness is the brain’s way of filtering out distractions. If you’re trying to watch a basketball game and a lady gorilla comes into view, your brain screens her out because she’s not supposed to be there, and she’s not relevant to what you’re trying to do, which is watch a game. Your nonconscious brain takes a look at the lady gorilla and decides she’s a distraction.

  Being able to filter out distractions is a good thing; just ask anyone who can’t filter things out, like a person with attention deficit hyperactivity disorder. It’s hard for humans to function intellectually when every little sensory detail in their environment keeps hijacking their attention. You go into information overload.

  But humans probably paid a price for developing the ability to filter out ladies wearing gorilla suits, which is that normal people can’t not filter out distractions. A normal brain automatically filters out irrelevant details, whether you want it to or not. You can’t just tell your brain: be sure and let me know if anything out of the ordinary pops up. It doesn’t work that way.
