The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life


by Robert Trivers


  Recent work shows a similar mental architecture in monkeys regarding in-groups and out-groups. When a test is performed on a monkey in which it responds visually to matched facial pictures of in-group and out-group members, corrected for degree of experience with them, there is a clear tendency to view the out-group member longer—a measure of concern and hostility. Likewise, a monkey will attach an out-group orientation to an object an out-group member is looking at and vice versa for an in-group member. Finally, male monkeys (but not female) more readily associate out-group members with pictures of spiders but in-group members with pictures of fruits. The beauty of this work is that the monkeys were migrating in and out of different groups at various times, so one could control exactly for degree of familiarity. In-group members, for example, tend to be more familiar, but independent of familiarity, they are preferred over out-group members. That males more readily associate out-group members with negative stimuli, and in-group with positive, is consistent with work on humans in which men typically are relatively more prejudiced against out-group than in-group members.

  The Biases of Power

  It has been said that power tends to corrupt and absolute power, absolutely. This usually refers to the fact that power permits the execution of ever more selfish strategies toward which one is then “corrupted.” But psychologists have shown that power corrupts our mental processes almost at once. When a feeling of power is induced in people, they are less likely to take others’ viewpoint and more likely to center their thinking on themselves. The result is a reduced ability to comprehend how others see, think, and feel. Power, among other things, induces blindness toward others.

  The basic methodology is to induce a temporary state of mind via a so-called prime, which can be conscious or unconscious and as short as a word or considerably more detailed, as in this case. The power prime consists of asking some people to write for five minutes about a situation in which they felt powerful, supplemented by having the subjects apportion candy among a group, while the low-power prime group writes about the opposite situation and is allowed only to say the amount of candy they hope to receive.

  This modest prime produced the following striking results. When the subjects were asked to snap their right-hand fingers five times in succession and quickly write the letter E on their foreheads, an unconscious bias was revealed. Those who had been primed to feel powerless were three times as likely to write the E so that others could read it, compared to those primed to feel powerful. This effect was equally strong for the two sexes. The basic shift in focus from other to self with power was confirmed in additional work. When compared with those with a neutral prime, those with the power prime were less able to discriminate among common human facial expressions associated with fear, anger, sadness, and happiness. Again, the sexes responded similarly to the power prime, but in general women are better at making the emotional discriminations, and men are more likely to be overconfident. In short, powerful men suffer multiple deficits in their ability to apprehend the world of others correctly, due to their power and their sex. And since, at the national level, it is powerful men who usually decide for war, they have an in-built bias in the wrong direction, less oriented toward others, less inclined to value their viewpoint, with, alas, often tragic effects all the way around (see Chapter 11).

  There must be a thousand examples of power inducing male blindness, but why not look at Winston Churchill? He experienced highs and lows in his life that were often nearly absolute. One moment he was the prime minister of the UK during World War II—one of the most powerful prime ministers ever—and the next moment he was an ex–prime minister with almost no political power at all. Similar reverses were associated with World War I. At the heights of his power, he was described as dictatorial, arrogant, and intolerant, the stuff of which tyrants are made; at low power, he was seen as introspective and humble.

  Moral Superiority

  Few variables are as important in our lives as our perceived moral status. Even more than attractiveness and competence, degree of morality is a variable of considerable importance in determining our value to others—thus it is easily subject to deceit and self-deception. Moral hypocrisy is a deep part of our nature: the tendency to judge others more harshly for the same moral infraction than we judge ourselves—or to do so for members of other groups compared to members of our own group. For example, I am very forgiving where my own actions are concerned. I will forgive myself in a heartbeat—and toss in some compassionate humor in the bargain—for a crime that I would roast anybody else for.

  Social psychologists have shown these effects with an interesting twist. When a person is placed under cognitive load (by having to memorize a string of numbers while making a moral evaluation), the individual does not express the usual bias toward self. But when the same evaluation is made absent cognitive load, a strong bias emerges in favor of seeing oneself acting more fairly than another individual doing the identical action. This suggests that built deeply in us is a mechanism that tries to make universally just evaluations, but that after the fact, “higher” faculties paint the matter in our favor. Why might it be advantageous for our psyches to be organized this way? The possession of an unbiased internal observer ought to give benefits in policing our own behavior, since only if we recognize our behavior correctly can we decide who is at fault in conflict with others.

  The Illusion of Control

  Humans (and many other animals) need predictability and control. Experiments show that occasionally administering electrical shocks at random creates much more anxiety (profuse sweating, high heart rate) than regular and predictable punishment. Certainty of risk is easier to bear than uncertainty. Controlling events gives greater certainty. If you can control, to some degree, your frequency of being shocked, you feel better than if you have less control over less frequent shocks. Similar effects are well known for other animals, such as rats and pigeons.

  But there is also something called an illusion of control, in which we believe we have greater ability to affect outcomes than we actually do. For the stock market, we have no ability to affect its outcome by any of our actions, so any notion that we do must be an illusion. This was measured directly on actual stockbrokers. Scientists set up a computer screen with a line moving across it more or less like the stock market average, up and down—jagged—initially with a general bias downward but then recovering to go into positive territory, all while a subject sits in front of the screen, able to press a computer mouse, and told that pressing it “may” affect the progress of the line, up or down. In fact, the mouse is not connected to anything. Afterward, people are asked how much they thought they controlled the line’s movement, a measure of their “illusion of control.”

  A very interesting finding emerged when those taking the tests were stockbrokers (105 men and 2 women) whose firms provided data both on internal evaluation and on salaries paid. In both cases, those with a higher illusion of control did worse. They were evaluated by their superiors as being less productive and, more important, they earned less money. Cause and effect is not certain, of course. But if the direction of effect were such that poor performers responded to their own failure by asserting greater control over external events, then they would be blaming themselves more for failure than success, contrary to the well-documented human bias to rationalize away one’s failures. The alternative scenario then seems much more likely—that imagining one has greater control over events than one actually has leads to poorer performance: being a worse stockbroker. Note the absence of a social dimension here. One has no control over the movement of markets and scarcely much knowledge. There seems little possibility to fool your superiors along these lines when they can measure your success easily and directly. It is not at all clear that such an illusion in other situations may not give some social benefits—or even individual ones, as in prompting greater effort toward achieving actual control.

  It is interesting to note that lacking control increases something called illusory pattern recognition. That is, when individuals are induced to feel a lack of control, they tend to see meaningful patterns in random data, as if responding to their unfortunate lack of control by generating (false) coherence in data that would then give them greater control.

  The Construction of Biased Social Theory

  We all have social theories, that is, theories regarding our immediate social reality. We have a theory of our marriages. Husband and wife may agree, for example, that one party is a long-suffering altruist while the other is hopelessly selfish, but disagree over which is which. We each have a theory regarding our employment. Are we an exploited worker, underpaid and underappreciated for value given—and therefore fully justified in minimizing output while stealing everything that is not nailed down? We usually have a theory regarding our larger society as well. Are the wealthy unfairly increasing their share of resources at the expense of the rest of us (as has surely been happening) or are the wealthy living under an onerous system of taxation and regulation? Does democracy permit us to reassert our power at regular intervals or is it largely a sham exercise controlled by wealthy interests? Is the judicial system regularly biased against our kinds of people (African Americans, the poor, individuals versus corporations)? And so on. The capacity for these kinds of theories presumably evolved not only to help understand the world and to detect cheating and unfairness but also to persuade self and others of false reality, the better to benefit ourselves.

  The unconscious importance of biased social theory is revealed most vividly perhaps when an argument breaks out. Human arguments feel so effortless because by the time the arguing starts, the work has already been done. The argument may appear to burst forth spontaneously, with little or no preview, yet as it rolls along, two whole landscapes of information lie already organized, waiting only for the lightning of anger to reveal them. These landscapes have been organized with the help of unconscious forces designed to create biased social theory and, when needed, biased evidence to support them.

  Social theory inevitably embraces a complex set of facts, which may be only partially remembered and poorly organized, the better to construct a consistent, self-serving body of social theory. Contradictions may be far afield and difficult to detect. When Republicans in the US House of Representatives bemoaned what the Founding Fathers would have thought had they known a future president (Clinton) would have sex with an intern, the black American comedian Chris Rock replied that they were having sex not with their interns but with their slaves. This of course is an important function of humor—to expose and deflate hidden deceit and self-deception (see Chapter 8).

  False Personal Narratives

  We continually create false personal narratives. By enhancing ourselves and derogating others, we automatically create biased histories. We were more moral, more attractive, more “beneffective” to others than in fact we were. Recent evidence suggests that forty- to sixty-year-olds naturally push memories of negative moral actions roughly ten years deeper into their past than memories of positive ones. Likewise, there is a similar but not so pronounced bias regarding nonmoral actions that are positive or negative. An older self acted badly; a recent self acted better. I am conscious of this in my own life. When saying something personal, whether negative or positive, I displace it farther in the past, as if I am not revealing anything personal about my current self, but this is especially prominent for negative information—it was a former self acting that way.

  When people are asked to supply autobiographical accounts of being angered (victim) or angering someone else (perpetrator), a series of sharp differences emerges. The perpetrator usually describes angering someone else as meaningful and comprehensible, while victims tend to depict such an event as arbitrary, unnecessary, or incomprehensible. Victims often provide a long-term narrative, especially one emphasizing continuing harm and grievance, while perpetrators describe an arbitrary, isolated event with no lasting implications. One effect of this asymmetry between victim and perpetrator is that when the victim suppresses anger at a provocation, only to respond after an accumulation of slights, the perpetrator sees only the final, precipitating event and easily views the victim’s angry response as an unwarranted overreaction.

  There is also something called false internal narratives. An individual’s perception of his or her own ongoing motivation may be biased to conceal from others the true motivation. Consciously, a series of reasons may unfold to accompany actions so that when they are challenged, a convincing alternative explanation is at once available, complete with an internal scenario—“but I wasn’t thinking that at all; I was thinking . . . ”

  Unconscious Modules Devoted to Deception

  Over the years, I have discovered that I am an unconscious petty thief. I steal small objects from you while in your presence. I steal pens and pencils, lighters and matches, and other useful objects that are easy to pocket. I am completely unconscious of this while it is going on (as are you, most of the time) even though I have been doing it for more than forty years now. Perhaps because the trait is so unconscious, it appears to have a life of its own and often seems to act directly against my own narrow interests. I steal chalk from myself while lecturing and am left with no chalk with which to lecture (nor do I have a blackboard at home). I steal pens and pencils from my office, only to offload them at home—leaving me none the next day at the office—and so on. Recently I stole a Jamaican principal’s entire set of school keys from the desk between us. No use to me, high cost to him.

  In summary, there appears to be a little unconscious module in me devoted to petty thievery, sufficiently isolated to avoid interfering with ongoing activity (such as talking). I think of a little organism in me looking out for the matches, the ideal moment to seize them, the rhythm of the actual robbery, and so on. Of course, this organism will study the behavior of my victim but it will also devote time to my own behavior, in order best to integrate the thievery while not giving off any clues. Noteworthy features of this little module in my own life are that the behavior has changed little over my lifetime, and that increasing consciousness of the behavior after the fact has done little or nothing to increase consciousness prior to, during, or immediately after the behavior. The module also appears to misfire more often the older I get. Incidentally, the only time I can remember getting caught is by my brother, born a year after me—we were raised as twins. We each had an ability to read deception in the other that others in the family could not match. Once when we were both in our late forties, I began to pocket his pen, but he grabbed my hand halfway to my pocket and the pen was his again.

  I think I never pilfer from someone’s office when it is empty. I will see a choice pen and my hand moving toward it but will say, “Robert, that would be stealing,” and stop. Perhaps if I steal from you in front of your face, I believe you have given implicit approval. When I stole the principal’s keys, I was simultaneously handing him some minor repayment for a service performed and thinking I might be paying too much. Perhaps I said to myself, “Well this is for you, so this must be for me,” and he went along with the show.

  How many of these unconscious modules operate in our lives? The only way I know about this one is that my pockets fill up with contraband, and I get occasional questions from friends. Stealing ideas will not leave much evidence and is very common in academia. I once wrote a paper that borrowed heavily from a well-known book, a fact I had forgotten by the time I finished the paper. Only when I reread my copy of the book did I see where the ideas had come from—these sections were heavily underlined, with many marginal notations.

  It also seems certain that unconscious ploys to manipulate others in specific ways must be common. Specialized parts of ourselves look out for special opportunities in others. The value of this is precisely that two or more activities can go on simultaneously, with little or no interference. If an independent unconscious module studies for opportunities to steal or lie, it need not interfere (except slightly) with other, ongoing mental activities. We really have no idea how common this kind of activity may be.

  THE HALLMARKS OF SELF-DECEPTION

  In summary, the hallmark of self-deception in the service of deceit is the denial of deception, the unconscious running of selfish and deceitful ploys, the creation of a public persona as an altruist and a person “beneffective” in the lives of others, the creation of self-serving social theories and biased internal narratives of ongoing behavior, as well as false historical narratives of past behavior that hide true intention and causality. The symptom is a biased system of information flow, with the conscious mind devoted (in part) to constructing a false image and at the same time unaware of contravening behavior and evidence.

  Of course, it must usually be advantageous for the truth to be registered somewhere, so that mechanisms of self-deception are expected often to reside side-by-side with mechanisms for the correct apprehension of reality. The mind must be constructed in a very complex manner, repeatedly split into public and private portions, with complicated interactions between them.

  The general cost of self-deception is the misapprehension of reality, especially social, and an inefficient, fragmented mental system. As we shall learn, there are also important immune costs to self-deception, and there is something called imposed self-deception, in which an organism works unconsciously to further the interests of the organism inducing the self-deception costs on all sides, the worst of all possible worlds. At the same time, as we shall also see in Chapter 3, there is sufficient slack in the system for people to sometimes deceive themselves for direct advantage (even immunological). Before we turn to that, we will review the subject of deception in nature. There is an enormous literature on this subject and a few principles of genuine importance.
