Games Primates Play: An Undercover Investigation of the Evolution and Economics of Human Relationships

by Dario Maestripieri


  It’s the same with paying taxes to the government or the state. We should all spontaneously pay taxes because it’s in our own interest to do so: taxpayer money is used to build roads and public schools, to fund scientific and medical research, and in many countries—which, thankfully, will soon also include the United States—to provide universal health care coverage. If tax paying is not enforced, however, and tax evaders are not punished, very few people will pay. In the United States, everyone has heard scary stories about being audited by the IRS and having to pay hefty fines or go to jail for failing to pay taxes. Some people try to evade taxes anyway—particularly if they make a lot of money and the stakes are high—but not nearly as many as in Italy, where enforcement is lax and no one is scared of getting caught and punished for tax evasion. As a result, tax evasion is rampant in Italy, most Italians are wealthy because they keep their tokens for themselves, the country is always bankrupt, and in the end that hurts everyone, just as a degraded environment does.

  There are many “tragedies of the commons” in human societies, but again, these situations are not unique to our species. For example, think of a situation in which several different parasite microorganisms grow in the same host—the “public resource.” The interest of each individual parasite would be to exploit the host as much as possible; if all parasites behave this way, however, the public resource is over-exploited, the host dies, and the parasites die along with it (or at least have to find a new host).

  The tragedy of the commons can be resolved not only by enforcing cooperation and punishing free riders—as is done with tax payment and tax evasion—but also by providing rewards to the individuals who spontaneously cooperate. A good reputation, for instance, can offset the cost of cooperation. To better understand why and how reputation can influence people’s tendency to cooperate in public goods situations, it is necessary to introduce the notion of indirect reciprocity. Evolutionary biologists distinguish between direct and indirect reciprocity. Direct reciprocity refers to a situation in which an individual altruistically helps another with the expectation that the beneficiary will later reciprocate. If reciprocation in fact occurs, both individuals benefit. In indirect reciprocity, an individual altruistically helps another, but the help is returned by a third individual, not by the original recipient of the help. Typically, all three individuals involved belong to the same group and have some interests in common, so that if indirect reciprocity becomes a common practice within the group, then all group members will benefit. Indirect reciprocity, however, can also work when help is offered to individuals outside one’s group (for example, by making a donation to a charity organization) because the donor gains—not directly through support received by a third party, but through an enhancement of his or her reputation. Thus, by helping others who are unable to reciprocate the help to the donor in the future, people build up a good reputation—or, in the jargon of game theorists, a “positive image score”—whereas refusing to help can damage their reputation.7
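  To make the idea of an “image score” concrete, here is a minimal simulation sketch. It is my own illustration, not a model taken from the book or from the studies it cites; the group size, the payoffs, and the rule “help anyone whose score is not negative” are arbitrary assumptions chosen only to show the bookkeeping.

import random

# Illustrative toy model of indirect reciprocity via public "image scores."
# All parameters are assumptions made for this sketch, not values from the book.
random.seed(1)
N, ROUNDS = 20, 4000                     # population size and number of donor-recipient encounters
BENEFIT, COST = 3, 1                     # help is worth more to the recipient than it costs the donor

helper = [i < N // 2 for i in range(N)]  # half the players help deserving recipients, half never help
score = [0] * N                          # each player's public image score
payoff = [0] * N

for _ in range(ROUNDS):
    donor, recipient = random.sample(range(N), 2)
    if helper[donor] and score[recipient] >= 0:
        payoff[donor] -= COST            # helping is costly to the donor now...
        payoff[recipient] += BENEFIT
        score[donor] += 1                # ...but raises the donor's image score
    elif score[recipient] >= 0:
        score[donor] -= 1                # refusing a deserving recipient damages the donor's reputation

mean = lambda xs: sum(xs) / len(xs)
print("average payoff, helpers:    ", round(mean([p for p, h in zip(payoff, helper) if h]), 1))
print("average payoff, non-helpers:", round(mean([p for p, h in zip(payoff, helper) if not h]), 1))

  Run as written, the helpers end up with higher average payoffs even though each act of helping is individually costly, because their high scores attract help from third parties rather than from the people they originally helped, which is the logic of indirect reciprocity described above.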

  Having a good or a bad reputation for generosity can mean the difference between good and bad business, or between political success and failure. For example, when Bill Gates was still the CEO of Microsoft and before he married Melinda, he rarely if ever made large donations to charities, despite often being at the top of Forbes magazine’s list of the richest men on earth. His reputation for being stingy probably didn’t help Microsoft’s business, although the software giant was so powerful and successful that, in the end, the reputational damage was largely inconsequential. After Bill married Melinda and their Bill and Melinda Gates Foundation started making large donations to charities, Bill’s reputation and Microsoft’s image improved considerably, and this probably had positive consequences for the business as well.

  In case you don’t find the Bill Gates example compelling, consider the experimental studies that have demonstrated how concerns about reputation can influence people’s willingness to contribute to public goods, and how a good reputation can translate into monetary or political gains. A few years ago, two economists, James Andreoni at the University of Wisconsin and Ragan Petrie at Georgia State University, had two hundred college students play a computerized public goods game under conditions that varied in their degree of anonymity. The students played in groups of five, and each player in a group was given twenty tokens, which could be invested in a private good that paid the investor two cents for each token, or in a public good that paid each member of the group one cent for every token invested in it. There were four experimental conditions. In the baseline condition, all five players in a group knew the total contributions of their group to the public good, but didn’t know who the other group members were or how much they had individually contributed. In the information condition, the five players knew exactly what each group member had contributed to the public good, but didn’t know who they were. In the photos condition, subjects saw photos of the other group members, but had no information on their individual contributions. Finally, in the information-and-photos condition, subjects saw photos and received information, so they knew who their group members were and how much each had contributed. The experiment showed that providing information and identification together (the information-and-photos condition) resulted in a 59 percent increase in giving to the public good relative to the baseline condition. Two other economists, Mari Rege and Kjetil Telle, later replicated these results in a study in which Norwegian students living in Oslo who had never met before played a single round of a public goods game. Thus, even strangers can be induced to invest more in public goods by concerns about reputation.8
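  To see why this design creates a genuine social dilemma, here is a back-of-the-envelope sketch of the payoffs. It is my reconstruction, assuming the standard public goods design in which every token placed in the public good pays one cent to each of the five group members; the exact parameters of the original study may differ.

# Payoff arithmetic for the public goods game described above (illustrative
# reconstruction; parameter values follow the description in the text).
GROUP_SIZE = 5
ENDOWMENT = 20       # tokens given to each player
PRIVATE_RATE = 2     # cents per token kept in the private good (paid to the owner only)
PUBLIC_RATE = 1      # cents per public token, paid to every member of the group

def earnings(own_contribution, others_contributions):
    """Cents earned by one player, given everyone's contributions to the public good."""
    assert len(others_contributions) == GROUP_SIZE - 1
    kept = ENDOWMENT - own_contribution
    total_public = own_contribution + sum(others_contributions)
    return kept * PRIVATE_RATE + total_public * PUBLIC_RATE

print(earnings(0, [0, 0, 0, 0]))       # everyone free-rides: each player earns 40 cents
print(earnings(20, [20, 20, 20, 20]))  # everyone contributes everything: each earns 100 cents
print(earnings(0, [20, 20, 20, 20]))   # a lone free-rider among full contributors earns 120 cents

  Keeping a token is always privately better (two cents instead of one), yet every player is richer when all contribute than when none do; that tension is exactly what the anonymity manipulations were designed to probe.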

  The effect of perceived observation on cooperation that Haley and Fessler demonstrated in the dyadic Dictator Game has also been found in public goods games. Harvard researchers Terence Burnham and Brian Hare had students play multiple rounds of a public goods game under conditions of anonymity; in one condition, however, the students used a computer whose screen displayed an image of Kismet, a robot built at the Massachusetts Institute of Technology that doesn’t look particularly human—except for its eyes. It turned out that players who perceived that they were being watched by Kismet contributed 29 percent more to the public good than did players facing a neutral computer screen.9

  Social psychologists believe that we care so much about reputation because we all continually seek approval and respect from others to maintain our self-esteem or to promote our social identity. In their view, the psychological reward of establishing a good reputation explains why people are willing to invest in it. This may well be true, but there may be deeper, more selfish incentives at work. Economists argue that humans invest in reputation in order to maximize their personal financial gains and minimize their losses, while evolutionary biologists think that animals do it in order to maximize their personal fitness gains and minimize their losses. (Here “financial” means “money” and “fitness” means “survival and reproduction.”) Essentially, a good reputation works like the spending limit on a credit card. With no reputation, we get no credit from others. As we build a reputation through acts of cooperation, others become correspondingly more willing to extend us credit for larger and larger amounts in future business transactions.

  Figure 5.3. Kismet, the robot built at MIT. Photo from Wikipedia.

  That a good reputation can result in financial or political gains has been demonstrated experimentally. For example, a series of studies by a Swiss biologist, Manfred Milinski, and his colleagues showed that reputation enhances cooperation in public goods games when these games are alternated with indirect reciprocity games in which players are indirectly rewarded for their generosity. Thus, somewhat like the cheating cleaner fish that gain a good reputation in one context and then benefit from it in another, a good reputation built through generosity in the public goods games results in gains in the subsequent indirect reciprocity games. However, as Milinski showed, if the indirect reciprocity games are eliminated, or if the players have different identities in the two types of games so that the reputation built in one game cannot be transferred to the other, then contributions to the public goods quickly disappear.10 Thus, people are aware of whether they will be recognized in a future social situation and use this information to invest in their reputation only if doing so is likely to result in tangible gains in the other context.

  This sounds pretty cynical, but again, it’s what the experiments show. One may be a little skeptical of these results, however, because they are all obtained from college students in artificial experimental conditions. Maybe people in the real world don’t act like college students earning a few bucks by volunteering to serve as subjects for an economics experiment.

  In the real world, as it turns out, revealing the identity and generosity of givers is important. Charities, for instance, often give their donors considerable opportunities to be identified, from building statues in their honor to publishing their names in a magazine or on a website. In addition, by offering donors premium gifts that vary with the contribution amount, fund-raisers allow donors to broadcast to others that they gave a certain amount. Comparisons between levels of donations are important for reputation and status-building among donors. Fund-raisers seem to know this: when they solicit contributions from a particular donor, they may disclose what others have already given and suggest a contribution amount that would allow the donor to be competitive with his or her peers. Charities also promote comparisons in generosity among donors—and therefore competition for reputation—by reporting gifts in categories. For example, museums and theaters list donors in their programs by categories such as “patrons,” “sponsors,” and “fellows,” based on the amount of their donations. Carefully constructed by the institution, these categorizations are most likely intended to encourage people to “round up” their donations into a higher category.

  Andreoni and Petrie have provided further experimental evidence that reputation plays an important role in donations. The two economists had students play a computerized game that mimicked charity donations. Players were given the option of remaining anonymous, but when they chose to have their identities revealed, they were assigned to different categories of generosity based on the size of their gifts. People contributed more when they were given the option of having their contributions announced. Category reporting also had a significant effect: it shifted gifts up to meet the lower bound of the next-higher category. Similarly, another experiment by Milinski and collaborators showed that donations made in public to UNICEF, a well-known world relief organization, resulted in both personal financial gain (the player-donors received more money from the members of their group) and enhanced political reputation (they were elected to represent the interests of their group).11 As evolutionary biologist Richard Alexander eloquently wrote in his book The Biology of Moral Systems, “In complex social systems with much reciprocity, being judged as attractive for reciprocal interactions may become an essential ingredient for success.”12

  In personal relationships, a person’s reputation is often based on direct observation of his or her behavior in previous interactions, while in public life reputation can be formed through well-advertised acts of cooperation or generosity. In both cases, transmission of reputation through third parties—gossip, in other words—can also be crucial. We all know that gossip can play a huge role in establishing or destroying a reputation, so the notion that gossip can influence one’s tendency to cooperate or be generous should not be surprising. Indeed, there are experiments that remove all doubt on this point. In a recent study conducted by psychologists Jared Piazza and Jesse Bering, people played the Dictator Game anonymously, but some player 1s were told that the player 2s would discuss the player 1s’ money-sharing decisions with a third party who knew their identities. The threat of gossip and concern about their reputation prompted these dictators to be more generous in sharing their money.13

  Given how widespread gossip is in human societies—studies investigating the content of conversation in college cafeterias and in tribal villages have reported that over 50 percent of it consists of gossip—it is clear that anyone who has business or political ambitions should make an effort to be the subject of positive rather than negative gossip.14 Cultivating a good reputation, however, can be an expensive investment, and doing so only makes sense if there is a good chance that it will translate into future gains. The game theory models predict that people will stop investing in building their own reputation as soon as they find out that no future gain is likely to result, and the experiments show that this is indeed the case.

  THE PUNISHMENT OF DEFECTORS

  Building a reputation through flamboyant acts of generosity, such as making a million-dollar donation, is not an option for most of us. More generally, securing a good reputation is not enough of an incentive for many people to invest in public goods. For these folks, contributions to public goods must be enforced by laws and threats of fines or jail time. No external factor, however, is as powerful (and as cost-effective) as the internal control that people can exert on their own behavior. When rules are internalized—that is, when people feel a sense of ownership and believe that obeying the rules is in their best interest—then they become the most efficient enforcers. People simultaneously function as their own informants, police officers, and judges to make sure they catch themselves if they break a rule and give themselves the appropriate punishment. Self-punishment can be harsh and painful—consider the practice of self-flagellation among some medieval Catholic monks, who whipped their own backs to punish themselves for impure thoughts or actions (a phenomenon brought to our attention by the albino monk assassin in The Da Vinci Code). People can also give themselves the death penalty by committing suicide out of guilt for something they did.

  Laws that force people to pay taxes are usually not internalized—no one commits suicide because they cheated the IRS—but religious and moral rules are. This explains why religion and morality are far more effective means of controlling people’s behavior than the laws and law enforcement agencies of democratic societies, or even the violence and intimidation of oppressive dictatorial regimes. Some people are better than others at internalizing rules, or find it easier to do it in some contexts than in others. When feelings of guilt don’t work to restrain people’s selfish behaviors and tendencies to cheat, then others lend a hand with what evolutionary biologist Robert Trivers calls “moralistic aggression.”15

  When people are caught defecting in cooperative interactions—whether they involve another individual or a whole group—others will punish them by condemning their behavior in public or by spreading negative gossip. By giving defectors a bad reputation, others inflict costs on them by undermining their viability as partners in future cooperation—whether in romantic or marital relationships, business partnerships, or political activity. That people are willing to go out of their way to punish defectors has been demonstrated by countless experiments involving the Prisoner’s Dilemma, the Dictator Game, public goods games, and other economic games involving cooperation and trust.16 But let’s turn away from the experiments to take a look at more concrete examples from everyday life.

  We are all aware of the damage that malicious gossip can do to someone’s social, financial, or political reputation. Malicious gossip is a form of punishment that allows the moralistic police to do potentially lethal damage to defectors without exposing themselves to the risk of retaliation. In some cases, the defector being punished is not even aware of the gossip. When things at work or at home take a bad turn for the defector, he or she may just blame it on bad luck or karma. Other forms of moralistic punishment—the most interesting—are better advertised.

  The purpose of moralistic aggression is to inform everyone that cheating has been noticed and is disapproved of. A mild form of this is horn-honking. For example, I have the aggressive driving style typical of many Italians. When I drive on the streets of California and fail to respect some traffic rules, other drivers honk at me even though they are not directly affected by my driving. But moralistic aggression against traffic rule transgressors such as myself is nothing compared to moralistic aggression against people who cheat on their spouses or in sports, business, or politics. To give just one example, some wives who have been cheated on by their husbands have paid exorbitant sums of money to put the cheating husband’s name or face on huge posters displayed in busy urban areas, to give him a bad reputation and make sure no other woman will pair up with him in the future.

  Punishment of individuals who fail to cooperate has been shown to exist in many animal societies in which cooperation is important. For example, rhesus macaques normally give calls to alert other group members when they discover a tree full of ripe fruit in the forest. Primatologist Marc Hauser reported that some macaques eat all of the fruit themselves without alerting the rest of the group, but that they are later attacked by the group if they get caught.17 I wouldn’t call this moralistic aggression—there is no morality among macaques—but the circumstances are quite similar to those of human moralistic aggression.

  People are afraid of moralistic aggression, and rightly so: the costs inflicted by others for failing to cooperate and for breaking the rules can be high. There are therefore two reasons why we are more likely to cooperate if we perceive that we are being watched or that our identity is known: in addition to building a good reputation for cooperation, which might bring tangible benefits from future investors, we also want to avoid being punished if we get caught cheating. Cheating someone who expects us to cooperate is always selfish and may also be immoral or unlawful. Defecting in a cooperative game—regardless of the nature of the game—also signifies something else: competition. When we cheat a partner or a group, we want our interests to prevail over theirs. We make a conscious choice to compete instead of cooperating. Just as cooperation has benefits—such as the brownie points in reputation we accumulate if our altruism is well advertised—so competition has costs, and these costs can be minimized or avoided altogether if our competitive/selfish behavior is hidden under the blanket of anonymity. And just as people who choose to cooperate like to be in the spotlight—so that everyone can see, appreciate, and, they hope, eventually reward their acts—people who choose to defect and hurt rather than help others prefer to operate in darkness.

 
