You Are Not So Smart


by David McRaney


  19

  The Public Goods Game

  THE MISCONCEPTION: We could create a system with no regulations where everyone would contribute to the good of society, everyone would benefit, and everyone would be happy.

  THE TRUTH: Without some form of regulation, slackers and cheaters will crash economic systems because people don’t want to feel like suckers.

  Before you hear about the public goods game, you need to understand the tragedy of the commons. The idea comes from a 1968 essay by ecologist Garrett Hardin that suggested you aren’t very good at sharing.

  Imagine a giant lake filled with fish. You and three others are the only people who know about it. You all agree to take just as many fish from the lake as you need to eat. As long as everyone takes just what he or she needs, the lake will stay full of fish.

  One day, you happen to notice one of the others has started taking more than he or she needs and is selling the extra fish in a nearby town. Eventually that person has a better fishing rod than you.

  What do you do?

  If you start overfishing too, you will also be able to get a better rod, maybe even a boat. Maybe you could partner up against the cheater. Maybe everyone will just start taking as many fish as desired. Maybe you could just tell the world about the lake. All of these scenarios will probably lead to the ruin of the common good. If you do nothing, the lake will still be able to support you and the other two, but the cheater wins. Anger over unfair situations is something you can’t help but feel.

  In situations like the imaginary lake above, in an effort not to fall behind, everyone loses. A big holiday meal, for example, can become a zero-sum game if everyone piles his or her plate high, but if everyone takes only what he or she needs, everyone wins. The tragedy of taking from a common good is that, over time, the common good will be depleted by just a tiny amount of greed. One misguided exploiter can crash the system. Greed is contagious.

  So what about a public good, a thing everyone contributes to instead of takes from? It seems the same is true. Cheaters can ruin the system, not by themselves, but because their infectious gluttony spreads as people catch on to being shortchanged. Unfortunately, research into human behavior shows you are not so smart when it comes to contributing to the public good.

  The public goods game works like this:

  A group of people sits around a table, and each person is given a few dollars. The group is told they can put as much money as they want in the community pot. An experimenter then doubles the pot, and everyone then gets an equal portion back.

  If it’s ten people and everyone gets $2, and everyone puts in that money, the pot would be $20. It gets doubled to $40 and divided by ten. Everyone gets back $4. The game proceeds in rounds, and you would think everyone would just put the maximum amount in the pot each time—but they don’t. Someone usually gets the gist of the game and realizes that one can put in very little, or nothing at all, and start making more money than everyone else.

  If everyone but you puts in $2, the pot would be $18. It gets doubled to $36 and everyone gets back $3.60—including you, the one who put nothing in at all.
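  If you’d like to see that arithmetic laid out, here is a minimal sketch in Python of a single round, assuming the ten-player, two-dollar, pot-doubling setup from the example above; the function and variable names are mine, for illustration only.

# A minimal sketch of one round of the public goods game described above.
# Illustrative names and setup; not taken from any particular study.

def public_goods_round(contributions, multiplier=2):
    # The pot is multiplied, then split evenly among all players.
    pot = sum(contributions) * multiplier
    return pot / len(contributions)

endowment = 2  # every player starts the round with $2

# Everyone cooperates: ten players each put in the full $2.
print(public_goods_round([endowment] * 10))        # 4.0 back per player

# One free rider contributes nothing; the other nine put in $2 each.
share = public_goods_round([0] + [endowment] * 9)  # 3.6 back per player
print(endowment + share)                           # the free rider keeps $2 and still gets $3.60
print(share)                                       # honest players get only $3.60 for their $2

  Run it and the free rider ends the round with $5.60, while every honest player ends with only $3.60, which is exactly the gap that tempts the next person to withhold.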

  In experiments where this game is played so everyone can see who puts in a fair share, the pot tends to grow for a while and then starts to shrink as people test the waters by withholding funds. The behavior spreads, because no one wants to be a chump, and eventually the economy grinds to a halt. If people are allowed the option of punishing cheaters, the cheating stops, and everyone wins. If, instead of punishment, people are given the choice to reward good players, the economy again crashes after a few rounds.

  The crazy thing about this game is how illogical it is to stop contributing just because someone in the group is free riding. If everyone else is still being a good citizen of the game, everyone will still win. The old emotional brain kicks in, however, when you see cheating. It’s an innate response that served your ancestors well. You know deep down that cheaters must be punished because it takes only one cheater to make the economy sputter out. You would rather lose the game than help someone who isn’t helping you.

  This game is sometimes used to illustrate how regulation is necessary to keep any sort of nonprofit public good alive. Streetlights would never get put along dark roads, and bridges would collapse if people weren’t forced to pay taxes. Purely logical creatures could be trusted to figure out life isn’t a zero-sum game, but you are not a purely logical creature. You will cheat if you think the system is cheating you.

  The urge to help others and discourage cheating is something that helped primates like you survive in small groups for millions of years, but when the system becomes gigantic and abstract like the budget for a nation or the welfare system for an entire state, it becomes difficult to make sense of the world through those old evolutionary behaviors.

  The tragedy of the commons can be used to make a case for private property in order to encourage you to take care of your piece of the world, but you might think not everyone is going to buy a fuel-efficient car and recycle plastic, so why should you?

  The public goods game suggests regulation through punishment discourages slackers.

  It isn’t that you don’t want to help; you just don’t want to help a cheater or do more work than a slacker—even if your not helping leads to ruining the game for you and everyone else.

  20

  The Ultimatum Game

  THE MISCONCEPTION: You choose to accept or refuse an offer based on logic.

  THE TRUTH: When it comes to making a deal, you base your decision on your status.

  Imagine you win $1 million in the lottery, but there’s a catch.

  This is a new experimental lottery in which the state says you must share your winnings with a stranger. You get to decide how the money is split, but the other person can reject your offer. If the other person rejects it, you both get nothing. You get only one chance, and the two of you will never see each other again. How much do you offer?

  Right about now the very thing that most makes you human has been activated. What separates you most from the rest of the animals is your complex social reasoning skills. Millions of variables are interplaying in your head, and you are running as many simulations as you can conjure to predict the future. You are imagining what the other person will do based on all your instincts and experiences.

  You now have ten seconds to decide.

  Oh no. What to do?

  The most logical thing to do would be to offer the stranger a small sum. How about $1,000? After all, if that person refuses, he or she gets nothing. Unfortunately for you, people don’t approach a situation like this with logic. When fairness is at stake, emotions take over. Somewhere deep in your brain, you can predict this, and like most people, you will offer the other person something closer to half.

  When this experiment is performed with real money and real people in the lab, most offers less than 20 percent of the total amount are rejected. In this scenario, the bare minimum you would have to offer is $200,000—even though you are the one who won the money.

  Give this problem to a computer, and it will take anything above zero. Something is better than nothing to a purely logical mind. Give this problem to a human, and you must deal with 3 million years of evolution.
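  To make that contrast concrete, here is a minimal Python sketch of a single ultimatum game, assuming a purely logical responder that accepts anything above zero and a human-like responder that rejects offers below the roughly 20 percent threshold described above; the threshold rule and the names are illustrative assumptions, not a model taken from the book.

# A hedged sketch of one ultimatum game: proposer offers a split,
# responder accepts or rejects, and a rejection leaves both with nothing.

def ultimatum(total, offer, responder_accepts):
    # If the offer is accepted, the proposer keeps the remainder.
    if responder_accepts(offer, total):
        return total - offer, offer
    return 0, 0

def logical_responder(offer, total):
    return offer > 0                      # something is better than nothing

def human_like_responder(offer, total):
    return offer >= 0.2 * total           # offers under roughly 20 percent tend to be rejected

total = 1_000_000

print(ultimatum(total, 1_000, logical_responder))       # (999000, 1000)
print(ultimatum(total, 1_000, human_like_responder))     # (0, 0) -- both walk away with nothing
print(ultimatum(total, 200_000, human_like_responder))   # (800000, 200000)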

  In the wild, we lived in small groups—usually fewer than 150 people. It was vitally important to understand where you ranked in such a group. Survival depended on your relationships and your standing. Reputation and status are more important than money to primates. People with lots of money gain high status, but if you were in the middle of a zombie apocalypse, the money would suddenly become paper again. Your status would quickly be determined by other factors.

  In the lottery situation, the money you offer to the other person is interpreted as your estimation of his or her status in the social hierarchy. If the other person accepts less than 20 percent, he or she will feel inferior and disrespected. The person will lose status in the eyes of others. No matter how large or small the amount, in experiments with real people, offering less than 20 percent ensures that both parties lose. You know this instinctively, and most people offer around half of their prize when the ultimatum game is played in a laboratory. When you know the other party could exact revenge on you for being unfair, it encourages the sort of altruism that allowed your ancestors to escape into civilization.

  This effect is even greater if the person making the final decision has low serotonin levels. If a person feels sad and unwanted, he or she will demand more money before accepting. That person’s default settings give him or her a sense of lower status, and thus the person is unwilling to lower it even further by accepting an unfair offer.

  When experimenters change the rules so the person making the offer gets to keep his or her share no matter what, just about everyone tries to screw the other person by offering around 10 percent.

  This situation comes up in life all the time. You decide when to ask for a raise, or make a move in the bar, or get up on stage and sing, based on your perceived status within a group. If it is low, you won’t risk further damage. If it is high, you expect better treatment.

  The promise of revenge is one way human beings ensure fairness, and you are precisely tuned to expect it. Your perceived status is part of the unconscious equation you work out when accepting, refusing, and making offers with other people. You are not so smart, so you are willing to get nothing if it ensures fair treatment in the future and a more secure place on the social ladder.

  21

  Subjective Validation

  THE MISCONCEPTION: You are skeptical of generalities.

  THE TRUTH: You are prone to believing vague statements and predictions are true, especially if they are positive and address you personally.

  Based on the data I’ve collected from the comments, e-mails, and other browsing information generated by the You Are Not So Smart blog, all cross-referenced with demographics information prepared in marketing studies for the placement of this book on shelves around the world, I have a pretty good idea of who you are.

  Here are my findings:

  You have a need for other people to like and admire you, and yet you tend to be critical of yourself. While you have some personality weaknesses, you are generally able to compensate for them. You have considerable unused capacity that you have not turned to your advantage. Disciplined and self-controlled on the outside, you tend to be worried and insecure on the inside. At times you have serious doubts as to whether you have made the right decision or done the right thing. You prefer a certain amount of change and variety and become dissatisfied when hemmed in by restrictions and limitations. You also pride yourself on being an independent thinker and do not accept others’ statements without satisfactory proof. But you have found it unwise to be too frank in revealing yourself to others. At times you are extroverted, affable, and sociable, while at other times you are introverted, wary, and reserved. Some of your aspirations tend to be rather unrealistic.

  Does this sound accurate? Does it describe you?

  It should. It describes everyone.

  All the above statements came from a 1948 experiment by Bertram R. Forer. He gave his students a personality test and told them each one had been personally assessed, but then gave everyone the same analysis.

  He asked his students to look over the statements and rate them for accuracy. On average, they rated the bogus analysis as 85 percent correct—as if it had been personally prepared to describe each one of them. The block of text above was actually a mishmash of lines from horoscopes collected by Forer for the experiment.

  The tendency to believe vague statements designed to appeal to just about anyone is called the Forer effect, and psychologists point to this phenomenon to explain why people fall for pseudoscience like biorhythms, iridology, and phrenology, or mysticism like astrology, numerology, and tarot cards. The Forer effect is part of a larger phenomenon psychologists refer to as subjective validation, which is a fancy way of saying you are far more vulnerable to suggestion when the subject of the conversation is you.

  Since you are always in your own head, thoughts about what it means to be you take up a lot of mental space. With some cultural variations, most people are keen on being individuals, unique and special persons whose hopes and dreams and fears and doubts are all their own. If you have the means, you personalize everything: your license plate, your ring tone, your computer’s desktop wallpaper, your bedroom’s walls.

  Everything around you says something about your personality. Cultivating an incomparable self either through consumption or creation is not something you take lightly. Yet somewhere between nature and nurture, we are all far more similar than we think. Genetically, you and your friends are almost identical. Those genes create the brain that generates the mind from which your thoughts spring. Thus, genetically, your mental life is as similar to everyone else’s as the feet in your shoes. Culturally, we differ. Our varying experiences in our varying environments shape us. Still, deep below, we are the same, and the failure to notice this can be exploited.

  If a statement is ambiguous and you think it addresses you directly, you will boil away the ambiguity by finding ways to match the information up with your own traits. You think back to all the time spent figuring out who you are, dividing your qualities from the qualities of others, and apply the same logic.

  Here’s an excerpt from a real horoscope at horoscopes.com: “At some point during the day, you might have the feeling that you aren’t working hard enough to keep the forward motion going, and you might feel panic rise. This could prove a good motivating factor, but you don’t need to push yourself harder than you’re going now. You’re on a roll and it’s likely to continue. Just pace yourself.”

  Now here’s another one from the same source on the same day but for a different sign: “Don’t be too hard on yourself if you’re dragging a little toward the end of the day. You’ll be able to recharge your batteries before tomorrow. In the evening, relax at home with a good book.”

  Seen straight on, horoscopes describe the sort of things we all experience, but pluck one from the bunch, turn it ever so slightly, and you will see it matching all the details of your life. If you believe you live under a sign, and the movement of the planets can divine your future, a general statement becomes specific.

  It is this hope that gives subjective validation its power. If you want the psychic to be real, or the sacred stones to forecast the unknown, you will find a way to believe them even when they falter. When you need something to be true, you will look for patterns; you connect the dots like the stars of a constellation. Your brain abhors disorder. You see faces in clouds and demons in bonfires. Those who claim the powers of divination hijack these natural human tendencies. They know they can depend on you to use subjective validation in the moment and confirmation bias afterward.

  The psychologist Ray Hyman has spent most of his life studying the art of deception. Before he entered the halls of science, he worked as a magician and then moved on to mentalism after discovering he could make more money reading palms than performing card tricks. The crazy thing about Hyman’s career as a palm reader is, like many psychics, over time he began to believe he actually did have psychic powers. The people who came to him were so satisfied, so bowled over, he thought he must have a real gift. Subjective validation cuts both ways.

  Hyman was using a technique called cold reading, where you start with the wide-angle lens of generalities and watch the other person for cues so you can constrict the focus down to what seems like a powerful insight into the other person’s soul. It works because people tend to ignore the little misses and focus on the hits. As he worked his way through college, another mentalist, Stanley Jaks, took Hyman aside and saved him from delusion by asking him to try something new—tell people the opposite of what he believed their palms revealed. The result? They were just as flabbergasted by his abilities, if not more so. Cold reading was powerful, but tossing it aside, he was still able to amaze. Hyman realized what he said didn’t matter as long as his presentation was good. The other person was doing all the work, tricking him- or herself, seeing the general as the specific just like in the Forer effect.

  Mediums and palm readers, those who speak for the dead or see into the beyond for cash, depend on subjective validation. Remember, your capacity to fool yourself is greater than the abilities of any conjurer, and conjurers come in many guises. You are a creature impelled to hope. As you attempt to make sense of the world, you focus on what falls into place and neglect that which doesn’t fit, and there is so much in life that does not fit.

  When you see a set of horoscopes, read all of them. When someone claims he or she can see into your heart, realize that all of our hearts are much the same.

  22

  Cult Indoctrination

  THE MISCONCEPTION: You are too smart to join a cult.

  THE TRUTH: Cults are populated by people just like you.

  Cults are a side effect of natural human tendencies. You have an innate desire to belong to a group and to hang out with interesting people. If you have ever admired someone you have never actually met—like a musician—you’ve experienced the seed of the cult phenomenon.

  The word “cult” is slippery, because from far away many organizations, institutions, and religions could be seen as cults. The line between groups and cults is blurry. That fuzzy line is why you are far more likely to end up in a cult than you think.

  The research on cults suggests you don’t usually join for any particular reason; you just sort of fall into them the way you fall into any social group. After all, when did you join your circle of friends? Your group of close friends has likely changed a great deal over the years, but have you made many active choices concerning who you hang out with other than avoiding the ones who are a pain in the ass?

 
