Popular


by Mitch Prinstein


  Results like these help explain why popularity itself has become such a valued business commodity. Marketers know that we tend to follow the herd, so they rely on who or what is popular to influence our behaviors.

  Have you read an article online recently? You may have noticed that the headline is no longer the only thing placed at the top of the page meant to attract our attention. There’s often also a row of icons above each article, one for each social media outlet, with a running tally showing how often the piece has been emailed, “liked” on Facebook, tweeted, and so on. Lists of “trending topics” have become common, too—not just on social media, but on news outlets as well. This information is intended to pique our interest, as if the popularity of the story among others should make it more enticing to us. The tactic is not much different from a TV commercial that touts the “number one movie in America” or the “leading brand of headache medicine.” In each case, we are called to follow the herd.

  We are in turn prompted to tell others what we liked, or bought, or preferred, so that the herd can follow us as well. As we finish reading an article that impresses us, we are prodded to “like” it or email it to our friends. Likewise, when we buy products, we are asked to post the news to our Facebook feed. I can’t imagine that my friends would be interested in knowing that I just bought shaving cream, but I can understand why the manufacturer wants me to tell them I have. We associate popularity with quality.

  Why is this strategy so effective when, logically, it doesn’t make much sense? Why should I care about what everyone else is reading? I want to read what interests me, not what appeals to ten thousand complete strangers. I want to watch movies that match my own tastes, and I assume that my own body’s physiology is the most important factor to consider when choosing a medicine.

  One explanation is based on the idea that we feel essentially similar to others; thus, we assume that whatever the herd likes, we'd like, too. Or perhaps our natural proclivities toward popularity come from our sense of community and our desire to feel connected. If everyone is talking about a news item, or a movie, we don't want to be left out of the conversation.

  It’s interesting that despite all reason, we remain naturally tuned in to popularity. Yet this instinct doesn’t always help us. Sometimes our tendency to follow the herd can have serious consequences.

  Economists suggest that the lure of popularity has been responsible for some of the most peculiar and damaging trends in history. In 1841, Scottish journalist Charles MacKay wrote about the human impulse to follow the herd in his famous book Memoirs of Extraordinary Popular Delusions and the Madness of Crowds, in which he examined the tendency for an asset to gain in value well beyond its intrinsic worth simply because it is perceived as popular—a phenomenon we now refer to as a “market bubble.” In one chapter, he recounts the great fervor that ignited in the early seventeenth century over a particular tulip species that had been imported to Holland. This blossom offered no apparent superior value in its beauty, scent, or longevity when compared to indigenous species, but its popularity grew nonetheless. As this passion for the tulip spread from the aristocrats to the middle class, and eventually to those with scant means, the flower’s value soared, ultimately garnering huge sums for even a blossom weighing less than a gram. Reportedly, this type of tulip became so valuable that visitors to Holland were imprisoned if they unwittingly damaged a bulb. Such “overvaluation” of a commodity based on its popularity rather than on its actual worth is the basis for an unsustainable market, and is the same phenomenon that accounted for the stock market’s dot-com bubble of the 1990s.

  As more Dutch were afflicted by what MacKay called “Tulipomania” and the popularity of the flowers increased, prices continued to rise. The Dutch assumed that citizens from throughout Europe would share their enthusiasm, bringing increased value to their investments. Tulip dealers accordingly mortgaged their homes and spent their fortunes to purchase more bulbs, while businessmen began neglecting other profitable industries. Of course, the tulips never turned out to be worth anything near what people had paid for them, and when the flower’s price dropped dramatically, the collapse of the market bubble ultimately threatened the entire Dutch economy. MacKay concluded, “Men . . . think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, and one by one.”

  In my own research, I have found that the instinct to embrace the popular can lead to behaviors with even worse consequences. My work has been designed to understand how herd-following patterns begin when we are young, and specifically just how far youth will go to be like one another. What will adolescents do when they are told that their popular peers endorse behaviors that are dangerous or illegal, and how will they react when they are asked to be mean toward one another, even though they know it is morally wrong to do so?

  In one study, my former colleague and now Stanford psychologist Geoff Cohen and I examined a range of risky behaviors. We asked kids about drinking alcohol, having sex without a condom, smoking marijuana, and using harder drugs, like heroin and cocaine. We also questioned them about bullying, dangerous eating behaviors like bingeing and purging, and using hormones and drugs to change their body shape.

  In the United States, about one of every four adolescents has used alcohol before the age of fourteen, 25 percent have had five drinks in a row before high school graduation, one in five smoked pot before age fourteen, and 40 percent of teens report that they did not use a condom the last time they had sex. More than one out of ten say they have fasted just to try to look thinner. These are all remarkably risky behaviors that strongly predict which teens will become pregnant before graduating high school, grow up to have substance abuse problems, develop serious eating disorders, and even get cervical cancer.

  Of course, when we asked adolescents in our study whether they would be likely to engage in these behaviors, the vast majority told us that drugs are bad and that everyone should use a condom, be nice to others, and keep a healthy attitude about physical appearance.

  We then invited each of these same subjects to participate in a simulated online chat room along with three of their most popular fellow students. In fact, it was not a real chat room at all, but a computer program we developed using intricate graphics and timing to convince adolescents that they were talking live with the cool kids from their own school. The others in the chat room were phantoms, or “electronic confederates,” that we identified as highly popular by listing the first name and last initial of actual grade-mates in our participants’ school. Our deception worked. At the end of the experiment, when we revealed our procedures, our subjects told us they really believed they were online, and even reported excitedly that they believed they knew exactly which of their peers was taking part in their chat.

  In this counterfeit chat room, we again asked adolescents the same questions about risky behaviors, but this time, we had the fictitious peers take part first and report that they would be very likely to engage in each bad habit. Our participants were then asked again to respond to the same questions, first while they believed their peers were watching their responses, and then after they had ostensibly logged out of the chat room, so we could make sure they weren’t just showing off for the cool kids.

  What we found was that simply knowing that their popular peers would be likely to drink alcohol, smoke pot, or have unprotected sex was sufficient to change adolescents’ answers—and dramatically so. Suddenly our participants were far more apt to say they would engage in all these behaviors than they had been when they began our study. Even when they had logged off and were told that none of their peers were watching, our subjects continued to state that they would pursue those risky actions.

  We then took the study one step further. Rather than simply asking adolescents what they would do hypothetically, we gave our participants the chance to actually do something they shouldn't do in real life. After responding to a few simple questions about hobbies and interests within the chat room, during which we manipulated the responses of one confederate to seem a bit deviant from the others, we offered our participants the option to vote one of their "peers" off the experiment. They were instructed that for someone to be kicked out, the vote against him had to be unanimous among the others in the chat room. They were also told that the evicted participant would lose the chance to meet the others, and would not receive the reward we offered for completing the task.

  In each case, the subjects were asked to cast the deciding vote: the fate of their peer was in their hands, and they had the choice to be kind or to be mean. What they didn’t know was that, in this case, the individual they were voting off wasn’t a real person. They were also unaware that the other votes were fabricated.

  Once the subjects saw that their popular peers had voted against one of their own, eight out of ten of our participants voted to evict as well.

  Why are we so likely to follow the herd?

  On a sunny day in the year 60,000 BC in what is now southern Europe, a lone female enters a crowded cave where others are sitting down to eat their latest kill. But when she attempts to take her place at the rock where her hominin friends usually eat, she is shunned. It is the time of the full moon, and the others at her rock have a rule that on these days, females must wear fur clothes. This particular woman is wearing a wrap made of animal skin.

  “You can’t sit with us!” the women seem to grunt. Finally, with no place else to eat, she leaves the cave, walks a few dozen yards, and sits alone. Moments later, she is attacked by a woolly mammoth and is never heard from again.

  OK, this didn’t happen, either. But research suggests that there may be something about unpopularity back then that has a lot to do with the humans we have become today.

  Back in 60,000 BC, we were not the only humanlike species on the planet. Anthropologists believe that in addition to the beings who had migrated out of Africa and closely resembled our own species today, there were Neanderthals in the north, Denisovans in Asia, and even a small humanlike species called Homo floresiensis in Indonesia. Yet only we humans ultimately survived. We endured not because we were the strongest—in fact, the Neanderthals were a bit larger, with bigger teeth, and probably could have won any battle against relatively weak humans. It wasn’t because we had bigger brains, either.

  There is one quality that is unique among humans and has been credited as a fundamental factor in our evolutionary advantage. While some species became larger, or stronger, or able to withstand more severe temperatures, it was we humans who learned how to work together. Anthropology research reveals that, unlike other hominins, humans had the genes to form and comprehend complex vocal sounds. Language ability formed the basis for more sophisticated social interactions. Soon we became a species that could organize into groups and network with our peers in ways that were far superior to those of the others.

  Living as a herd offered many survival advantages. By working as a community and sharing tools, we could hunt more effectively. By sharing the spoils of the hunt, we could eat food while it remained fresh and safe to consume. Joining together in groups enabled us to warn and protect one another when predators threatened. We soon evolved to become acutely sensitive to social cues, and through the process of natural selection, our species came to favor those who were attuned to the herd, while those who remained solitary died out.

  It’s been thousands of years since we needed one another to survive our daily lives. Rarely does someone today venture out to Starbucks alone only to be attacked by a woolly mammoth. But the vestigial effects of our evolution as social creatures are still visible in many subtle ways. Have you ever wondered why yawns are contagious and the menstrual cycles of women living together synchronize? Some hypothesize that even these phenomena reflect our genetic programming as a social species. The herd worked most effectively when everyone was able to move as a single unit and stop together for resting, mating, or childbirth. Today, we still have instincts that make us become tired simultaneously or fertile at the same times.

  —

  So what happens if we don’t follow the herd and choose to remain alone, isolated, unpopular?

  Over the past several decades, scientists have demonstrated that being unpopular can actually be harmful. It’s not hard to imagine how solitude can lead to emotional difficulties. Those who are ostracized, alienated, bullied, or victimized are more likely to experience loneliness, low self-esteem, anxiety, and depression. But there is now also evidence that being unpopular may even have dire consequences for our physical health—to the point that it can kill us.

  Julianne Holt-Lunstad, a psychologist at Brigham Young University, recently conducted a meta-analysis—a study of studies—combining the data from 148 prior investigations. Each asked the same basic question: does being unpopular increase the risk of death? Collectively, these studies included 308,000 participants between the ages of six and eighty from all over the world. Each included two basic procedures. First, the investigators measured the size of participants’ social networks, the number of their friends, whether they lived alone, and the extent to which they participated in social activities. Then, they followed each participant for months, years, and even decades to track their mortality rate.

  The results revealed that being unpopular—isolated, disconnected, lonely—actually predicts mortality rates. But perhaps even more surprising is just how powerful these effects can be. People in the study who had larger networks of friends had a 50 percent increased chance of survival by the end of the study. It didn’t matter whether the participants were male or female, whether they had health problems to begin with, or where in the world they lived. Being disconnected from the herd substantially increased the risk of death.

  But not every kind of connection was equally important. And this finding was key, because it gives us a clue as to which type of popularity really matters.

  Simply living with someone, or having a spouse, was related to increased life expectancy, but only modestly. It was those people who actively participated in their social lives and had good-quality relationships who seemed to benefit the most. In other words, the advantage went to those whose relationships resembled the kind that the most likable people tend to have. Their chance of survival was 91 percent higher than that of those who were essentially alone: almost twice as many popular people as unpopular people were alive at the end point of the study. This is a highly significant finding. Comparing these figures to research on established health risks suggests that being unpopular increases our chance of death more strongly than obesity, physical inactivity, or binge drinking does. In fact, the only factor comparable to unpopularity as a health hazard is smoking!

  How could our social lives, or lack thereof, kill us? Could effects like these be accounted for by intentional self-harm? Perhaps those most socially isolated, ostracized, or friendless are especially likely to commit suicide?

  This is certainly true. In the United States, suicide is the second leading cause of death in adolescence and young adulthood; it remains one of the top ten causes of death until the age of sixty-five. One of the most common risk factors for suicide attempts is feeling lonely, like a burden to others, or like one doesn’t belong. Among adolescents in particular, ostracism from a peer group is an especially strong predictor of suicidal behavior. We are painfully reminded of this every time we hear of another teen who commits suicide after being tormented in school or online.

  But remarkably, intentional self-harm does not account for the link between unpopularity and mortality. In Holt-Lunstad’s meta-analysis, studies that measured death by suicide were excluded.

  In fact, recent evidence suggests that those who are socially disconnected are at risk for a wide range of physical health problems that can cause death. In 2016, Kathleen Mullan Harris, a sociologist at the University of North Carolina at Chapel Hill, examined how social connections might predict coronary artery disease, hypertension, cancer, and stroke. Her research accessed data from four large, nationally representative samples that collectively included about fifteen thousand Americans between the ages of twelve and eighty-five. As with Holt-Lunstad’s meta-analysis, Harris’s group examined social integration first, and then a range of physical health indices between five and twelve years later.

  What Harris found was that having friends or a romantic partner, socializing with neighbors, and volunteering substantially decreased the risk of physical illness. Those who were socially isolated when the study began were the most likely to develop high blood pressure. They were also the most likely to have high levels of C-reactive protein in their blood, a harbinger of inflammation-related health problems like rheumatoid arthritis, inflammatory bowel disease, and heart attacks. These effects held regardless of participants’ gender, race, educational attainment, income, history of smoking, alcohol use, physical activity, stress, or depression. Of course, it is impossible to ascertain whether unpopularity per se caused these health problems, but the results are among the most powerful to suggest that, even after accounting for so many other possible explanations, social isolation remains the strongest prognosticator of illness years later.

  We now know at least a few reasons why being popular, even as adults, is more important than we ever imagined. One simply has to do with the psychological effects of unpopularity. Being disliked means that we lack social support, so in times of stress, we have no one to turn to, to help us out of trouble. In one study of women with breast cancer, simply participating in a support group with other patients was a significant predictor of life expectancy, even after accounting for other possible factors that could have explained these effects.

 
