You May Also Like

by Tom Vanderbilt


  When a similar experiment was performed with preschool children, however, the kids “tended to re-create the actions they observed without appearing to consider the causal efficiency of their behaviour.” It is not as if the kids could not figure out cause and effect or that opening the box was too complex (for they seemed to imitate closely even when the task was made easier). Rather, suggested Horner and Whiten, the children seemed to focus more on the model than the task, even when that model was not showing them the easiest way to open the box. To ape is to be human.

  —

  If you are the parent of a small child, as I am, you probably do not need an experiment to inform you of children’s tendency to imitate. One day, I asked my daughter why her pant legs were pushed up slightly. Because her friend Madeline’s were, she told me. “Did you like the way it looked, or is it because you like your friend?” I asked. The question seemed to confuse her, and I sensed she wanted to say, “Both,” without being able to disentangle the causes. It just seemed something worth copying, for whatever reason.

  Ironically, the things that are often the least functional—like small variations in fashion—are the ones we seem to most want to copy. This is precisely because, the sociologist Georg Simmel suggested over a century ago, “they are independent of the vital motives of human action.” Minor fashion gradations draw their great power from this very lack of meaning, as well as from the relatively low costs of switching. As noted by Adam Smith, “The modes of furniture change less rapidly than those of dress; because furniture is commonly more durable.”

  But imitation is going on everywhere. Recall the preschool experiments mentioned in chapter 1; children’s food choices depended on what the other kids at their table were eating. Humans seem programmed to learn socially, as if in the face of uncertainty we instinctively rely on what others are doing. So powerful is this instinct that we not only look to others to see what to do but choose to do the things that others are looking at. In a study conducted by Henrich and other researchers at the University of British Columbia, children watched videos of adult “models” consuming food. Some models had bystanders watching them; others had bystanders looking away. When later asked what food they would prefer, children were more likely to choose the food eaten by the model who was watched by others. “When environmental cues are not of sufficiently high quality,” write Henrich and Robert Boyd, “individuals imitate.”

  Think of the psychologist Stanley Milgram’s famous New York City street corner experiment in which he had people look up toward a building—at nothing. The more people who were doing it, the more others stopped to look. And why not? How could there not be something valuable in what so many others were doing?*2

  —

  But if social learning is so easy and efficient, if all this imitation is such a good way to ensure the survival of our genes, it raises the question of why anyone does anything different to begin with. Or indeed why someone, like Spyke, might abandon an innovation. It is a question asked of evolution itself: Why is there so much stuff for natural selection to sift through? Survival of the fittest, as the biologist Hugo de Vries pointed out, does not explain “arrival of the fittest.” Jørn Utzon could have turned in a more traditional opera house design; the Impressionists could have played more to the tastes of the current market. The artist or innovator who was attacked in his day seems like some kind of genetic altruist, sacrificing his own immediate fitness for some future payoff at the level of the group.

  Boyd and Richerson suggest there is an optimal balance between social and individual learning in any group. Too many social learners, and the ability to innovate is lost: People know how to catch that one fish because they learned it from the smart elder, but what happens when that fish dies out? Too few social learners, and people might be so busy trying to learn things on their own that the society does not thrive; while people were busily inventing their own better bow and arrow, someone forgot to actually get food.
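
  The logic of this trade-off is easy to see in a toy simulation. The sketch below is a minimal illustration in the spirit of such social-learning models, not Boyd and Richerson’s actual formulation; the population size, payoff scheme, and rate of environmental change are all invented parameters:

```python
import random

def simulate(frac_social, generations=2000, pop_size=200,
             change_prob=0.05, learning_cost=0.2, seed=0):
    """Toy model: individual learners pay a cost to track a changing
    environment; social learners copy others for free but risk copying
    behaviour the environment has already made obsolete."""
    rng = random.Random(seed)
    env = 0                        # the currently "correct" behaviour
    behaviours = [env] * pop_size  # everyone starts out correct
    n_social = int(frac_social * pop_size)
    total_payoff = 0.0
    for _ in range(generations):
        if rng.random() < change_prob:
            env += 1               # environment shifts; the old answer is obsolete
        for i in range(pop_size):
            if i < n_social:       # social learner: imitate a random individual
                behaviours[i] = behaviours[rng.randrange(pop_size)]
                total_payoff += 1.0 if behaviours[i] == env else 0.0
            else:                  # individual learner: work it out, at a cost
                behaviours[i] = env
                total_payoff += 1.0 - learning_cost
    return total_payoff / (generations * pop_size)

for frac in (0.0, 0.5, 0.9, 0.99, 1.0):
    print(f"{frac:4.0%} social learners -> mean payoff {simulate(frac):.3f}")
```

  With no social learners, everyone pays the cost of figuring things out alone; with only social learners, the population never recovers once the environment shifts (the one elder who knew how to catch the fish is gone); the best average payoffs fall somewhere in between.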

  Perhaps some ingrained sense of the evolutionary utility of this differentiation explains why humans—particularly the “WEIRD” ones*3—are so torn between wanting to belong to a group and wanting to be distinct individuals. Let us call it conformist distinction. People want to feel that their tastes are not unique, yet they feel an “anxiety” when told they are exactly like another person. Think of the giddy discomfort you feel when a co-worker shows up wearing a similar outfit. The inevitable joke: “Did you guys coordinate your wardrobe this morning?” We seek some happy medium, like the Miss America contestant in Woody Allen’s Bananas who responds to a reporter’s question, “Differences of opinion should be tolerated, but not when they’re too different.”

  Under a theory called optimal distinctiveness, people affiliate with groups in ways that let them feel as if they belong and yet are apart (this can be felt in something as simple as ordering food at a group dinner). If all we did was conform, there would be no taste; nor would there be taste if no one conformed. We conform locally and differentiate globally. The psychologists Matthew Hornsey and Jolanda Jetten have identified the ways we try to do that. One is to select the right-sized group or, if the group is too large, choose a subgroup. Be not just a Democrat but a centrist Democrat. Do not like just the Beatles; be a fan of John’s.

  Another strategy of conformist distinction is dubbed superior conformity of the self. You can show your individuality by, paradoxically, showing how much more you conform to the norms of a group than someone else: for example, “I’m more punk/country/Republican/vegan [insert your group here] than you.” In a study of people with body piercings, the ones who identified most with the group—the most conformist—were precisely those who wanted to be as distinct as they could be from the mainstream.

  When distinguishing yourself from the mainstream becomes too exhausting, you can always just ape some perceived version of the mainstream. This was the premise behind the “normcore” antifashion trend, in which once energetically fashionable people were said to be downshifting, out of sheer fatigue, into humdrum New Balance sneakers and unremarkable denim. Normcore was more conceptual art project than business case study, but one whose premise—“the most different thing to do is to reject being different altogether,” ran the manifesto—seemed so plausible it was practically wish fulfilled into existence by a media that feasts upon novelty as Saturn did his son. As new as normcore seemed, Georg Simmel was talking about it a century ago: “If obedience to fashion consists in imitation of an example, conscious neglect of fashion represents similar imitation, but under an inverse sign.”

  And so back to Spyke. When he felt his drive for individuality (which he shared with others who were like him) threatened by someone from outside the group, he moved on. All the things he felt were threatened—the chin beard, the shell art—and that he was willing to walk away from were, of course, nonfunctional. As Jonah Berger and Chip Heath point out, we signal our identity only in certain domains: Spyke is not likely to change his brand of toilet paper or toothbrush just because he learns it is shared by his nemesis. When everyone listened to records on vinyl, they were a commodity material that allowed one to listen to music; it was not until they were nearly driven to extinction as a technology that they became a way to signal one’s identity—and as I write, there are stirrings of a “cassette revival.”

  In a revealing experiment conducted at Stanford University, Berger and Heath sold Lance Armstrong Foundation Livestrong wristbands (at a time when they were becoming increasingly popular) in a “target” dorm. The next week, they sold them in a dorm known for being somewhat “geeky.” A week later, the number of target dorm band wearers dropped by 32 percent. It was not that people from the target dorm disliked the geeks—or so they said—it was that they thought they were not like them. And so the yellow piece of rubber, worn for a good cause, became a vessel of identity signaling, of taste. The only way the target group could avoid being symbolically linked with the geeks was to “abandon” the taste and move on to something else.
  As much as a search for novelty, new tastes can be a conscious rejection of what has come before—and a distancing from those now enjoying that taste. “I liked that band before they got big,” goes the common refrain.

  The anthropologist Richard Wilk notes that because it is much easier to signal likes in public than dislikes, “this might help explain why consumption is often conspicuous, while avoidance and taboo is usually more subtle and subdued.” When you see someone coming out of a butcher’s, Wilk notes, you can be sure she likes meat. When you see someone buying vegetables, however, she is not necessarily signaling she does not like meat.

  Disliking is arguably more of a force in forming social cohesion than liking. As the historian John Mullan noted, one of the first references to “good taste” (not of the food kind) in England, in William Congreve’s 1693 play The Double-Dealer, “is saying someone hasn’t got it.” Shared group dislikes have hugely influenced the history of art, as E. H. Gombrich pointed out: “Most movements in art erect some new taboo, some negative principle,” based on the “principle of exclusion.” From Impressionism to punk rock, artists have set themselves against some prior artistic status quo. The Dadaists simply took this to its extreme, declaring themselves “against everything.”

  What our tastes “say about us” is mostly that we want to be like other people whom we like and who have those tastes—up to a point—and unlike others who have other tastes. This is where the idea of “conformist transmission,” of simply socially learning what everyone else is doing, gets complicated. Sometimes we learn what others are doing and then stop doing that thing ourselves. Like Dr. Seuss’s Sneetches, we “counter-imitate.”

  Then there is the question of whether we are actually conscious of picking up a behavior from someone else. When someone knows he is being influenced by another and that other person knows it too, that is persuasion; when someone is unaware he is being influenced, and the influencer is unaware of his influence, that is contagion. In taste, we are rarely presumed to be picking up things randomly. Through “prestige bias,” for example, we learn from people who are deemed socially significant. The classic explanation in sociology was always trickle-down: Upper-class people embraced some taste, people lower down followed, then upper-class people rejected the taste and embraced some new taste. “Naturally the lower classes look and strive towards the upper,” wrote Simmel, as if citing a biological law.

  But it does not always work so neatly. Consider the use in English of the “quotative ‘like,’ ” that now ubiquitous tendency to say something along the lines of “I was like, ‘No way.’ ” This conquered the language via young middle-class girls (hardly Bourdieu’s cultural elite). In culture, the omnivores, as discussed in chapter 3, routinely go “down” in their listening. A food like lobster has ping-ponged multiple times in history, between aspirational upper-class treat and a sign of “poverty and degradation.” Then there is the nettlesome problem Bourdieu left off the table: Even among similar social classes, tastes will diverge. What drives that?

  Tastes can change when people aspire to be different from other people; they can change when we are trying to be like other people. Groups “transmit” tastes to other groups, but tastes themselves can help create groups. Small, seemingly trivial differences—what sort of coffee one drinks—become “real” points of cultural contention. The more people who have access to what is said to be proper taste, the finer those gradations become. Witness the varieties of “distinction” now available in things that were once rather homogeneous commodities, like coffee and blue jeans; who knew what “single origin” or “selvage” was a few decades ago? There is an ebb and flow of conformity and differentiation and an almost paradoxical cycle: An individual, like Spyke in Portland, wants to be different. But in wanting to express that difference, he seeks out others who share those differences. He conforms to the group, but the conformists of that group, in being alike, increase their sense of difference from other groups, just as the Livestrong bracelet wearers took them off when they saw another group wearing them. The adoption of tastes is driven in part by this social jockeying, this learning and avoidance. But this is not the whole picture. Sometimes tastes change simply because of errors and randomness.

  ACCIDENTALLY FAMOUS: ON THE RANDOMNESS AND THE UNPREDICTABILITY OF TASTE

  In a small patch of clearing where power lines snake through a forest in the Berkshires, a team of researchers from the University of Massachusetts has been recording, over several decades, the songs of the chestnut-sided warbler, a small New World warbler with a jaunty yellow crown. The songs, as judged by Audubon’s Guide to North American Birds, are “rich and musical with an emphatic ending.” There are two general types of songs the birds sing: “accented” and “unaccented” (the former has a “loud and distinctive terminal downsweep syllable”; the latter does not). Accented songs are generally used to attract mates; indeed, male warblers, like husbands who “give up” on their appearance after courtship is concluded, largely eschew singing them once they have shacked up. Unaccented songs, meanwhile, are often deployed in male-on-male conflict.

  Looking back across their decades of warbler song recordings, the researchers found that the unaccented songs that seemed most popular with the warblers in 1988 were almost entirely gone by 1995, replaced by a whole new repertoire. Rather like the Billboard Hot 100 charts, the chestnut-sided warbler culture had in a rather short span moved on, musically, to a whole new set of “tastes.” What was going on? Why would novelty arise when the adaptive fitness of a species or an individual bird, the ability to pass on genes to the next generation, so often favors conformity in communication—singing the songs everyone knows, the way everyone knows them? Were male warblers engaging in impromptu song battles, like New York hip-hop DJs in the 1980s, trying to slay their opponent with their virtuosity, their clever turns of musical phrase?

  Bruce Byers, a biologist at the University of Massachusetts and the lead researcher on the study, thinks there is something more prosaic at work: The birds are simply getting the songs wrong. “Individuals within a species vary in the precision with which they can imitate,” he told me. “Just like people.” And perfect imitation, he noted, “presumably has some costs. You have to maintain the brainpower necessary to do precise imitations. So unless there is some huge benefit to offset those costs, you expect some slack, some slight discrepancies in the copy, as compared to the model the birds imitate.” As with a game of telephone, as the songs get passed down the line, “these slight variations accumulate rapidly enough so that songs turn over completely within a decade or so.”

  The accented, mate-attracting songs, by contrast, hardly changed at all. Byers suspects these are the songs where getting it right really counts. Females, as he has found, seem to prefer male birds who, like some avian Marvin Gaye, sing “more consistently and at higher pitch.” Having males singing the same songs makes it easier to tell who is doing it best; if you are a male (and you want your genes passed on), it makes sense to spend the extra energy to really nail that song.

  With the evolving, unaccented songs, it was not as if the birds were craving novelty or that some creative bird set out to invent a new style. Nor were they slavishly imitating the new song variants of some prestigious warbler. And it was not as if the songs changed overnight. In the fashion of Raymond Loewy’s “most advanced, yet acceptable” principle, it is likely that each new set of songs was recognizably similar to those that had come before, with a small twist. Because many of the unaccented songs are used less frequently, only dusted off against rivals, the birds are probably a bit rusty with them.

  The songs that disappear first, as you might imagine, are the ones that were rare to begin with. Birdsongs are rather like the aptly named tweets of Twitter: Memes that thrive among bird populations, just like the spread of hashtags on Twitter, require, as a base condition for survival, wider sharing (that is, more “followers”) and more frequent expression (that is, retweeting). Otherwise, they are likely to suffer “extinction by chance.”
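
  That turnover is easy to reproduce with a neutral “random copying” sketch, assuming nothing more than imperfect imitation. This is a generic drift model, not the Massachusetts team’s own analysis; the population size and error rate below are made-up parameters:

```python
import random
from collections import Counter

def drift(pop_size=100, years=10, error_rate=0.05, seed=1):
    """Neutral "random copying" model: each year every bird re-learns its
    song from a randomly chosen tutor, occasionally miscopying it into a
    brand-new variant. No variant is intrinsically better than another."""
    rng = random.Random(seed)
    songs = list(range(10)) * (pop_size // 10)  # ten equally common variants
    next_variant = 10                           # label for the next copy error
    snapshots = []
    for _ in range(years + 1):
        snapshots.append(Counter(songs).most_common(3))
        new_songs = []
        for _ in range(pop_size):
            song = rng.choice(songs)            # imitate a random tutor
            if rng.random() < error_rate:       # imperfect imitation
                song = next_variant
                next_variant += 1
            new_songs.append(song)
        songs = new_songs
    return snapshots

for year, top3 in enumerate(drift()):
    print(f"year {year:2d}: three most common variants {top3}")
```

  Rare variants vanish first, since a single season of unlucky copying can wipe out a song only a handful of birds sing; meanwhile the accumulating errors shuffle the “charts” within a decade, with no bird ever intending to innovate.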

  So error, and random copying, were driving changes in bird culture (while other elements stayed the same). In humans, we might think of the unchanging, accented songs as things like “core” beliefs: religion, morals, one’s sense of self. Because these are more important in the long-term evolutionary sense, we invest more energy in them. The unaccented songs, by contrast, are like fashion or preferences, subject to change precisely because it is not so important they stay the same, being generally less useful to our evolutionary success (think of the low success rate of online dating, which often relies heavily on pure statistical matching of easily conveyed information like favorite music or hobbies).

  For a tangible human example of what was happening with the birdsong, consider irregular verbs in English. Why have some been converted, over time, into “regularized” verbs? And why have some stayed irregular? As the data scientists Erez Aiden and Jean-Baptiste Michel note, we no longer say that something “throve” but that it “thrived.” Using a database of English texts, they found that the more often an irregular verb was used, the more likely it was to stay irregular. Why? Because the irregular verbs we hardly ever encounter are the ones whose irregular forms we are least likely to remember; hence we convert them, through error, into regular verbs.

  It is, they suggest, a process of cultural selection: “The more frequent a verb is, the more fit it is to survive.” No conspiracy sought to kill off “throve,” nor is “thrived” any more inherently appealing. “Thrived” thrived because people simply had trouble remembering the rarely used irregular form. People made mistakes, “thrived” got copied, more or less randomly, and, presto, over a few hundred years the past tense of “thrive” got changed into something new, much as the warblers’ songs did.
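
  The mechanism is simple enough to sketch in a few lines: treat a verb’s usage frequency as the chance that its irregular past tense is successfully recalled in each generation. The frequencies below are invented for illustration; they are not Aiden and Michel’s measurements:

```python
import random

def regularize(verbs, generations=30, seed=2):
    """Each generation, speakers must recall a verb's irregular past tense.
    The chance of successful recall rises with how often the verb is heard;
    one collective failure and the verb defaults to the regular -ed form."""
    rng = random.Random(seed)
    irregular = {verb for verb, _ in verbs}
    freq = dict(verbs)
    for _ in range(generations):
        for verb in list(irregular):
            if rng.random() > freq[verb]:  # irregular form forgotten
                irregular.discard(verb)    # the verb regularizes for good
    return irregular

# Made-up per-generation recall probabilities, standing in for frequency.
verbs = [("be", 0.999), ("go", 0.995), ("throw", 0.99),
         ("thrive", 0.7), ("chide", 0.6), ("wreak", 0.5)]
survivors = regularize(verbs)
for verb, f in verbs:
    fate = "still irregular" if verb in survivors else "regularized"
    print(f"{verb:7s} (recall probability {f}): {fate}")
```

  Common verbs like “be” and “go” usually keep their irregular forms, while a rarity like “thrive” almost always regularizes within a few dozen generations; no one decides this, the forgetting does it on its own.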

 
