The Rules of Contagion

by Adam Kucharski


  By studying potentially contagious behaviour, researchers are also uncovering some crucial differences between biological and social outbreaks. In the 1970s, sociologist Mark Granovetter suggested that information could spread further through acquaintances than through close friends. This was because friends would often have multiple links in common, making most transmission redundant. ‘If one tells a rumor to all his close friends, and they do likewise, many will hear the rumor a second and third time, since those linked by strong ties tend to share friends.’ He referred to the importance of acquaintances as the ‘strength of weak ties’: if you want access to new information, you may be more likely to get it through a casual contact than a close friend.[46]

  These long-distance links have become a central part of network science. As we’ve seen, ‘small-world’ connections can help biological and financial contagion jump from one part of a network to another. In some cases, these links may also save lives. There is a long-standing paradox in medicine: people who have a heart attack or stroke while surrounded by relatives take longer to get medical care. This may well be down to the structure of social networks. There’s evidence that close-knit groups of relatives tend to prefer a wait-and-see approach after witnessing a mild stroke, with nobody willing to contradict the dominant view. In contrast, ‘weak ties’ – like co-workers or non-relatives – can bring a more diverse set of perspectives, so flag up symptoms faster and call for help sooner.[47]

  Even so, the sort of network structure that amplifies disease transmission won’t always have the same effect on social contagion. Sociologist Damon Centola points to the example of HIV, which has spread widely through networks of sexual partners. If biological and social contagion work in the same way, ideas about preventing the disease should also have spread widely via these networks. And yet they have not. Something must be slowing the information down.

  During an infectious disease outbreak, infection typically spreads through a series of single encounters. If you get the infection, it will usually have come from a specific person.[48] Things aren’t always so simple for social behaviour. We might only start doing something after we’ve seen multiple other people doing it, in which case there is no single clear route of transmission. These behaviours are known as ‘complex contagions’, because transmission requires multiple exposures. For example, in Christakis and Fowler’s analysis of smoking, they noted that people were more likely to quit if lots of their contacts stopped as well. Researchers have also identified complex contagion in behaviours ranging from exercise and health habits to the uptake of innovations and political activism. Whereas a pathogen like HIV can spread through a single long-range contact, complex contagions need multiple people to transmit them, so can’t pass through single links. While small-world networks might help diseases spread, these same networks could limit the transmission of complex contagions.
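
  As a rough illustration of this difference, consider a hypothetical toy network: two tight clusters of three people joined by a single ‘weak tie’. In the sketch below, each person adopts a behaviour once enough of their neighbours have. If one exposure is enough (simple contagion), the behaviour crosses the bridge and reaches everyone; if it takes two exposures (complex contagion), it stalls in the first cluster, because the single link cannot carry it.

```python
# A toy sketch of simple versus complex contagion on a hypothetical network.
# Two tight clusters {0, 1, 2} and {3, 4, 5}, joined by the single weak tie 2-5.
network = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 5],
    3: [4, 5], 4: [3, 5], 5: [2, 3, 4],
}

def spread(network, seeds, threshold):
    """Let a node adopt once at least `threshold` of its neighbours have adopted."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, neighbours in network.items():
            if node not in adopted:
                exposures = sum(1 for n in neighbours if n in adopted)
                if exposures >= threshold:
                    adopted.add(node)
                    changed = True
    return adopted

seeds = {0, 1}
print(sorted(spread(network, seeds, threshold=1)))  # [0, 1, 2, 3, 4, 5]: crosses the weak tie
print(sorted(spread(network, seeds, threshold=2)))  # [0, 1, 2]: stalls at the single link
```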

  Why do complex contagions occur? Damon Centola and his colleague Michael Macy have proposed four processes that might explain what’s happening. First, there can be benefits to joining something that has existing participants. From social networks to protests, new ideas are often more appealing if more people have already adopted them. Second, multiple exposures can generate credibility: people are more likely to believe in something if they get confirmation from several sources. Third, ideas can depend on social legitimacy: knowing about something isn’t the same as seeing others acting – or not acting – on it. Take fire alarms. As well as signalling there might be a fire, alarms make it acceptable for everyone to leave the building. One classic 1968 experiment had students sit working in a room as it slowly filled with fake smoke.[49] If they were alone, they would generally respond; if they were with a group of studious actors, they would continue to work, waiting for someone else to react. Finally, we have the process of emotional amplification. People may be more likely to adopt certain ideas or behaviours amid the intensity of a social gathering: just think about the collective emotion that comes with something like a wedding or a music concert.

  The existence of complex contagions means we may need to re-evaluate what makes innovations spread. Centola has suggested that intuitive approaches for making things catch on may not work so well if people need multiple prompts to adopt an idea. To get innovation to spread in business, for example, it’s not enough to simply encourage more interactions within an organisation. For complex contagions to spread, interactions need to be clustered together in a way that allows social reinforcement of ideas; people may be more likely to adopt a new behaviour if they repeatedly see everyone in their team doing it. However, organisations can’t be too cliquey, otherwise new ideas won’t spread beyond a small group of people. There needs to be a balance in the network of interactions: as well as having local teams acting as incubators for ideas, there are benefits to having Pixar-style overlaps between groups to get innovations out to a wider audience.[50]

  The science of social contagion has come a long way in the past decade, but there is still much more to discover. Not least because it’s often difficult to establish whether something is contagious in the first place. In many cases, we can’t deliberately change people’s behaviour, so we have to rely on observational data, as Christakis and Fowler did with the Framingham study. However, there is another approach emerging. Researchers are increasingly turning to ‘natural experiments’ to examine social contagion.[51] Rather than imposing behavioural change, they instead wait for nature to do it for them. For example, a runner in Oregon might change their routine when the weather is bad; if their friend in California changes their behaviour too, it could suggest social contagion is responsible. When researchers at MIT looked at data from digital fitness trackers, which included a social network linking users, they found that the weather could indeed reveal patterns of contagion. However, some were more likely to catch the running bug than others. Over a five-year period, the behaviour of less active runners tended to influence more active runners, but not the other way around. This implies that keen runners don’t want to be outdone by their less energetic friends.

  Behavioural nudges like changes in weather are a useful tool for studying contagion, but they do have limits. A rainy day might alter someone’s running patterns, but it’s unlikely to affect other, more fundamental behaviours like their marital choices or political views. Dean Eckles points out there can be a big gap between what is easily changed and what we ideally want to study. ‘A lot of the behaviours we care the most about are not so easy to nudge people to do.’

  In November 2008, Californians voted to ban same-sex marriage. The result came as a shock to those who’d campaigned for marriage equality, especially as pre-vote polls had appeared to be in their favour. Explanations and excuses soon began to emerge. Dave Fleischer, director of the Los Angeles LGBT Center, noticed that several misconceptions about the result were becoming popular. One was that the people who voted for the ban must have hated the LGBT community. Fleischer disagreed with this idea. ‘The dictionary defines “hate” as extreme aversion or hostility,’ he wrote after the vote. ‘This does not describe most who voted against us.’[52]

  To find out why so many people were against same-sex marriage, the LGBT Center spent the next few years conducting thousands of face-to-face interviews. Canvassers used most of this time to listen to voters, a method known as ‘deep canvassing’.[53] They encouraged people to talk about their lives, and reflect on their own experiences of prejudice. As they conducted these interviews, the LGBT Center realised that deep canvassing wasn’t just providing information; it appeared to be changing voters’ attitudes. If so, this would make it a powerful canvassing method. But was it really as effective as it seemed?

  If people are rational, we might expect them to update their beliefs when presented with new information. In scientific research this approach is known as ‘Bayesian reasoning’. Named after eighteenth-century statistician Thomas Bayes, the idea is to treat knowledge as a belief that we have a certain level of confidence in. For example, suppose you are strongly considering marrying someone, having thought carefully about the relationship. In this situation, it would take a very good reason for you to change your mind. However, if you’re not totally sure about the relationship, you might be persuaded against marriage more easily. Something that might seem trivial to the infatuated may be enough to tip a wavering mind towards a break-up. The same logic applies to other situations. If you start with a firm belief, you’ll generally need strong evidence to overcome it; if you are unsure at first, it might not take much for you to change your opinion. Your belief after exposure to new information therefore depends on two things: the strength of your initial belief and the strength of the new evidence.[54] This concept is at the heart of Bayesian reasoning – and much of modern statistics.
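
  As a rough sketch of the arithmetic involved, with purely hypothetical numbers: Bayes’ rule multiplies the prior odds of a belief by a ‘likelihood ratio’ measuring how strongly the new evidence favours that belief. A firm belief barely moves under weak contrary evidence, but strong evidence can still overturn it, while a wavering belief shifts far more easily.

```python
# A toy sketch of Bayesian updating with hypothetical numbers.
def update(prior, likelihood_ratio):
    """Combine an initial belief with new evidence.

    prior: initial probability given to the belief (between 0 and 1).
    likelihood_ratio: how much more likely the new evidence is if the belief
    is true than if it is false (>1 supports the belief, <1 counts against it).
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

print(round(update(0.95, 0.5), 2))   # firm belief, weak contrary evidence: 0.90
print(round(update(0.95, 0.01), 2))  # firm belief, very strong contrary evidence: 0.16
print(round(update(0.55, 0.5), 2))   # wavering belief, the same weak evidence: 0.38
```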

  Yet there are suggestions that people don’t absorb information in this way, especially if it goes against their existing views. In 2008, political scientists Brendan Nyhan and Jason Reifler proposed that persuasion can suffer from a ‘backfire effect’. They’d presented people with information that conflicted with their political ideology, such as the lack of weapons of mass destruction in Iraq before the 2003 war, or the decline in revenues following President Bush’s tax cuts. But it didn’t seem to convince many of them. Worse, some people appeared to become more confident in their existing beliefs after seeing the new information.[55] Similar effects had come up in other psychological studies over the years. Experiments had tried to persuade people of one thing, only for them to end up believing something else.[56]

  If the backfire effect is common, it doesn’t bode well for canvassers hoping to convince people to change their minds about issues like same-sex marriage. The Los Angeles LGBT Center thought they had a method that worked, but it needed to be evaluated properly. In early 2013, Dave Fleischer had lunch with Donald Green, a political scientist at Columbia University. Green introduced Fleischer to Michael LaCour, a graduate student at UCLA, who agreed to run a scientific study testing the effectiveness of deep canvassing. The aim was to carry out a randomised controlled trial. After recruiting voters to participate in a series of surveys, LaCour would randomly split the group. Some would get visits from a canvasser; others, acting as a control group, would have conversations about recycling.

  What happened next would reveal a lot about how beliefs change, just not quite in the way we might expect. It started when LaCour reported back with some remarkable findings. His trial had shown that when interviewers used deep canvassing methods, there was a large increase in interviewees’ support for same-sex marriage on average. Even better, the idea often stuck, with the new belief still there months later. This belief was also contagious, spreading to interviewees’ housemates. LaCour and Green published the results in the journal Science in December 2014, attracting widespread media attention. It seemed to be a stunning piece of research, showing how a small action could have a massive influence.[57]

  Then a pair of graduate students at the University of California, Berkeley noticed something strange. David Broockman and Joshua Kalla had wanted to run their own study, building on LaCour’s impressive analysis. ‘The most important paper of the year. No doubt,’ Broockman had told a journalist after the Science paper was published. But when they looked at LaCour’s dataset, it seemed far too pristine; it was almost as if someone had simulated the data rather than collecting it.[58] In May 2015, the pair contacted Green with their concerns. When questioned, LaCour denied making up the data, but couldn’t produce the original files. A few days later, Green – who said he’d been unaware of the problems until that point – asked Science to retract the paper. It wasn’t clear exactly what had happened, but it was clear that LaCour hadn’t run the study he said he had. The scandal came as a huge disappointment to the Los Angeles LGBT Center. ‘It felt like a big punch to our collective gut,’ said Laura Gardiner, one of their organisers, after the problems emerged.[59]

  Media outlets quickly added corrections to their earlier stories, but perhaps journalists – and the scientific journal – should have been more sceptical in the first place. ‘What interests me is the repeated insistence on how unexpected and unprecedented this result was,’ wrote statistician Andrew Gelman after the paper was retracted. Gelman pointed out that this seems to happen a lot in psychological science. ‘People argue simultaneously that a result is completely surprising and that it makes complete sense.’[60] Although the backfire effect had been widely cited as a major hurdle to persuasion, here was a study claiming it could be cleared in one short conversation.

  The media has a strong appetite for concise yet counter-intuitive insights. This encourages researchers to publicise results that show how ‘one simple idea’ can explain everything. In some cases, the desire for surprising-yet-simple conclusions can lead apparent experts to contradict their own source of expertise. Antonio García Martínez, who spent two years working in Facebook’s ads team, recalled such a situation in his book Chaos Monkeys. García Martínez tells the story of a senior manager who built a reputation with pithy, memorable insights about social influence. Unfortunately for the manager, these claims were undermined by research from his company’s own data science team, whose rigorous analysis had shown something different.

  In reality, it’s very difficult to find simple laws that apply in all situations. If we have a promising theory, we therefore need to seek out examples that don’t fit. We need to work out where its limits are and what exceptions there might be, because even widely reported theories might not be as conclusive as they seem. Take the backfire effect. After reading about the idea, Thomas Wood and Ethan Porter, two graduate students at the University of Chicago, set out to see how common it might actually be. ‘Were the backfire effect to be observed across a population, the implications for democracy would be dire,’ they wrote.[61] Whereas Nyhan and Reifler had focused on three main misconceptions, Wood and Porter tested thirty-six beliefs across 8,100 participants. They found that although it can be tough to convince people they’re wrong, an attempted correction doesn’t necessarily make their existing belief stronger. In fact, only one correction backfired in the study: the false claim about weapons of mass destruction in Iraq. ‘By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments,’ they concluded.

  Even in their original study, Nyhan and Reifler found that the backfire effect is not guaranteed. During the 2004 presidential campaign, Democrats claimed that George Bush had banned stem cell research, whereas in reality, he’d limited funding for certain aspects of it.[62] When Nyhan and Reifler corrected this belief among liberals, the information was often ignored, but didn’t backfire. ‘The backfire effect finding got a lot of attention because it was so surprising,’ Nyhan later said.[63] ‘Encouragingly, it seems to be quite rare.’ Nyhan, Reifler, Wood and Porter have since teamed up to explore the topic further. For example, in 2019 they reported that providing fact-checks during Donald Trump’s election speeches had changed people’s beliefs about his specific claims, but not their overall opinion of the candidate.[64] It seems some aspects of people’s political beliefs are harder to alter than others. ‘We have a lot more to learn,’ Nyhan said.

  When examining beliefs, we also need to be careful about what we mean by a backfire. Nyhan has noted that there can be confusion between the backfire effect and a related psychological quirk known as ‘disconfirmation bias’.[65] This is when we give more scrutiny to arguments that contradict our existing beliefs than those that we agree with. Whereas the backfire effect implies that people ignore opposing arguments and strengthen their existing beliefs, disconfirmation bias simply means they tend to ignore arguments they view as weak.

  It might seem like a subtle difference, but it’s a crucial one. If the backfire effect is common, it implies that we can’t persuade people with conflicting opinions to change their stance. No matter how convincing our arguments, they will only retreat further into their beliefs. Debate becomes hopeless and evidence worthless. In contrast, if people suffer from disconfirmation bias, it means their views could change, given compelling enough arguments. This creates a more optimistic outlook. Persuading people may still be challenging, but it is worth trying.

  A lot rides on how we structure and present our arguments. In 2013, the UK legalised same-sex marriage. John Randall, then a Conservative MP, voted against the bill, a decision he later said he regretted. He wished he’d talked to one of his friends in Parliament beforehand, someone who – to the surprise of many – had voted in favour of marriage equality. ‘He said to me that it was something that wouldn’t affect him at all but would give great happiness to many people,’ Randall recalled in 2017. ‘That is an argument that I find it difficult to find fault with.’[66]

  Unfortunately, there is a major obstacle when it comes to finding a persuasive argument. If we have a strong opinion, Bayesian reasoning implies that we will struggle to judge the strength of arguments that support this existing view. Suppose you strongly believe in something. It could be anything from a political stance to an opinion about a film. If someone presents you with evidence that is consistent with your belief – regardless of whether this evidence is compelling or weak – you will go away with a similar opinion afterwards. Now imagine someone makes an argument against your belief. If that argument is weak, you won’t change your view, but if it is watertight, you might well do so. From a Bayesian point of view, we are generally better at judging the effect of arguments that we disagree with.[67]
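
  The same toy arithmetic as before, again with hypothetical numbers, shows why: with a firm prior, supporting evidence of very different strengths leaves the belief almost unchanged, whereas weak and strong opposing evidence lead to clearly different beliefs.

```python
# The same toy Bayesian update, applied to a firm prior of 0.95 (hypothetical numbers).
def update(prior, likelihood_ratio):
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Supporting evidence, whether modest or strong, leaves the belief much the same...
print(round(update(0.95, 2), 2))     # 0.97
print(round(update(0.95, 10), 2))    # 0.99
# ...whereas weak and strong opposing evidence are easy to tell apart by their effect.
print(round(update(0.95, 0.5), 2))   # 0.90
print(round(update(0.95, 0.05), 2))  # 0.49
```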

 
