
Humans: A Brief History of How We F*cked It All Up


by Tom Phillips


  We do this to an almost ridiculous degree: the piece of information we use as an anchor can be as explicitly unhelpful as a randomly generated number, and our brains will still latch on to it and skew our decisions toward it. This can get frankly worrying; in his book Thinking, Fast and Slow, Daniel Kahneman gives the example of a 2006 experiment on a group of highly experienced German judges. They were shown details of a court case in which a woman was found guilty of shoplifting. They were then asked to roll a pair of dice, which (unknown to them) were weighted to only ever produce a total of 3 or 9. Then they were asked if the woman should be sentenced to more or fewer months than the figure produced by the dice, before finally being asked to provide their best recommendation for how long her sentence should be.

  You can pretty much guess the result: the judges who rolled the higher figure on the dice sentenced her to much longer in prison than the ones who rolled low. On average, the roll of the dice would have seen the woman spend an extra three months in jail. This is not comforting.

  Availability, meanwhile, means that you make judgment calls on the basis of whatever information comes to mind easiest, rather than deeply considering all the possible information that might be available to you. And that means we’re hugely biased toward basing our worldview on stuff that’s happened most recently, or things that are particularly dramatic and memorable, while all the old, mundane stuff that’s probably a more accurate representation of everyday reality just sort of...fades away.

  It’s why sensational news stories about horrible crimes make us think that crime levels are higher than they are, while dry stories about falling crime statistics don’t have anywhere near as much impact in the opposite direction. It’s one reason why many people are more scared of plane crashes (rare, dramatic) than they are of car crashes (more common and as such a bit less exciting). And it’s why terrorism can produce instant knee-jerk responses from the public and politicians alike, while far more deadly but also more humdrum threats to life get brushed aside. More people were killed by lawn mowers than by terrorism in the USA in the decade between 2007 and 2017, but at the time of writing, the US government has yet to launch a War on Lawn Mowers. (Although, let’s be honest, given recent events you wouldn’t rule it out.)

  Working together, the anchoring heuristic and the availability heuristic are both really useful for making snap judgments in moments of crisis, or making all those small, everyday decisions that don’t have much impact. But if you want to make a more informed decision that takes into account all the complexity of the modern world, they can be a bit of a nightmare. Your brain will keep trying to slide back to its evidential comfort zone of whatever you heard first, or whatever comes to mind most quickly.

  They’re also part of the reason why we’re terrible at judging risk and correctly predicting which of the many options available to us is the one least likely to lead to catastrophe. We actually have two separate systems in our minds that help us judge the danger of something: the quick, instinctive one and a slow, considered one. The problems start when these conflict. One part of your brain is quietly saying, “I’ve analyzed all the evidence and it appears that Option 1 is the riskiest alternative,” while another part of your brain is shouting, “Yes, but Option 2 SEEMS scary.”

  Sure, you might think, but luckily we’re smarter than that. We can force our brains out of that comfort zone, can’t we? We can ignore the instinctive voice and amplify the considered voice, and so objectively consider our situation, right? Unfortunately, that doesn’t take confirmation bias into account.

  Before I began researching this book, I thought that confirmation bias was a major problem, and everything I’ve read since then convinces me that I was right. Which is exactly the problem: our brains hate finding out that they’re wrong. Confirmation bias is our annoying habit of zeroing in like a laser-guided missile on any scrap of evidence that supports what we already believe, and blithely ignoring the possibly much, much larger piles of evidence that suggest we might have been completely misguided. At its mildest, this helps explain why we prefer to get our news from an outlet that broadly agrees with our political views. In a more extreme instance, it’s why you can’t argue a conspiracy theorist out of their beliefs, because we cherry-pick the events that back up our version of reality and discard the ones that don’t.

  Again, this is quite helpful in some ways: the world is complex and messy and doesn’t reveal its rules to us in nice, simple PowerPoint presentations with easy-to-read bullet points. Coming up with any kind of mental model of the world means discarding useless information and focusing on the right clues. It’s just that working out what information is the stuff worth paying attention to is a bit of a cognitive crapshoot.

  It gets worse, though. Our brain’s resistance to the idea that it might have screwed up goes deeper. You’d think that once we’d made a decision, put it into action and actually seen it start to go horribly wrong, we would then at least become a bit better at changing our minds. Hahaha, no. There’s a thing called “choice-supportive bias,” which basically means that once we’ve committed to a course of action, we cling on to the idea that it was the right choice like a drowning sailor clinging to a plank. We even replay our memories of how and why we made that choice in an attempt to back ourselves up. In its mild form, this is why you end up hobbling around in agony after buying a new pair of shoes, insisting to everybody that “they make me look POWERFUL yet ALLURING.” In a stronger form, it is why government ministers continue to insist that the negotiations are going very well and a lot of progress has been made even as it becomes increasingly apparent that everything is going quite profoundly to shit. The choice has been made, so it must have been the right one, because we made it.

  There’s even some evidence that, in certain circumstances, the very act of telling people they’re wrong—even if you patiently show them the evidence that clearly demonstrates why this is the case—can actually make them believe the wrong thing more. Faced with what they perceive as opposition, they double down and entrench their beliefs even more strongly. This is why arguing with your racist uncle on Facebook, or deciding to go into journalism, may be an ultimately doomed venture that will only leave you despondent and make everybody else very angry with you.

  None of this means that people can never make sensible and well-informed decisions: very obviously they can. I mean, you’re reading this book, after all. Congratulations, you excellent choice-maker! It’s just that our brains often put a remarkably large number of obstacles in the way, all the time thinking they’re being helpful.

  Of course, if we’re bad at making decisions on our own, it can get even worse when we make decisions along with other people. We’re a social animal, and we reeeaaalllly don’t like the feeling of being the odd one out in a group. Which is why we frequently go against all our better instincts in an effort to fit in.

  That’s why we get groupthink—when the dominant idea in a group overwhelms all the others, dissent being dismissed or never voiced thanks to the social pressure to not be the one saying, “Uh, I’m not sure this is the greatest idea?” It’s also why we jump on bandwagons with wild abandon: the very act of seeing other people do or believe a thing increases our desire to match them, to be part of the crowd. When your mum asked you as a kid, “Oh, and if the other kids jumped off a bridge, would you do that, too?” the honest answer was, “Actually, there’s a pretty good chance, yeah.”

  And finally, there’s the fact that—bluntly—we think we’re pretty great when we are not, in fact, all that. Call it hubris, call it arrogance, call it being a bit of a pillock: research shows that we wildly overestimate our own competence. If you ask a group of students to predict how high up the class they’ll finish, the overwhelming majority will put themselves in the top 20 percent. Hardly any will say, “Oh yeah, I’m probably below average.” (The most common answer is actually outside the top 10 percent, but inside the top 20 percent, like a boastful version of ordering the second-cheapest glass of wine.)

  There’s a well-known cognitive problem called the Dunning–Kruger effect, and beyond sounding like an excellent name for a seventies prog rock band, it may be the patron saint of this book. First described by the psychologists David Dunning and Justin Kruger in their paper “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments,” it provides evidence for something that every one of us can recognize from our own lives. People who are actually good at any particular skill tend to be modest about their own abilities; meanwhile, people with no skills or talent in the field wildly overestimate their own competence at it. We literally don’t know enough about our own failings to recognize how bad we are at them. And so we blunder on, overconfident and blissfully optimistic about whatever it is we’re about to get horribly, horribly wrong. (As the rest of this book will show, of all the mistakes our brains make, “confidence” and “optimism” may well be the most dangerous.)

  All of these cognitive failures, piled one on top of the other in the form of society, lead us to make the same types of mistakes over and over again. Below are just a few of them: think of this like a spotter’s guide for the rest of the book.

  For starters, our desire to understand the world and discern patterns in it means that we spend quite a lot of our time convincing ourselves that the world works a certain way when in fact it absolutely doesn’t work like that. This can encompass everything from small personal superstitions to completely inaccurate scientific theories, and explains why we fall for propaganda and “fake news” so readily. The real fun starts when somebody manages to convince large numbers of other people that their pet theory about how the world works is true, which gives you religion and ideology and all those other Big Ideas that have proved so entertaining over the course of human history.

  Humans are also very bad at risk assessment and planning ahead. That’s partly because the art of prediction is notoriously difficult, especially if you’re trying to make predictions about a highly complex system, like the weather or financial markets or human society. But it’s also because once we’ve imagined a possible future that pleases us in some way (often because it fits with our preexisting beliefs), we’ll cheerfully ignore any contrary evidence and refuse to listen to anybody who suggests we might be wrong.

  One of the strongest motivators for this kind of wishful-thinking approach to planning is, of course, greed. The prospect of quick riches is one that’s guaranteed to make people lose all sense—it turns out we’re very bad at doing a cost–benefit analysis when the lure of the benefit is too strong. Not only will humans cross oceans and climb mountains for the (often fanciful) promise of wealth, we’ll also happily cast aside any notion of morality or propriety as we do it.

  Greed and selfishness also play into another common mistake: that of us collectively ruining things for everybody because we each wanted to get an advantage for ourselves. In social science, this category of screw-ups goes by names like the “social trap” or the “tragedy of the commons,” which is basically when a group of people all do things that on their own would be absolutely fine in the short term, but when lots of people do them together, it all goes horribly wrong in the long term. Often this means destroying a shared resource because we exploit it too much: for example, fishing an area of water so much the fish stocks can’t replenish themselves. There’s also a related concept in economics known as a “negative externality”—basically a transaction where both parties do well out of it, but there’s a cost that’s paid elsewhere, by someone who wasn’t even part of the transaction. Pollution is the classic example of that; if you buy something from a factory, that’s a win for both you and the manufacturer, but it might be a loss for the people who live downstream of the toxic waste the factory pours out.

  This group of related mistakes is behind a really large proportion of human fuck-ups—in systems from capitalism to cooperatives, and in issues that can be as vast as climate change or as small as splitting the bill in a restaurant. We know that it’s a bad idea for everyone to underplay how much they owe, but if everyone else is doing it, we don’t want to be the ones to lose out by not doing it. And so we shrug, and say, “Not my problem, mate.”

  Another one of our most common mistakes is prejudice: our tendency to split the world into “us” and “them” and quickly believe the worst thing possible about whoever “them” is. This is where all our cognitive biases get together and have a bigotry party: we divide the world up according to patterns that might not exist, we make snap judgments based on the first thing to come to mind, we cherry-pick evidence that backs up our beliefs, we desperately try to fit into groups and we confidently believe in our own superiority for no particularly good reason.

  (This is reflected in the book in more ways than one: while this is a history of humanity’s failures, with a couple of exceptions it’s really a history of failures by men; and more often than not, white men. That’s because they were often the only ones who were given the chance to fail. Generally it’s not a good thing for history books to focus almost exclusively on the deeds of old white guys, but given the subject matter of this one, I think it’s probably a fair cop.)

  And finally, our desire to fit in with a crowd means that we’re extremely prone to fads, crazes and manias of all kinds—brief, flaring obsessions that grip society and send rationality out of the window. These take many different forms. Some can be purely physical, like the inexplicable dancing manias that periodically gripped Europe for about seven centuries in the Middle Ages, in which hundreds of thousands of people would become infected with a sudden and irresistible urge to dance, sometimes to death.

  Other manias are financial, as our desire for money combines with our eagerness to be part of a crowd and to believe the stories of whatever get-rich-quick scheme is going around. (In London in 1720, there was such a frenzy of interest in investing in the South Sea that one group of chancers managed to sell stock described as “a company for carrying out an undertaking of great advantage, but nobody to know what it is.”) This is how we get financial bubbles—when the perceived value of something far outstrips its actual value. People start investing in the thing not because they necessarily think it has any intrinsic worth, but simply because as long as enough other people think it’s worth something, you can still make money. Of course, eventually reality kicks back in, a lot of people lose a lot of money and sometimes the entire economy goes down the pan.

  Yet other manias are mass panics, often founded on rumors that play on our fears. That’s why witch hunts in one form or another have happened at some point in history in virtually every culture around the world (an estimated 50,000 people died across Europe during the witch manias that lasted from the sixteenth to the eighteenth centuries).

  These are just some of the mistakes that recur with wearying predictability throughout the history of human civilization. But, of course, before we could start making them in earnest, we had to invent civilization first.

  5 OF THE WEIRDEST MANIAS IN HISTORY

  Dancing Manias

  Outbreaks of inexplicable, uncontrollable dancing were common in much of Europe between the 1300s and the 1600s, sometimes involving thousands of people. Nobody’s entirely sure why.

  Well Poisoning

  Around the same time, mass panics at false rumors of wells being poisoned were also common—normally blamed on Jews. Some panics led to riots and Jewish homes being burned.

  Penis Theft

  Outbreaks of panic that malign forces are stealing or shrinking men’s penises appear all around the world—blamed on witches in medieval Europe, on poisoned food in Asia or on sorcerers in Africa.

  Laughing Epidemics

  Since the 1960s, epidemics of unstoppable laughter have occurred in many African schools—one famous outbreak in Tanzania in 1962 lasted a year and a half, forcing schools to temporarily close.

  The Red Scare

  A classic “moral panic,” a wave of anticommunist hysteria swept the USA in the 1940s and 1950s, as the media and populist politicians spread the exaggerated belief that communist agents had infiltrated every part of US society.

  2

  Nice Environment You’ve Got Here

  Around 13,000 years ago in the Fertile Crescent of ancient Mesopotamia, humans started doing things very differently. They had what you might describe as “a change of lifestyle,” and in this case it meant a lot more than cutting down on carbs and joining a gym. Rather than the traditional approach to obtaining food—namely, going to look for it—they hit upon the neat trick of bringing the food to them. They started planting crops.

  The rise of agriculture wouldn’t just make it easier to grab some lunch; it would completely upturn society and profoundly change the natural world around us. Before agriculture, the standard thing for human groups was to move around with the seasons, following where the food was. Once you’ve got a load of rice or wheat growing, though, you really need to stick around to look after it. And so you get permanent settlements, villages and, sometime after that, towns. And, of course, all the stuff that goes with that.

 
