The Half-Life of Facts

by Samuel Arbesman

In December 2009, Bradley Wray was preparing his high school students6 for a test in his Advanced Placement psychology class. Wray developed a moderately catchy song for his students that would help them review the material and posted a video of it online.

  What was the topic of this song? Cognitive bias. There is a whole set of psychological quirks we are saddled with as part of our evolutionary baggage. While these quirks might have helped us on the savannah to figure out how the seasons change and where food might be year after year, they are not always the most useful in our interconnected, highly complex, and fast-moving world. These quirks are known as cognitive biases, and there are lots of them, creating a publishing cottage industry devoted to chronicling them.

  As sung by Wray, here are a couple (the lyrics are far from Grammy quality):

  I’m biased because I put you in a category in which you may or may not belong

  Representativeness Bias: don’t stereotype this song. . . .

  I’m biased because I take credit for success, but no blame for failure.

  Self-Serving Bias: my success and your failure.

  These biases are found throughout our lives. Many people are familiar with self-serving bias, even if they might not realize it: It happens all the time in sports. In hockey or soccer, if the team wins, the goal scorer is lauded. But if the team loses? The goalie gets the short end of the deal. The other players are the beneficiaries of a certain amount of self-serving bias—praise for success, without the burden of failure—at least that’s how the media portray it, even if they are not subject to this cognitive bias themselves. There are well over a hundred of these biases that have been cataloged.

  • • •

  IN the 1840s, Ignaz Semmelweis was a noted physician with a keen eye. While he was a young obstetrician working in the hospitals of Vienna, he noticed a curious difference between mothers who delivered in his division of the hospital and those who delivered at home, or with midwives in the other part of the hospital. Women whose babies were delivered by the hospital’s physicians had a much higher incidence of a disease known as childbed fever, which often causes a woman to die shortly after childbirth, than women who delivered with midwives. Specifically, Semmelweis realized that the parts of the hospital where obstetricians did not also perform autopsies had rates of childbed fever as low as those of home deliveries.

  Ignaz Semmelweis argued that the doctors7—who weren’t just performing autopsies in addition to deliveries but were actually going directly from the morgue to the delivery room—were somehow spreading something from the cadavers to the women giving birth, leading to their deaths.

  Semmelweis made a simple suggestion: Doctors performing deliveries should wash their hands with a solution of chlorinated lime beforehand. And this worked. It lowered the cases of childbed fever to one tenth the original amount.

  However, rather than being lauded for an idea that saved lives for essentially no cost, Semmelweis was ostracized. In the mid-nineteenth century, there was no germ theory. Instead, the dominant paradigm was a certain theory of biology that blamed disease upon imbalances of “humors.” If you’ve ever noted that someone is in a “good humor,” this is a vestige of this bygone medical idea. So the medical establishment for the most part ignored Semmelweis. This quite likely drove him mad, and he spent his final years in an asylum.

  This tendency to ignore information simply because it does not fit within one’s worldview is now known as the Semmelweis reflex, or the Semmelweis effect. It is related to its converse, confirmation bias, where you only learn information that adheres to your worldview.

  The Semmelweis reflex and confirmation bias are important aspects of our factual inertia. Even if we are confronted with facts that should cause us to update our understanding of the way the world works, we often neglect to do so. We persist in only adding facts to our personal store of knowledge that jibe with what we already know, rather than assimilate new facts irrespective of how they fit into our worldview. This is akin to Daniel Kahneman’s idea of theory-induced blindness:8 “an adherence to a belief about how the world works that prevents you from seeing how the world really works.”

  In general, these biases are useful. They let us quickly fill in gaps in what we don’t know or help us to extrapolate from a bit of information so we can make quick decisions. When it comes to what we can literally see, our ancestors no doubt did this quite often. For example, we could have expected the top of a tree to look like other trees we have seen before, even if it were obscured from view. If it didn’t look right, it should still fit into our mental worldview (for example, it looked strange because there was a monkey up there). But when it comes to properly evaluating truth and facts, we often bump up against this sort of bias.

  The Semmelweis reflex is only one of many cognitive biases, and it is related to another problem of our mental machinery: change blindness. This refers to a quirk of our visual-processing system. When we concentrate on one thing or task very intently, we ignore everything else, even things that are important, or at the very least, surprising. A series of seminal experiments in this field9 was conducted by Christopher Chabris and Daniel Simons, professors at Union College and the University of Illinois, respectively. You’ve probably seen their experiments, in the form of fun little videos online.

  In one, subjects are shown a video of individuals in a gymnasium. The people in the video begin passing basketballs to one another and the subjects are supposed to keep track of the types of passes (such as bounce passes) or who passes to whom, since the players have different colored jerseys.

  Then something intriguing happens. Partway through the video, a woman dressed in a full-body gorilla suit walks among the basketball players. She stops in the center, beats her chest in true gorilla style, and continues walking through the players. Of course, this is surprising and strange and all kinds of adjectives that describe something very different from a normal group of people passing basketballs to one another.

  But here’s the startling thing: 50 percent of the observers of this video miss the gorilla entirely. This change blindness, also known as inattentional blindness, is a quirk of our information-processing system. When looking for one thing, we completely ignore everything else around us.

  This bug is turned into a feature by magicians, who exploit our change blindness through the use of misdirection. A magician gets you to concentrate on his left hand, while the right hand is doing all the important sleight-of-hand. This kind of thing can even fool trained magicians, who are trying to learn the illusion.

  One common way that magicians learn a new trick is through an instructional video. The magician will show you the trick, through the eyes of the spectator, then explain it, show it again, and then show it from a different perspective, or at least more slowly. An illusion that I once observed involved the use of a thumb tip—a false rubber thumb that can be used to conceal various objects, such as handkerchiefs. After the magician showed the trick, he informed the viewer that he made sure to make it easy to follow by using a bright red thumb tip.

  Upon hearing this, I was shocked. I had been so focused on all the other aspects of the illusion, and the magician’s use of misdirection, that I had entirely missed what was right before my eyes: a ridiculous red artificial thumb. I had been a victim of change blindness.

  Change blindness in the world of facts and knowledge is also a problem. Sometimes we are exposed to new facts and simply filter them out, along the lines of the Semmelweis reflex. But more often we have to go out of our way in order to learn something new. Our blindness is not a failure to see the new fact; it’s a failure to see that the facts in our minds have the potential to be out-of-date at all. It’s a lot easier to keep on quoting a fact you learned a few years ago, after having read it in a magazine, than to decide it’s time to take a closer look at the current ten largest cities in the United States, for example, and notice that they are far different from what we learned when we were younger.

  But whichever bias we are subject to, factual inertia permeates our entire lives.

  • • •

  A clear example of how we often neglect to respond to change is when it comes to writing the date or the year on documents.

  Have you ever written the wrong year during the first weeks of January? This happens in everything from homework assignments to legal documents. This can even happen in the extreme. On May 24, 2011, President Barack Obama visited the United Kingdom. While making a stop at Westminster Abbey, Obama decided to sign the guestbook.10 He wrote a very nice little note about the special relationship that the United States shares with Great Britain. The only problem was that he dated his signature May 24, 2008. Perhaps he hadn’t had to write the date since he won election three years earlier. Either way, the inability to respond to change is not an issue only for the everyday; it reaches all the way to the top. Thankfully, even the law has taken into account our foibles and our inability to always update our facts. In courts, intent is what matters, and not unconscious muscle memory, so if you do this on a legal document, you’re generally fine.

  I decided to conduct a simple experiment to actually get a handle on people’s factual inertia. To do this, I used a Web site created by Amazon called Mechanical Turk. The label Mechanical Turk derives from a well-known hoax from the eighteenth and nineteenth centuries. The Turk was a complex device that was displayed all throughout Europe. While appearing to be a chess-playing automaton, the Turk actually had a person in a hidden compartment, controlling the machine.

  In homage to this, Amazon named its online labor market—a clearinghouse for simple tasks humans can easily perform but computers cannot—Mechanical Turk. These tasks include things like labeling photographs when they are posted, and Turkers, as the laborers are called, will often solve these problems for pennies. Mechanical Turk has recently become a wonderful test bed for social science experiments, due to the large supply of subjects, the low wages required, and the fast turnaround time for running an experiment. While certainly not a perfectly representative distribution of humanity, it is much better than most traditional experimental populations, which are generally college undergrads.

  As part of my scientific research, I’ve been on a team that developed software infrastructure for running online experiments on Mechanical Turk, studying how people cooperate in networks. Through this work I have also gained an appreciation for the platform itself. To get a sense of people’s factual inertia, I thought it would be a good place to quickly survey a population about their beliefs and knowledge.

  I decided to examine people’s knowledge of best practices when it comes to treating a nosebleed. I can distinctly remember a nosebleed of mine about ten years ago, when my nose started bleeding spontaneously one evening. With one hand holding a tissue to stanch the blood flow, I used the other hand to search online for how to properly treat it. Do I lean back? Do I lean forward? Where do I pinch my nose? I had heard so much competing information that I really didn’t remember the “right” thing to do any longer. Searching online mid-nosebleed was my only recourse.

  So I put a similar question to my Turkers:

  “If you have a nosebleed, what’s the best way to handle it?”

  A. Lean your head back

  B. Lie down

  C. Lean your head slightly forward

  D. Lean your head all the way forward

  “And how would you hold your nose?”

  A. Hold nose completely closed

  B. Pinch bridge of nose

  I asked for one hundred responses and offered five cents for each survey completion. Even with paying the overhead to Amazon, I was paying less than six dollars for new knowledge. I had my first response in less than a minute, which incidentally was incorrect. After a couple of days, all the responses were in.
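
  The “less than six dollars” figure is simple arithmetic; the back-of-the-envelope calculation below is a minimal sketch in Python, and the 10 percent Amazon commission used in it is an assumption for illustration, not a figure given here:

  # Back-of-the-envelope cost of the Mechanical Turk survey described above.
  responses = 100                 # survey completions requested
  reward_per_response = 0.05      # five cents paid per completion

  base_cost = responses * reward_per_response       # $5.00 paid to the Turkers
  amazon_fee_rate = 0.10                            # assumed overhead rate, for illustration only
  total_cost = base_cost * (1 + amazon_fee_rate)    # about $5.50, under six dollars

  print(f"Paid to Turkers: ${base_cost:.2f}")
  print(f"Total with assumed overhead: ${total_cost:.2f}")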

  For the record, according to WebMD,11 the proper procedure is to lean your head forward and pinch the bridge of your nose, so the correct answers should be C (and perhaps D), and B.

  But how did my subjects respond? Here’s the breakdown for the Turkers:

  “If you have a nosebleed, what’s the best way to handle it?”

  Lean your head back: 50

  Lie down: 14

  Lean your head slightly forward: 31

  Lean your head all the way forward: 5

  “And how would you hold your nose?”

  Hold nose completely closed: 17

  Pinch bridge of nose: 83

  While the Turkers seem to have generally grasped the idea that holding your nose completely closed during a nosebleed is a bad idea, they do not know the proper position to assume, with only about a third correctly identifying the way to position oneself.

  The results of this experiment are not terribly surprising. In addition to all of the cognitive biases that we are saddled with, it is difficult for us to keep abreast of all the information around us. When we are young, we are treated as little generalists, absorbing all manner of information. We learn geography, history, mathematics, how to read a map, and lots of science trivia. We are even able to learn entire languages relatively effortlessly.

  But then, as we get older, a curious thing happens with our approach to education. In addition to no longer being compelled to learn all manner of things (because we are, after all, adults, and we really can’t be compelled to learn anything at all), if we do continue to educate ourselves, we focus. We choose a major and learn all that there is to learn about a single topic, such as biology. Then we become experts in that area, well aware of all of the nuances, debates, and changes in facts within that field. We learn more and more about less and less.

  But all of our earlier knowledge remains in stasis. Instead of it all growing and developing in a rigorous fashion, like whatever we choose to make our careers in, it generally stays the same. Unless we happen to stumble upon an article in a magazine or newspaper about a certain scientific finding, or unless something is so important and earth shattering that we can’t help but remark upon this new fact’s novelty, we remain stuck at the factual level of our grade-school selves.

  We continue to refer to different countries as First World, Second World, or Third World, not recognizing that these terms refer to alignments in the Cold War. Or our awareness of the periodic table remains stuck at the high school level, and we don’t realize that the number of known elements has grown a great deal since we were in chemistry class.

  But this description of how we learn things when we’re young, and then stop learning, is a bit too simple and straightforward. It turns out that many of us do update these sorts of facts, but it often happens only in bursts. And these jumps occur at a predictable interval: the length of a single human generation.

  • • •

  THE science writer Brian Switek pointed out to me that when most people learn about a topic—we were chatting about dinosaurs, but it works for most anything—we learn it when young, in the time of our lives when obsessive knowledge-gathering is the default mode, and then we leave it aside as we turn to more mature topics, or simply other things that we are now interested in.

  But we do return to the subject, if only when our own children have reached the same point. Rather than seeing these mesofacts change slowly, in a relatively smooth advancement of knowledge, you only encounter them in bursts, when the next generation does, such as when your child comes home and informs you that dinosaurs were warm-blooded and looked like birds. This generational knowledge appears staccato, even though the knowledge changes and accretes steadily.

  Whatever doesn’t conform to your childhood, and especially when it comes to dinosaurs, often seems wrong. Just as shifting baseline syndrome makes us assume that whatever state of affairs we were born into is the normal one, we don’t often confront changing facts until another generation grows up with a different baseline. We are then forced to confront the difference between them.

  This is true of what’s currently happening with Pluto. If you ask young children in 2012 to name the planets, they go up to Neptune, and they finish by saying that Pluto is a dwarf planet, or distinguish Pluto in some other way.12 But this is likely a temporary condition. Those teaching want these kids to know about Pluto and its curious status. But soon enough, it will just fade away into a strange footnote, paralleling what happened back in the nineteenth century: Just as Ceres and the other large asteroids were once counted as planets (marked as such on charts and taught to schoolchildren for decades) until the discovery of the abundant minor planets of the asteroid belt, Pluto’s special place will likely fade away.

  Of course, what generation means needn’t be literal, although it is often the case that the facts in our brain—and their lifetime—are tied to childbirth. We can also understand what a generation is more figuratively. For example, when it comes to university-specific knowledge, a generation time is far closer to four years than to multiple decades, due to the turnover of students. Institutional memory, with its attendant facts and knowledge, is only as permanent as its generation time.

  This was made clear to me when reading an essay by Michael Chabon.13 He was bemoaning the recent commercialization and corruption of the purity that is Lego. He began by noting how the Lego sets that children have nowadays are awash in pieces of every color of the rainbow: pink, purple, sky blue, and more. Furthermore, there are themed sets, from Harry Potter and Star Wars, replete with specialized pieces. But back in his day—and I read this approvingly—there were only a small handful of colors: red, blue, green, black, white, and yellow.

  Then he continued by lamenting the cause of the downfall from Lego’s pristine nature: the minifigure. Those small people with the yellow faces and simple grins have, Chabon argued, constituted nothing more than a bastardization of the Lego aesthetic.

  Suddenly I was no longer in agreement with the essay. This was wrong. While I don’t care for the themed and specialized sets, which even include a set that seems to revolve around alien abduction, I grew up with these minifigs. They were a part of my childhood! How dare Chabon view these elemental little men as a corruption of the Lego ideal? They are part of Lego’s nature.

 
