The Half-Life of Facts


by Samuel Arbesman


  There is a sifting and filtering process that moves knowledge from the frontier to the relatively compact and tiny core of knowledge. We should enjoy this process, rather than despair. One of the most fulfilling aspects is not the upheaval and churning of facts, but rather being able to grapple with concepts that explain our world. And these new facts are now only possible due to measurement.

  • • •

  IN addition to exposing quantitative error or delimiting what’s around us (as in the case of Mount Everest’s height), measurement can also have the profound benefit of overturning simple ideas and creating new pieces of knowledge, things we never could have known before.

  A stark example is that of war,22 and whether it exists. On its face, this is a silly question: Of course war exists. It has always existed and seems to be a regular state of affairs, at least somewhere in the world.

  However, John Mueller, a professor of political science at Ohio State University, decided to actually test this notion. Mueller began with a careful and reasonably well-accepted definition of war: a conflict between two governments, or between a government and an organized group (relevant for civil wars), in which at least one thousand people are killed each year as a direct consequence of the fighting.

  Mueller compiled all of the data since 1946 from a variety of sources and showed that, after an increase from the beginning of the data set until the end of the Cold War, the number of wars has plummeted precipitously. He also showed that the vast majority of wars are civil wars. Setting aside the question of why this has occurred, the presence of war has now become a clear mesofact. In the past several decades, war has gone from being a common and growing occurrence to something that is quite rare.

  This fact is astonishing. But it’s also astonishing for a reason besides its counterintuitive aspect: This fact could not have been known without careful measurement. There is a burgeoning group of scientists who use such data, which has been derived through careful measurement, to discover whole new ways of thinking about our world. The patron saint of this kind of careful measurement is Francis Galton.

  • • •

  WHEN born in 1822, Francis Galton23 was the scion of an esteemed family. Galtons had been scientists and businessmen for generations, and his half cousin happened to be Charles Darwin. But Galton’s own career did not get off to a particularly auspicious start. Despite being considered a prodigy at an early age, he had a rather peripatetic and undistinguished young adulthood, traveling the world and performing rather poorly at Cambridge University.

  But beginning with the publication of the chronicles of these explorations, Galton burst forth as a scientist. Soon he was involved in fields ranging from biology and mathematics to photography and anthropology, making numerous contributions in each of these areas.

  Running through all of his work was an obsession with data and numbers. Galton collected data on everything. He examined data on contemporary illustrious men. He looked at data on the performance of university students. He looked at people’s heights. He wrote a paper24 entitled “On Head Growth in Students at Cambridge.” He even wrote a letter to Nature about how people visualize numbers in their minds,25 based on his own research. Galton was the man who introduced fingerprinting to Scotland Yard. He even constructed a map of beauty in the British Isles, based on how many pretty women he encountered26 in various locations. Stephen Stigler, a statistician at the University of Chicago, has argued that Galton was the man who ushered in the Statistical Enlightenment.27

  Derek de Solla Price, a kindred spirit when it came to data, wrote the following of Galton:28

  Galton’s passion shows itself best, I feel, in two essays that may seem more frivolous to us than they did to him. In the first, he computed the additional years of life enjoyed by the Royal Family and the clergy because of the prayers offered up for them by the greater part of the population; the result was a negative number. In the second, to relieve the tedium of sitting for a portrait painter, on two different occasions he computed the number of brush strokes and found about 20,000 to the portrait; just the same number, he calculated, as the hand movements that went into the knitting of a pair of socks.

  Galton was not a man to shy away from data. While many of his results may no longer be accepted, he combined analysis and mathematical techniques to great effect, and in so doing, brought many new facts to light, facts that could only be learned through careful, exhaustive, tedious measurement.

  Such a preoccupation with allowing data to reveal new facts is the hallmark of science. This obsession with great amounts of data is not an isolated incident, something specific to Galton and an aberration along the trajectory of science. Instead, it is part of a grand tradition that, especially in the fields of sociology and social psychology, has unleashed a great many intriguing and clever experiments.

  Stanley Milgram, known for his shock experiment that explored obedience, and for being the first to measure the six degrees of separation, conceived of numerous elegant experiments. One of these has the whiff of Galton. Known as the sidewalk experiment, he had graduate students stand on a New York City sidewalk and look up. He then measured how many students were required for this group to get passersby to stop and join them, or at least to look up themselves. These data, carefully collected, brought certain ideas about collective behavior to the fore.

  Others have done similarly odd social experiments in order to collect data. For example, researchers have examined the drinking establishment locations and characteristics in different communities, and even whether the elderly are capable of crossing the street29 in the time a given traffic light provides them.

  In the past few years there has been a surge in what is being called data science. Of course, all science uses data, but data science is more of a return to the Galtonian approach, where through the analysis of massive amounts of data—how people date on the Internet, make phone calls, shop online, and much more—one can begin to visualize and make sense of the world, and in the process discover new facts about ourselves and our surroundings.

  Simply put, measurement teaches us new things, things that we can only know because we have the tools to quantify our surroundings. However, measurement cannot be used in every case, with its light of quantification shining evenly on every surface. Some measurements are easier than others, and this unevenness can profoundly affect what we know.

  • • •

  WHAT can be measured, and when, affects what can be learned. If we can’t measure something, this can actually create a bias in what we know. For example, in biology, there is something known as taxonomic bias. This is when we study certain living things not because they are more prevalent but because we like them more, or because they are simply easier to find. Vertebrates—those animals that have a backbone and comprise most of the creatures we are familiar with—are the subject of the vast majority of scientific papers,30 despite being only a tiny fraction of the different types of animals on Earth. Sometimes, when this bias seems overly malicious—amphibians and reptiles getting less attention than birds and mammals, because they are slimier or otherwise less cuddly—some scientists even call it31 taxonomic chauvinism.

  Far from being an obscure point in scientific knowledge, taxonomic bias can ripple outward into popular culture. Let’s return to dinosaurs. These massive creatures, beloved by millions, are firmly ingrained in the popular consciousness. And none more so than the staples tyrannosaurus rex, triceratops, and stegosaurus: the usual suspects.

  But there’s a reason for this. When dinosaur paleontology began to truly take off in the United States, there were two main dig sites: Hell Creek in Montana and Como Bluff in Wyoming. While Hell Creek was a somewhat later site, dug in the 1900s, Como Bluff in particular had an outsize influence, as it was the dig site of Othniel Marsh. And of course, Edward Cope contested this claim, making it another of the many battlegrounds in the Bone Wars.

  These two sites have finds from the Jurassic and the Late Cretaceous. As some of the first big dig sites, due to being the easiest to work with and the most abundant in fossils, they hold a special place in American dinosaur history. What dinosaurs were first found, and found in abundance, at these sites? Tyrannosaurus rex, triceratops, stegosaurus. In fact, one of the main reasons that the brontosaurus32 rose to such prominence is due to Marsh’s discovery of a complete “brontosaurus” skeleton at Como Bluff.

  Today, any museum worth its voluntary admission price has a display featuring at least some of the big names of Dinosauria. Most people don’t realize that it is for a simple reason: These were the easiest dinosaurs to find—the low-hanging terrible lizard fruit, as it were. They had the first-mover advantage in fossils, and have therefore gained an outsize share of our brains’ stock of dinosaur knowledge. What we study is not always what is actually out there; it’s often what we’re interested in, or what’s easiest to discover.

  • • •

  MEASUREMENT is a double-edged sword. It can create errors where none existed before. It can lead us to information about certain topics more frequently than we might have expected, creating a sort of informational bias. And it can create spurious scientific knowledge, facts that must be examined with some hesitation. While we might not be able to know at a glance which results will be wrong, there are scientific principles to understanding how measurement can lead us astray.

  But measurement can also create new knowledge, whether overturning false facts or finding something out about our world we’d never known before. As our methods of measurement have improved—according to mathematical regularities—we are now in the position to know more about our world.

  Measurement of our surroundings is an inherently human process. But separating facts from the people who make them, spread them, or debunk them is nearly impossible. And measurement is not even the half of it. Building on everything we have seen about how facts change, it’s time to finally tackle that last realm: the human aspect of how knowledge changes.

  CHAPTER 9

  The Human Side of Facts

  FROGS have a curious type of vision.1 If you hang a dead fly on a string in front of a hungry frog, it won’t eat it. It is entirely unaware that it’s there. But put a live fly into a room with a frog, and the frog will actively work on pursuing its lunch.

  It seems that frogs can only see certain objects when they are in motion: If the food is not moving, it might as well not exist.

  While this is not how human vision operates (in fact, I don’t want my food to be moving), there is some truth in this for us. If something moves rapidly, or changes quickly, we notice it easily. But if there is only very slow change, we are often not aware of it. This is similar to the story I repeated of the frog that, if slowly boiled, will willingly submit to its own death. While there isn’t any evidence of this, I, as well as many others, erroneously continued to mention this story. Why? Because I sensed some truth in it. From the weird “human statues” in parks whose slow movements we cannot see, to our inability to see plant growth, or our failure to wear layered clothing and be able to adapt to slow temperature changes throughout the day, we are not well equipped to deal with slow change.

  Humans are imperfect. We fall prey to optical illusions, heal better the more expensive placebo we receive, and often have faulty memories. Having evolved in East Africa, we are confronted on a daily basis with situations far removed from those we encountered as hunter-gatherers. We have a certain amount of evolutionary baggage that makes us ill equipped to deal with many aspects of modernity.

  This does not mean that all of our choices are irrational or ridiculous, or that we should simply curl into a fetal position and give up dealing with our incredible world, which is so full of technology and modernity. But we do approach the world in what Dan Ariely calls “predictably irrational” ways. If we are aware of the quirks of our brains and psychology, however, we can better understand the decisions we make and the world we create for ourselves. Crucially for us, we approach new knowledge and the world of facts in a manner that is far from completely rational yet is still regular and predictable in its biases. This chapter is about how our brains, and our psychological quirks, affect the facts that each of us holds within our minds.

  When John Maynard Keynes was asked about why he switched his position on monetary policy, he uttered the immortal, though likely apocryphal, bon mot: “When the facts change, I change my mind. What do you do, sir?” All too often, we don’t act like Keynes. We get stuck in ruts and don’t change, even when the facts change around us. Why is that? For a variety of reasons—whether we don’t perceive the change, don’t believe that change is occurring, or simply don’t believe the facts—we are not perfect beings who adapt immediately to the changes around us.

  This can be seen clearly when it comes to slow change over the course of our lifetimes, the mesofacts of life. Whatever the state of the world we are born into quickly becomes what we expect to be normal.

  This condition is known as shifting baseline syndrome, and it refers to how we become used to whatever state of affairs is true when we are born, or when we first look at a situation. Since we are only capable of seeing change over a single generation, if slow change occurs over many lifetimes, we often fail to perceive it.

  Shifting baseline syndrome was first identified and named by Daniel Pauly to refer to what happened with fish populations throughout the world. When the Europeans first began fishing off Newfoundland and Cape Cod, fish were incredibly abundant. In the seventeenth century, the abundance of cod2 was “so thick by the shore that we hardly have been able to row a boat through them.” It seemed as if nothing could ever deplete their numbers. But within less than two hundred years of fishing, many species were entirely wiped out.

  How could this have happened? Pauly described the situation as follows:

  Each generation of fisheries scientists accepts as a baseline the stock size and species composition that occurred at the beginning of their careers, and uses this to evaluate changes. When the next generation starts its career, the stocks have further declined, but it is the stocks at that time that serve as a new baseline. The result obviously is a gradual shift of the baseline, a gradual accommodation of the creeping disappearance of resource species.

  It’s easy to remember what is normal and the “correct” state of affairs when we start something new, after having set our baseline, even if we only do this subconsciously. But we mustn’t let that guide all of our thinking, because the result can be catastrophic.

  Of course, shifting baseline syndrome can even affect us in smaller, more subtle ways. Alan Kay, a pioneering computer scientist, defined technology as “anything that was invented after you were born.”3 For many of us, this definition of technology captures the whiz-bang innovations of the Web browser and the iPad: anything that appeared recently and is different from what we are used to. In this way, we fail to notice all the older but equally important technologies around us, which can include everything from the pencil to window glass.

  But factual inertia in general, even within a single life span, is all around us. Ever speak with a longtime New Yorker and ask for subway directions? You’ll be saddled with information about taking the IND, BMT, and IRT, when you were hoping for something that would mention a numbered or lettered train. These mysterious acronyms are the names of the agencies—Independent Subway, Brooklyn-Manhattan Transit, Interborough Rapid Transit—that formerly ran the subways in New York City. Despite the unification that began in the 1940s of these competing systems, many people still refer to them by their former names. Even if facts are changing at one rate, we might only be assimilating them at another.

  Adhering to something we know (or at least knew), even in the face of change, is often the rule rather than the exception. On January 13, 1920, the New York Times ridiculed the ideas of Robert H. Goddard. Goddard, a physicist and pioneer in the field of rocketry, was at the time sponsored by the Smithsonian. Nonetheless, the Gray Lady argued in an editorial that thinking any sort of rocket could ever work in the vacuum of space was essentially foolishness and a blatant disregard for a high school understanding of physics. The editors even went into reasonable detail in order to debunk Goddard.

  Luckily, the Times was willing to print a correction. The only hitch: They printed it the day after Apollo 11’s launch in 1969. Three days before humans first walked on the moon, they recanted their editorial4 with this bit of understatement:

  Further investigation and experimentation have confirmed the findings of Isaac Newton in the 17th century and it is now definitely established that a rocket can function in a vacuum as well as in an atmosphere. The Times regrets the error.

  Why do we believe in wrong, outdated facts?5 There are lots of reasons. Kathryn Schulz, in her book Being Wrong, explores reason after reason why we make errors. Sometimes it has to do with our desire to believe a certain type of truth. Other times it has to do with being contrary (Schulz notes one surefire way of adhering to a certain viewpoint: Have a close relative take the opposite position). But oftentimes it is simply due to a certain amount of what I dub factual inertia: the tendency to adhere to out-of-date information well after it has lost its truth.

  Factual inertia takes many forms, and these are described by the relatively recent field of evolutionary psychology. Evolutionary psychology, far from sweeping our biases under the rug, embraces them, and even tries to understand the evolutionary benefit that might have accrued to what may be viewed as deficits.

  So what forms can factual inertia take? Look to the lyrics of Bradley Wray.
