The Half-Life of Facts


by Samuel Arbesman


  • • •

  AS scientific knowledge grows rapidly, it leads to a certain overturning of old truths, a churning of knowledge. While this churning is hard to deny—recall my inability to remember the health benefits of red wine despite having seen them reported in the newspapers many times—it is difficult to measure. But if we could quantify this churn, that could provide a handle for our uncertainty, and even a metric for how often we should revisit a subject.

  A few years ago a team of scientists at a hospital in Paris4 decided to actually measure this. They decided to look at fields that they specialized in: cirrhosis and hepatitis, two areas that focus on liver diseases. They took nearly five hundred articles in these fields from more than fifty years and gave them to a battery of experts to examine.

  Each expert was charged with saying whether the paper was factual, out-of-date, or disproved, according to more recent findings. Through doing this they were able to create a simple chart that showed the amount of factual content that had persisted over the previous decades. They found something striking: a clear decay in the number of papers that were still valid.5

  Furthermore, they got a clear measurement for the half-life of facts in these fields by looking at where the curve crosses 50 percent on this chart: forty-five years. Essentially, information is like radioactive material: It takes about forty-five years for half of the medical knowledge about cirrhosis or hepatitis to be disproven or to become out-of-date. This is about half the half-life of the radioisotope samarium-151, which is roughly ninety years.

  Figure 2. Decay in the truth of knowledge in the areas of hepatitis and cirrhosis. The 50 percent mark is around forty-five years, meaning it takes about forty-five years for half of the knowledge in these fields to be overturned. From Poynard, et al. “Truth Survival in Clinical Research: An Evidence-Based Requiem?” Annals of Internal Medicine 136, no. 12 (2002): 888–95.

  As mentioned earlier, while each individual radioactive atom’s decay is subject to a great deal of uncertainty, in the aggregate, they are far from random. They are subject to a systematic degradation and encapsulated in the shorthand of a single number—the half-life—that denotes how long it takes for half of the material to be subject to radioactive decay.

  Knowledge in a field can also decay exponentially, shrinking by a constant fraction. It is like one of Zeno’s paradoxes, according to which we keep covering half the remaining distance to the finish line but never quite reach it. In this case, the finish line is the point at which no papers from the original batch of cirrhosis and hepatitis studies are still true. While there will always be a tiny number of papers cited many decades, or even centuries, from now, within a certain number of years the vast majority of articles will have decayed into irrelevance. Of course, some of these are not wrong, just obsolete. As these scientists noted, the effectiveness of treatments from decades past isn’t necessarily nullified; they are simply superseded by something newer, such as novel vaccines that make treatment of a disease no longer necessary.
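  The decay described here has the same mathematical form as radioactive decay. A minimal sketch, using the forty-five-year half-life from the Paris study (the formula itself is just standard exponential decay):

```python
def fraction_still_valid(years, half_life=45.0):
    """Fraction of an original batch of findings still valid after
    `years`, assuming simple exponential decay with the given half-life."""
    return 0.5 ** (years / half_life)

# With a 45-year half-life, half the literature survives at 45 years,
# a quarter at 90 years, an eighth at 135 years, and so on.
print(fraction_still_valid(45))   # → 0.5
print(fraction_still_valid(90))   # → 0.25
```

  This is Zeno-like in exactly the sense described above: each forty-five-year step halves what remains, so the surviving fraction approaches zero without ever quite reaching it.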

  But ultimately, while we can’t predict which individual papers will be overturned, just like we can’t tell when individual radioactive atoms will decay, we can observe the aggregate and see that there are rules for how a field changes over time. In addition, these results are nearly identical to a similar study that examined the overturning of information in surgery. Two Australian surgeons found that half of the facts in that field also become false every forty-five years. As the French scientists noted, all of these results verify the first half of a well-known medical aphorism by John Hughlings Jackson, a British neurologist in the nineteenth and early twentieth centuries: “It takes 50 years to get a wrong idea out of medicine, and 100 years a right one into medicine.”

  This means that despite the ever-expanding growth of scientific knowledge, the publication of new articles, refutations of existing theories, the bifurcations of new fields into multiple subfields, and the messy processes of grant-writing and -funding in academia, there are measurable ways in which facts are overturned and our knowledge is ever renewed. I’m not simply extrapolating from this half-life of medicine to argue that all of science is like this. Other studies have been performed about the half-life of different types of scientific knowledge as well.

  Unfortunately, convening a panel of experts and having them comb through all of science’s past conclusions and giving a thumbs-up or thumbs-down to a paper’s validity isn’t quite feasible. So we have to sacrifice precision for our ability to look at lots and lots of science relatively quickly. One simpler way to do this is by looking at the lifetime of citations. As mentioned before, citations are the coin of the scientific realm and the metric by which we measure the impact of a paper.

  Most papers are never cited. And many more are cited only once and then forgotten. Others are only cited by their own authors, in their own other papers. But—and this is no doubt a point in favor of the scientific endeavor—there are numerous papers that are cited by others in the field. And there are the even rarer papers cited so many more times than those around them that they are truly fundamental to a field, towering well above other publications.6

  To understand the decay in the “truth” of a paper, we can measure how long it takes for the citation of an average paper in a field to end. Whether it is no longer interesting, no longer relevant, or has been contradicted by new research, this paper is no longer a part of the living scientific literature. It is out-of-date. The amount of time it takes for others to stop citing half of the literature in a field is also a half-life of sorts.

  This gives us a sense of how knowledge becomes obsolete, but it also has a very practical application. Scholars in the field of information science in the 1970s were concerned with understanding the half-life of knowledge for a specific reason: protecting libraries from being overwhelmed.

  In our modern digital information age, this sounds strange. But in the 1970s librarians everywhere were coping with the very real implications of the exponential growth of knowledge: Their libraries were being inundated. They needed ways to figure out which volumes they could safely discard. If they knew the half-life of a book or article’s time to obsolescence, it would go a long way to providing a means to avoid overloading a library’s capacity. Knowing the half-lives of a library’s volumes would give a librarian a handle on how long books should be kept before they are just taking up space on the shelves, without being useful.

  So a burst of research was conducted into this area. Information scientists examined citation data, and even usage data in libraries, in order to answer such questions as, If a book isn’t taken out for decades, is it that important anymore? And should we keep it on our shelves?

  Through this we can begin to see how the half-lives of fields differ. For example, a study of all the papers in the Physical Review journals,7 a cluster of periodicals that are of great importance to the physics community, found that the half-life in physics is about 10 years. Other researchers have even broken this down by subfield,8 finding a half-life of 5.1 years in nuclear physics, 6 years for solid-state physics, 5.4 years in plasma physics, and so forth. In medicine,9 a urology journal has a half-life of 7.1 years, while plastic and reconstructive surgery is a bit more long-lived, with a half-life of 9.3 years (note that this is far shorter than the half-life of 45 years calculated earlier, because we are now looking only at citations, not whether something has actually been disproved or rendered obsolete). Price himself examined journals from different fields10 and found that the literature turnover is far faster in computer science than psychiatry, both of which are much faster than fields in the humanities, such as Civil War history.

  Different types of publications can also have varied half-lives. In 2008, Rong Tang looked at scholarly books in different fields11 and found the following half-lives.

  Field           Half-life (in years)
  Physics         13.07
  Economics        9.38
  Math             9.17
  Psychology       7.15
  History          7.13
  Religion         8.76

  It seems here that physics has the longest half-life of all the fields examined, at least when it comes to books. This is the opposite of what is found in the realm of articles, where the hard sciences are overturned much more rapidly than the social sciences. This could very well be due to the fact that in the hard sciences only the research that has weathered a bit of scrutiny actually makes it into books.

  Overall, though, it’s clear that some fields are like the radioactive isotopes injected into someone undergoing a PET scan that decay extremely rapidly. Other fields are much more stately, like the radiocarbons, such as carbon-14, used for the scientific dating of ancient artifacts. But overall, these measurements provide a grounding for understanding how scientific facts change around us.

  The story of why facts get overturned—sloppy scientists or something else?—is for chapter 8, and has to do with how we do science and how things are measured. But shouldn’t the very fact that most scientific knowledge decays be somewhat distressing?

  It’s one thing to be told that a food is healthy one day and a carcinogen the next. But it’s something else entirely to assume that basic tenets of our scientific framework—gravity, genetics, electromagnetism—might very well be wrong and can possibly be part of the half-life of knowledge.

  But this is not the way science works. While portions of our current state of science can be overturned, this occurs only in the service of something much more positive: an approach to scientific truth.

  • • •

  IN 1974, three scientists working at the Thermophysical Research Properties Center at Purdue University12 released a supplement to the Journal of Physical and Chemical Reference Data. This was no small undertaking—it was more like an eight-hundred-page book on a single topic: the thermal conductivity of the elements in the periodic table.

  Thermal conductivity refers to how easily each element conducts heat. For example, metals are much better conductors of heat than gases (or plastics) are; that’s why frying pans often have handles made of plastic instead of metal. But in addition to materials having different inherent thermal conductivities, there are a number of factors that influence these values. One of the most important is temperature. In general, the hotter something already is, the better it is at conducting heat.

  Figure 3. Thermal conductivity of copper versus temperature, as derived from multiple experiments. Reprinted with permission from Ho, et al. “Thermal Conductivity of the Elements.” Journal of Physical and Chemical Reference Data 1, no. 2 (April 1972): 279–421. © 1972, American Institute of Physics.

  This supplement is an exhaustive, data-point-filled text that goes through every chemical element and examines the concept of thermal conductivity. But measuring these curves—trying to determine the relationship between temperature and conductivity for each element—isn’t always that easy. Therefore, they compiled lots of previous research that had gone into measuring these properties and, based on that research, tried to determine what this curve truly is.

  By conducting lots and lots of measurements, and seeing where the results fall on the curve, we can begin to realize what the true nature of this thermal conductivity curve actually is for each element. It can be seen in the graph: There’s a lot of noise and uncertainty. But when enough measurements are taken, a really clear picture of how properties are related emerges (in this case, thermal conductivity and temperature). If a certain fraction of the results were removed, or only the results from one of the hundreds of papers cited in the supplement were looked at, there would be a different, less complete, and inaccurate picture of the relationship between thermal conductivity and temperature.
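  The logic of the supplement—many noisy individual measurements converging on the true curve—can be illustrated with simulated data. A sketch under invented assumptions: the "true" value of 400 W/(m·K) is roughly copper's room-temperature conductivity, used purely for illustration, and the noise level is made up:

```python
import random

random.seed(0)
TRUE_VALUE = 400.0  # hypothetical thermal conductivity, W/(m·K)

def measure():
    """One simulated experiment: the true value plus substantial noise."""
    return TRUE_VALUE + random.gauss(0, 40)

# The average of a handful of measurements can land well off the mark...
few = [measure() for _ in range(3)]
# ...but the average of many measurements closes in on the truth.
many = [measure() for _ in range(10000)]

print(sum(few) / len(few))
print(sum(many) / len(many))
```

  Any single paper in the supplement is like one draw from `measure()`; only the accumulated mass of results reveals the underlying curve.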

  That’s how science proceeds.

  It’s not that when a new theory is brought forth, or an older fact is contradicted, what was previously known is simply a waste, and we must start from scratch. Rather, the accumulation of knowledge can then lead us to a fuller and more accurate picture of the world around us.

  Isaac Asimov, in a wonderful essay,13 used the Earth’s curvature to help explain this:

  [W]hen people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together.

  Clearly, when humans went from believing that the Earth was flat to the belief that the Earth was a sphere, there was a big change in their view. In an entirely unmetaphorical way, the shape of people’s thoughts was changed. But, as Asimov explained, in terms of practical usage, the flat Earth perspective is not that wrong. The assumption of a flat Earth includes the concept of no curvature at all, or zero inches of curvature per mile. Because ships appear over the horizon top-first as they approach, we can tell that the curvature is not actually zero. But, as Asimov calculates, it’s not that far off. A sphere the size of the Earth has a curvature of only eight inches per mile. That adds up over the size of the Earth, but it’s not that big when you think of it as inches per mile.

  An entirely spherical world is not correct either. We in fact exist on a very large object known as an oblate spheroid, which has a curvature that varies between 7.973 and 8.027 inches per mile. Each successive worldview, fact, or theory brings us closer to actually explaining how the world truly works and what the state of our environment is. In the case of the Earth’s curvature, each new theory got us closer to the correct amount that the Earth curves below our feet. Or, in a more complex example, this is similar to how Einstein’s theory subsumed Newton’s results and made them even more general. We can still use Newtonian mechanics for everyday purposes (and, in fact, we almost always do), but Einstein refined our understanding of the world at the edges, such as when we are moving at speeds close to the speed of light.
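  Asimov's eight-inches-per-mile figure can be checked with the standard small-angle approximation for how far a sphere's surface drops below a tangent line over a distance d, namely drop ≈ d²/(2R):

```python
# Drop of a sphere's surface below a tangent line over distance d:
# drop ≈ d**2 / (2 * R), valid when d is much smaller than R.
R_MILES = 3959           # mean radius of the Earth, in miles
INCHES_PER_MILE = 63360  # 5280 feet per mile * 12 inches per foot

drop_miles = 1.0 ** 2 / (2 * R_MILES)
drop_inches = drop_miles * INCHES_PER_MILE
print(round(drop_inches, 1))  # → 8.0
```

  Because the drop grows with the square of the distance, curvature is negligible across a backyard but unmistakable across an ocean, which is exactly why the flat-Earth approximation worked locally for so long.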

  Sometimes we get things entirely wrong, or not as accurate as we would like. But on the whole, the aggregate collection of scientific knowledge is progressing toward a better understanding of the world around us.

  To make this abundantly clear, Sean Carroll, a theoretical physicist at Caltech, wrote a wonderful series on his blog14 that began with a piece entitled “The Laws Underlying the Physics of Everyday Life Really Are Completely Understood.” While he’s not saying that everything is known about our everyday existence, including the complex notions of “turbulence, consciousness, the gravitational N-body problem, [and] photosynthesis,” what Carroll is arguing is that the fundamental laws that underlie the functioning of subatomic particles at everyday temperatures are well-known:

  A hundred years ago it would have been easy to ask a basic question to which physics couldn’t provide a satisfying answer. “What keeps this table from collapsing?” “Why are there different elements?” “What kind of signal travels from the brain to your muscles?” But now we understand all that stuff. (Again, not the detailed way in which everything plays out, but the underlying principles.) Fifty years ago we more or less had it figured out, depending on how picky you want to be about the nuclear forces. But there’s no question that the human goal of figuring out the basic rules by which the easily observable world works was one that was achieved once and for all in the twentieth century.

  Carroll even lays down, in a single equation,15 how electrons work in normal, everyday room temperatures. While this is a very general and optimistic example (most of our world is not so easily described by a single equation), this is often how we uncover everything in the environment around us: as part of our pattern of discovery we asymptotically approach the truth. Returning to species, we can see this in action.

  In 2010, the Census of Marine Life completed its first decade of work. This project, involving more than two thousand scientists from more than eighty countries, was tasked with chronicling and classifying all living things in the ocean. It involves more than a dozen smaller projects and collaborations with organizations and companies, from NASA to Google.

  While they are aware that their work is by no means complete, the team has already produced thousands of scientific papers and discovered well over a thousand new species. A quote from Science Daily16 gives a sense of how unbelievable this is:

  On just two stops in the southeast Atlantic Angola Basin, they found almost 700 different copepod species (99 percent of them unfamiliar) in just 5.4 square meters (6.5 square yards), nearly twice the number of species described to date in the entire southern hemisphere.

  Kevin Kelly refers to this sort of distribution17 as the “long tail of life.” In the media world, a small fraction of movies accounts for the vast amount of success and box office take—these are the blockbusters. The same thing happens on the Internet: a tiny group of Web sites commands most of the world’s attention. In the world of urban development, a handful of cities holds a vast portion of the world’s population. But these superstars aren’t the whole story. While they explain a good fraction of what’s out there, there is a long tail of smaller movies or cities that exist and are still important. Understanding how they are distributed can give us a better picture of how the world consumes popular culture or lives in cities.

 
