And this is precisely the problem. We both were simply rejecting anything newer than our own childhood. Just as many of us only view “technology” as anything invented after we were born, we took our baseline—when we started playing with Lego—as the way things should be in the realm of Lego.
But is it always this simple? Are we forever mentally stuck in whatever state of the world we were born into? Or can we change the knowledge in our heads, even if it’s a bit harder?
• • •
IN chapter 5, I explored how knowledge spreads and diffuses. But even if it spreads rapidly, what about the speed with which it comes to be accepted? Just as there are phase transitions when it comes to what we know, there can also be phase transitions in how knowledge is accepted and assimilated. Because even when facts spread, sometimes they take time to actually fix in our minds. And this is just as true in the realm of the scientist as it is in the world of the layman.
Clearly, science is not an abstract venture that is done in isolation from everyday human issues. It is not some endeavor immune to passions and biases. Science is an entirely human process. Science is done through hunches and chance recognition of relationships, and is enriched by spirited discussion14 and debate around the lab. But science is also subject to our baser instincts. Data are hoarded, scientists refuse to collaborate, and grudges can play a role in peer review.
The human aspect of science plays an important role when it comes to the acceptance of new knowledge. We don’t always weigh the evidence for and against a new discovery or theory and then make our decision, especially if it requires a wholesale overhaul of our scientific worldview. Too often we are dragged, spouting alternative theories and contradictory data, to the new theoretical viewpoint. This can be very good. Having more than a few contrarians keeps everyone honest. But it can also be very bad, as when Semmelweis was ignored and essentially driven mad by his colleagues’ refusal to accept the truth. But eventually, in the face of overwhelming evidence, the majority will generally accept the new theory, before their recalcitrance becomes too counterproductive.
Lant Pritchett, a professor of international development at Harvard’s Kennedy School of Government, is all too aware of this. In the field of international development there are many sacred cows, and challenges to them are not met with as much cool and calculating logic as one might wish. Pritchett recently proposed an intriguing idea15 to help developing countries: create lots of guest worker programs. But is everyone simply weighing its merits? Not exactly. Pritchett argues that a more apt way to describe how these ideas are adopted is that they often follow this trajectory: “Crazy. Crazy. Crazy. Obvious.”
Plot that on a graph, and you’ve got a phase transition, but this time it’s one about how ideas are accepted and adopted. Thomas Kuhn, a physicist turned historian of science, also discussed how such rapid transitions occur in his celebrated book, The Structure of Scientific Revolutions. Kuhn used the term paradigm to refer to a holistic worldview or theory that can be used to explain our surroundings. (While Kuhn did not invent the word paradigm, he used it so much and so often that he is credited with its popularization.) For example, Newtonian gravitation is a very good theory, and has a great deal of explanatory power. But while Newtonian mechanics is actually still used for a large number of engineering applications, it has since given way to the theoretical worldview put forth by Albert Einstein. This change in perspective was termed a paradigm shift by Kuhn.
Kuhn argued that switching from one paradigm to another16 is a messy process and often involves scientists digging in their heels to the extent that their retirement or death—with their attendant replacement by younger and more open minds—might be required for the new paradigm to become accepted.
Max Planck, another physicist, codified this in a maxim: “New scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”
This seems intuitively obvious. Due to science being the biased and human affair it is, we can’t expect the old stalwarts of science to change their minds when a new idea comes along. We just have to wait for them to die.
However, Planck’s Principle turns out to be wrong.
This can be seen through a careful examination of the work of Charles Darwin. The quintessential phase transition in science, and paradigm shift, is that of the theory of evolution by natural selection. Everything in biology prior to evolution was sophisticated stamp collecting, ordering the living world around us and exploring its wonders. With the advent of evolution, biologists finally had a conceptual framework to make sense of the facts surrounding them. But the acceptance of evolution wasn’t immediate. While On the Origin of Species was a bestselling book, it did not find universal agreement within the Victorian populace.
The same was true of the scientists themselves. David Hull, a philosopher of science, examined many of Darwin’s well-known contemporaries to see who eventually accepted the theory of natural selection, and how long it took them to do so. Hull, along with two graduate students, Peter Tessner and Arthur Diamond, examined sixty-seven British scientists from Darwin’s time. They found that only about three quarters of them had accepted Darwinian evolution ten years after On the Origin of Species was first published in 1859. So evolution was not the rapid phase transition of knowledge acceptance we thought it might have been.
But is this due to the vast majority of the holdout scientists being older? Were Darwin’s ideas rapidly accepted by the younger generation, and was age simply masking what was in fact a phase transition among the younger scientists? It is true that the average age of those who accepted evolution was younger than those who still rejected it after ten years. But there are some complications. Age explains only about 5 percent of the variation of acceptance or rejection of this theory. The younger scientists didn’t necessarily accept it rapidly; they accepted it at a rate similar to the older scientists who accepted it, over the course of a decade. More recent research into Planck’s Principle has generally confirmed Hull’s initial insight: Planck’s Principle doesn’t hold.17 Younger scientists aren’t necessarily more likely to accept new ideas, and new ideas don’t spread through a population as rapidly as we might expect.
While there are biases for how we assimilate facts, we can’t even rely on common sense for understanding how factual inertia works: We have to test our irrationality. This is encapsulated in the work of Duncan Watts, a principal researcher at Microsoft Research. Watts has demonstrated,18 in numerous studies that explore everything from how certain songs become popular to how marketing works, that we are very good at telling stories to ourselves that sound true but must be subjected to the rigors of quantitative analysis for verification.
Understanding how concepts penetrate a group’s consciousness in the scientific realm combines both the spread of knowledge through a population and all the cognitive biases we’ve discussed so far. But only looking at a single discipline, like biology or even economics, doesn’t quite capture how prevalent the issues are that affect the delicate interplay between individual beliefs or ideas and the overall “facts” of a community.
One area in which we are forced to grapple with all of this interplay in its wonderful complexity—between what the community knows and what each of us knows—is in the realm of language.
• • •
LANGUAGE is a fickle thing, always changing. This is even recognized in the two ways linguists discuss grammar: prescriptive grammar and descriptive grammar. Prescriptive grammar is the way things ought to be, while descriptive grammar is the way things are. Prescriptivists held sway in centuries past, declaring what is allowed and what is not. They are responsible for such blanket rules as bans on split infinitives or ending sentences with prepositions.
On the other hand, descriptivists aim to chronicle the way we actually use language. While it turns out that we are still subject to many rules, these are often subconscious and less set in stone. There can be a great deal of overlap between these two areas of grammar, but it diminishes as time goes on, as our actual language shifts and changes around us, widening the gap between the stone-inscribed rules of the prescriptivists and the observations of the descriptivists.
Language is a complex mix of flux and stability. On the one hand, there is evidence that the frequencies19 of the sounds of consonants in Old English are by and large the same as those in modern English, even though we modern English speakers are separated from Old English by one thousand years.
On the other hand, we also have many cases of linguistic change, such as new words being introduced and old words going extinct. Similarly, words themselves change, such as when verbs become more regular over time, and become more adherent to grammatical rules. In English we have verbs that are both regular and irregular. For example, the past tense of discuss is discussed (a regular verb that fits the “-ed” past tense), but the past tense of speak isn’t speaked; it’s spoke. Luckily, this change is not random: It turns out that the more frequently used words are those that are less likely to change, with a clear quantitative rule. Specifically, the rate of a verb’s regularization20 is inversely proportional to the square root of its usage frequency. So how can we understand linguistic facts, and their interplay between change and stability?
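To make that scaling concrete, here is the rule restated in symbols. This is my own paraphrase of the relationship described above, with $f$ standing for a verb’s usage frequency; it is not notation from the original study:

\[
\text{rate of regularization} \;\propto\; \frac{1}{\sqrt{f}}
\quad\text{or, equivalently,}\quad
\text{half-life of an irregular form} \;\propto\; \sqrt{f}.
\]

On this rule, a verb used one hundred times as often as another should hold on to its irregular past tense roughly ten times longer, which is why everyday verbs like speak/spoke stay irregular while rarer verbs drift toward the regular “-ed” form.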
Most of the facts we have examined so far are either what we as a society think is true (as in scientific truth) or what is the current state of the world (such as the speeds of the fastest computers). But when it comes to language, we’re in a different sort of factual realm. Despite what those who adhere to a prescriptive approach to linguistics might claim, there is no real objective truth here, no set of immutable rules that resides in some manual, completely independent of the people who speak the language. A misuse of a word isn’t wrong if enough people begin using it that way. Once most people start using disinterested and uninterested interchangeably, it just becomes annoying to continue to correct everyone.21
The facts of language are a sort of population average of each individual’s set of rules. Each person’s approach, known by the delightful term idiolect, is a mercurial thing that is subject to what you learned when you were young, and to who’s around you. It includes your vocabulary, grammar, pronunciations of words, and accent. Our linguistic facts hit the knowledge change jackpot: They are a complicated combination of slow adaptive change, factual inertia, and shifting baseline syndrome.
When speaking to others, we push and pull their speech patterns in various directions, even if only subconsciously, and they in turn influence us. There are many examples of this; one is voice onset time, which refers to how long it takes to produce the sound of certain consonants. This is performed completely automatically, but it is not unchangeable. While speaking with someone who has a longer voice onset time,22 a conversant often subconsciously begins to mimic the other.
Another subconscious language example is the situation-based dialect: A team of linguists studied Oprah Winfrey23 and how she introduced guests of different races. They found that she actually changed how she spoke during introductions, depending on whether her guest was white or black. This is similar to the person I know who was born in South Africa but raised in the United States: He only has a South African accent when speaking to his parents. Or how my wife switches between using soda and pop, depending on her location.
Of course, we’re not entirely products of the influences of those around us; there are certain limits to our malleability. For example, while lengthening voice onset time causes a similar change in a listener, doing the opposite, shortening one’s voice onset time, does not cause the conversant to also shorten theirs. Henry Kissinger, who has lived in the United States for well over seventy years, still has a very strong German accent; his idiolect has not changed one bit. Understanding language acquisition and change at the individual level is a complex and highly multidimensional issue.
But ultimately, seeing how language changes, and how we mentally respond to this, can give us insights into how we adapt to the facts around us.
In an article about taboos and curse words, the linguist John McWhorter examines how this linguistic change happens around us:24
One reads with bemusement at scientists once perplexed at unearthing enormous bones of creatures now nonexistent. Between the teachings of the Bible and the brevity of a human life span, it took centuries to grasp that the world’s fauna and flora have been in an eternal and imponderably long state of transformation. On language, the layman is today often in a similar state of perplexity. A language, too, is as inherently changeable as the lump in a lava lamp. However, print lends a sense that “real” language doesn’t change, and we live too briefly to see much but hints otherwise.
Hints, of course, we do see: When Ginger Rogers says in an old movie that a man “made love to” her we know she means what we would express as “come on to.” However, we do not live long enough to know that two hundred years ago obnoxious meant “subject to injury” or that eight hundred years ago quaint meant “clever.”
We are often like objects being dragged through mud. We change, but slowly, and with the residue of where we came from upon us.
Sometimes these changes are rapid and widespread, such as during the wonderfully named period in human history known as the Great Vowel Shift. The first time I stumbled across this phrase in my introductory linguistics textbook, I was fascinated. Apparently there were linguistic equivalents to the Black Death, the Great Awakening, the Enlightenment, and the Industrial Revolution. When I looked more carefully, though, it wasn’t quite as dramatic as I first expected. While its exact causes are still unknown, it involved a shift, over the span of a couple of hundred years, beginning in the fourteenth century, when the pronunciation of certain vowel sounds changed. It is the reason that we now say “mouse” and “mice” instead of “moose” and “meese,” which is what they used to be.
But imagine living through this. As the speech of those around us changed, how would we respond? Would we be confused by these shifting facts, or would we adapt quickly to what was happening?
I had my own personal, far less great, example of this. It occurred when I was younger, when my brother and I were speaking with my grandfather. One of us described some activity as “very fun” only to have our grandfather inform us that this was not proper speech. One simply does not say that something is “very fun.” But we felt that there wasn’t anything wrong with it. I can clearly remember our confusion, trying to tell our grandfather that people say this all the time, and that what we had said was correct.
Figure 9. The frequency of the phrase “very fun” over time, as a curve. Notice that around 1980, the phrase’s frequency increases rapidly. Data courtesy of Google Books Ngrams and the Cultural Observatory.
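For anyone curious about the mechanics behind a plot like this, the recipe is simple: count how often the bigram “very fun” appears in each year of the corpus and divide by that year’s total word count. The short Python sketch below is a minimal illustration under assumed file names and column layouts; it is not the code used to produce Figure 9.

```python
# A minimal sketch (not the Cultural Observatory's pipeline) of how a curve like
# Figure 9 could be reproduced from the public Google Books Ngram count files.
# The file names and exact column layouts below are assumptions for illustration.
import csv
import matplotlib.pyplot as plt

def load_rows(path):
    """Read a tab-separated file into a list of rows (lists of strings)."""
    with open(path, newline="") as f:
        return list(csv.reader(f, delimiter="\t"))

# Bigram rows assumed to look like: ngram <tab> year <tab> match_count <tab> ...
bigram_counts = {}
for ngram, year, match_count, *rest in load_rows("very_fun_bigram.tsv"):  # hypothetical file
    if ngram == "very fun":
        bigram_counts[int(year)] = bigram_counts.get(int(year), 0) + int(match_count)

# Yearly totals assumed to look like: year <tab> match_count <tab> ...
total_counts = {}
for year, match_count, *rest in load_rows("total_counts.tsv"):  # hypothetical file
    total_counts[int(year)] = int(match_count)

# Normalize: relative frequency = bigram count / total tokens in that year.
years = sorted(y for y in bigram_counts if total_counts.get(y, 0) > 0)
frequency = [bigram_counts[y] / total_counts[y] for y in years]

plt.plot(years, frequency)
plt.xlabel("Year")
plt.ylabel('Relative frequency of "very fun"')
plt.show()
```

The normalization step matters: raw counts rise simply because more books are published each year, so only the relative frequency reveals the shift in usage.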
In fact, we were simply part of a shift in usage that was happening around us. While considered improper English for nearly two hundred years, this phrase became acceptable around the early 1980s.
This is an example of shifting baseline syndrome and can perhaps give us a hint of what it would have been like to live during the Great Vowel Shift. My grandfather had not recognized the slow shift in language around him until confronted with generational knowledge, or, in this case, a two-generation jump in the linguistic facts around him.
This sort of shift in one’s own mental linguistic rule set has actually been quantified in an attempt to understand it. When, and in what situation, we learn a language often affects how we process it for our entire lives, and sometimes even affects how we view the facts around us, such as in the case of my grandfather.
Linguists have looked at various aspects of a regional accent based on age.25 There are numerous examples of differences in the numbers of changes being present as a population is examined by age: The older the people examined, the less likely they are to have a certain linguistic innovation, whatever it may be. Of course, people also change during their own lives; despite numerous instances of people adhering to what they learned during childhood, there are also many instances when they alter their speech patterns both consciously and unconsciously over the course of their lives. The comedian and actor Stephen Colbert, for example, made a concerted effort to lose the Southern accent of his South Carolina childhood. But there are many times when linguistic change at the level of the community can be seen by looking at speakers of different ages.
Such linguistic change is quite widespread. For example, something like the Great Vowel Shift occurred among French Canadians during the twentieth century. In Quebec, over the course of several decades, they began to change their pronunciations of the vowels in certain words. And just as before, those who were younger were more likely to exhibit this shift.
But, intriguingly, it didn’t happen evenly across all words; there was a certain situational aspect to the shift.26 Words associated with the good old days, such as those having to do with parents, World War I, and even iceboxes, did not change alongside other words. There is speculation that younger speakers heard these words more often from their elders (who did not change their vowels), and were thus more likely to maintain the original pronunciations. One’s linguistic facts are affected in a very real way by who we hear them from.
This is similar to my relationship with certain terms from cellular biology. When I took a cell biology course in college, I learned the topics from a British professor. Since I haven’t gone on to study cell biology further, I am certain that many of the terms I use for cellular organelles are frozen in a British-style pronunciation. This was made clear when I was speaking with my father about programmed cell death, a process known as apoptosis. I said it as “a-puh-TOE-sis,” only to be informed that many Americans actually say “a-pop-TOE-sis.” While the American pronunciation sounded far sillier to me, apparently I was the one who sounded silly.