
What Intelligence Tests Miss


by Keith E Stanovich


  An important domain in which the inversion of conditional probabilities happens quite often is medical diagnosis. It has been found that both patients and medical practitioners can sometimes invert probabilities, thinking, mistakenly, that the probability of disease, given a particular symptom, is the same as the probability of the symptom, given the disease (as a patient, you are concerned with the former).
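The inversion can be made concrete with Bayes' rule. The sketch below uses purely illustrative numbers, not figures from the text: a disease with a 1 percent base rate and a symptom present in 90 percent of disease cases but also in 10 percent of healthy people.

```python
# Illustrative (hypothetical) numbers: P(symptom | disease) is high,
# yet P(disease | symptom) -- what the patient cares about -- is low.
p_disease = 0.01            # base rate: P(disease)
p_symptom_given_d = 0.90    # P(symptom | disease)
p_symptom_given_not_d = 0.10  # P(symptom | no disease)

# Total probability of showing the symptom
p_symptom = (p_symptom_given_d * p_disease
             + p_symptom_given_not_d * (1 - p_disease))

# Bayes' rule: P(disease | symptom)
p_disease_given_symptom = p_symptom_given_d * p_disease / p_symptom

print(round(p_disease_given_symptom, 3))  # 0.083 -- nowhere near 0.90
```

With these numbers, the two conditional probabilities differ by an order of magnitude, which is exactly the confusion described above.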

  Strategic Mindware

  Much of the mindware discussed so far represents declarative knowledge. However, not all mindware is declarative knowledge. Some of it would be classified as procedural knowledge by cognitive scientists—that is, as strategies and dispositions to process information in a certain way. For example, many of the principles of probabilistic reasoning I have discussed so far would be classified as declarative knowledge, whereas the tendency toward disjunctive thinking would represent strategic mindware.

  Varying tendencies for regulating processing, for information pickup, and for belief calibration are dispositions of the reflective mind that are sometimes, but not always, measured by questionnaires.14 For example, the thinking disposition need for cognition affects our tendency to engage the reflective mind in problem solving. It is measured by questionnaire items that ask a person to agree or disagree with statements such as: “The notion of thinking abstractly is appealing to me,” and “I would prefer a task that is intellectual, difficult, and important to one that is somewhat important but does not require much thought.” My research group has studied a thinking disposition termed belief identification: whether belief change in order to get closer to the truth is an important goal or whether retaining current beliefs is a more important goal. It is measured by questionnaire items that ask a person to agree or disagree with statements such as: “Beliefs should always be revised in response to new information or evidence,” and “It is important to persevere in your beliefs even when evidence is brought to bear against them.”

  Some thinking dispositions are measured by performance-based tasks. For example, the reflectivity/impulsivity disposition is assessed by performance on the Matching Familiar Figures Test (MFFT). In the MFFT, participants are presented with a target picture of an object, and their task is to find the correct match from an array of six other pictures that are quite similar. Participants’ time to respond and number of errors are measured. Reflective people have long response times and few errors, whereas impulsive people have short response times and numerous errors.

  Other thinking dispositions of the reflective mind that can be assessed by either questionnaire or performance-based measures are: typical intellectual engagement, need for closure, belief perseverance, confirmation bias, overconfidence, openness to experience, faith in intuition, counterfactual thinking, categorical thinking, superstitious thinking, and dogmatism. The commonality among these types of mindware is that they are closer to strategies, tendencies, procedures, and dispositions than to declarative knowledge structures.

  Dysrationalia Due to a Mindware Gap

  Irrational behavior due to a mindware gap occurs when the right mindware (cognitive rules, strategies, and belief systems) is not available to use in reasoning and decision making. However, in order to constitute a case of dysrationalia, the mindware gap must occur in an individual of substantial intelligence. How likely is this? Mindware gaps most often arise because of lack of education or experience. Thus, it is not surprising that there is a positive correlation between the acquisition of some of the mindware discussed in this chapter and intelligence.15 But the correlation is far from perfect. Many individuals of high intelligence lack critical mindware, and many individuals of low intelligence use mindware to make rational responses. For example, if we look at the subset of subjects in a university sample who are all above the median SAT for their institution, we find that less than half of them can use the base rate correctly in situations such as the XYZ virus problem discussed in this chapter.

  So while there are modest correlations between rational thinking mindware and intelligence, there is still plenty of room for the dissociation that defines dysrationalia to occur. Although it is true that highly intelligent individuals learn more things than the less intelligent, there are other factors involved.16 The explicit teaching of some of the mindware discussed in this chapter is very spotty and inconsistent. That such principles are taught very inconsistently means that some intelligent people fail to learn such important aspects of critical thinking. The studies showing that people often fail to think of alternative possibilities for events, ignore P(D/∼H), commit the conjunction fallacy, fail to use base rates, and invert conditional probabilities often employ as subjects university students—most of whom are presumably of high intelligence. This must also have been the case in the example that I gave at the outset of the chapter regarding the pediatrician who wrongly testified about the probabilities involved in sudden infant death syndrome (although he probably had thinking problems that combined a mindware gap with tendencies toward overconfidence).

  Training on such mindware remains rare even later in life. As legal scholar Jeffrey Rachlinski argues, “In most professions, people are trained in the jargon and skill necessary to understand the profession, but are not necessarily given training specifically in making the kind of decisions that members of the profession have to make. Thus, even though some psychologists have argued that certain types of reasoning can be taught quickly and easily, such training is extremely rare. Generalized training that allows people to avoid a wide range of cognitive errors also seems unavailable” (2006, p. 220). In summary, although we might expect mindware gaps to occur somewhat less frequently in individuals of high intelligence, the powerful mindware that prevents irrational thought and action is often inadequately learned by many people regardless of their cognitive ability.

  The mindware of rational thinking—strategies for dealing with probabilities, for thinking about causes, for thinking about what conclusions follow from arguments—currently goes unassessed on intelligence tests. If these strategies were assessed, the tests would identify some individuals as more intelligent than do current tests and some as less so. Intelligence tests would be measuring rational functioning, and rationality would be part of MAMBIT (the mental abilities measured by intelligence tests). But as the tests are currently constructed, they do not—and because they do not, we have dysrationalia due to a mindware gap.

  ELEVEN

  Contaminated Mindware

  Civilizations have never gotten along healthily, and cannot get along healthily, without large quantities of reliable factual information. They also cannot flourish if they are beset with troublesome infections of mistaken beliefs.

  —Harry Frankfurt, On Truth, 2006

  We as human beings are also irrational animals, unique among animals in our capacity to place faith in bizarre fictions of our own construction.

  —Robert Fogelin, Walking the Tightrope of Reason, 2003

  The country of Albania was a communist dictatorship for many decades. It was also one of the poorest countries in Europe, but by 1991–1992 it had started to turn itself around, granting more personal and economic freedoms. Economic strides were made from 1992 to 1997. The International Monetary Fund lauded the country’s progress during this period as markets opened, GDP increased, inflation eased, the budget moved closer to balancing, and foreign investment went up. This economic and social improvement came to an end in early 1997 when the economy collapsed, lawlessness broke out, army depots were plundered by irregular armed bands, and the government lost control of a large part of the country. In 1997, Albania collapsed—basically, because of mass dysrationalia.

  Albanian society imploded in 1997 because by that time over one-half of its population had become involved in Ponzi schemes, and in the early months of that year the schemes—as they always do—collapsed.1 Ponzi schemes offer extremely large rates of return to initial investors. In a Ponzi scheme, no assets are actually owned by those running the scheme (thus it is insolvent from its first day of operation), but that does not mean that the early investors are not paid their promised return. Early investors in fact are paid off with the money put into the scheme by later investors. The high returns paid to the early investors spawn (usually by word of mouth) a rush of new investors who in turn cause more excitement, and the scheme runs on this self-reinforcing basis for a time. Of course, mathematics eventually catches up with the scheme, and at some point the pyramid collapses—usually after the originators have managed to skim off a considerable amount of money from gullible investors.

  Usually prospective investors are given a complicated explanation for the high returns. Some of the scam artists operating in Albania explained to their investors that the high rates of return were generated by foreign currency speculation; others claimed complex mining schemes were behind the profits; and one even proclaimed that the returns were generated from investment in California hotels. In Ponzi schemes, it is often the case that the more complex and exotic the purported scheme for generating the profits, the more enticing the scheme seems to potential investors.

  Ponzi schemes operate all over the world, but it was the sheer magnitude of the Albanian schemes that was noteworthy. They had become popular by offering interest rates of 30 percent monthly on money invested—when real banks and real companies offered investment opportunities of only a tiny fraction of that return. Once the early schemes became popular, they spawned many newer competitors. However, in order to entice investors, the newer schemes had to offer even better rates. At the end of 1996, many of the Ponzis (which of course traveled under the names of legitimate-sounding companies) were offering interest rates of 50–60 percent monthly, and one was actually offering 100 percent. Of course, the higher the rate of return, the quicker the scheme collapses because it eventually becomes impossible to recruit enough new money to pay off the obligations owed to earlier investors.
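The relationship between the promised rate and the speed of collapse can be sketched numerically. The simulation below is a simplified model with arbitrary units (not data from the Albanian schemes): it assumes each month's promised returns must be covered entirely by new money, and that there is some fixed ceiling on how much new money can be recruited per month.

```python
# Hypothetical model: higher promised monthly returns exhaust the supply
# of new investors sooner, so the scheme collapses faster.
def months_until_broke(monthly_rate, initial_pool=1.0, max_new_money=10.0):
    """Months until the new money required to pay promised returns
    exceeds what can plausibly be recruited (arbitrary units)."""
    owed = initial_pool  # total principal owed to investors
    month = 0
    while owed * monthly_rate <= max_new_money:
        # New investors cover this month's promised returns;
        # their principal is then added to what the scheme owes.
        new_money = owed * monthly_rate
        owed += new_money
        month += 1
    return month

for rate in (0.30, 0.60, 1.00):  # the rates mentioned in the text
    print(rate, months_until_broke(rate))
```

Whatever the exact parameters, the ordering is robust: the scheme offering 100 percent monthly runs out of room in a fraction of the time the 30 percent scheme does.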

  By 1997, fully one half of Albania’s adult population was participating in such schemes! People took out mortgages on their homes in order to participate. Others sold their homes. Many put their entire life savings into the schemes. At their height, an amount equal to 50 percent of the country’s GDP was invested in Ponzi schemes. Before the schemes collapsed, they actually began to compete with wage income and distort the economy. For example, one business owner saw his workforce quickly slip from 130 employees to 70 because people began to think they could invest in the Ponzi schemes instead of actually working for their income.

  Ponzi schemes are related to pyramid schemes in that the latter often operate by giving recruits to the system (who pay a fee to join) a commission for bringing in new recruits—who then in turn try to recruit new members on the same logic. The combinatorial explosion ensures that the schemes will exhaust themselves after just a few iterations, leaving approximately 80 percent of the recruits (the latest ones) at a loss. In pyramid schemes there is often a nominal product being sold, but the focus is always on the new recruits, not the product supposedly being sold. A Ponzi scheme instead involves no recruiting at all for commissions. There is no product. It is a simple case of paying returns to early investors from the investments of the newer investors. At some point, the promised returns cannot be delivered to everyone who is owed them, and those running the scheme usually try to abscond with the remaining money.

  How could people have thought that such a system could continue once everyone was participating in this manner? Likewise, how can people ignore the mathematical implications of a pyramid scheme in which each member recruits 15 new members, who each recruit 15 more, and so on? (After only seven levels of such recruiting, over half the population of the United States would be involved!)
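The parenthetical claim is easy to verify: with a branching factor of 15, membership grows geometrically by level, and the cumulative total after seven levels of recruiting already exceeds half the U.S. population (roughly 300 million).

```python
# Cumulative recruits in a pyramid scheme where each member recruits 15 more.
branching = 15
total = 0
level_size = 1  # the original promoter
for level in range(1, 8):    # seven levels of recruiting
    level_size *= branching  # 15, 225, 3375, ...
    total += level_size

print(total)  # 183,063,615 -- more than half of ~300 million people
```

The same arithmetic shows why only a few iterations are ever possible, and why the bulk of the losses always fall on the most recent recruits.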

  People ignore the mathematics because they have become prisoners of contaminated mindware. The underlying logic behind Ponzi and pyramid schemes is essentially the same: the people hosting such contaminated mindware come to believe that the laws of economics—laws that they see all around them and that they have experienced throughout their lives—can be defied. They come to believe that there is a way to obtain returns on investments that are orders of magnitude greater than those in traditional financial instruments and that such a scheme involves no risk. The scheme is usually justified with a clever explanation, but however clever the justification, belief in Ponzi and pyramid schemes is bad mindware. It leads people to take actions that they will come to regret.

  Thousands of Albanians lost their entire savings and their homes when the schemes collapsed. The country descended into chaos as riots broke out. The government could not guarantee the investments of the population because at the time of the collapse the five largest companies operating the Ponzis had $49 million in assets to cover $475 million worth of liabilities—and the latter figure was twice the value of the country’s GDP. As is usual in such fraud, much of the actual money had disappeared into foreign banks, and many of those who perpetrated the fraud had fled or were being held in jail but claiming bankruptcy along with the other investors.

  Because such a large segment of the population was participating in these schemes, we can be certain that many of the individuals caught up in this economic hysteria were highly intelligent people and thus were exhibiting dysrationalia.2 They had acquired irrational economic beliefs—they were dysrationalic due to contaminated mindware. Certainly mindware gaps are involved too, but since I discussed the problems that such gaps lead to in the last chapter, here I would like to focus our attention on situations where mindware is acquired but where that mindware is maladaptive.

  Contaminated mindware can often spread and sweep through a specific population like an epidemic. In the late 1980s, encouraged by therapists who themselves were in the grip of some complicated mindware, many patients in psychotherapy began to remember being sexually abused, usually by family members, as small children. The psychotherapists who encouraged these reports had theories about why these memories had been forgotten and then remembered subsequent to therapy. The favored explanation was one of dissociation in childhood, and this led to an epidemic of diagnoses of multiple personality disorder. As Elaine Showalter explains:

  Therapists maintained that children dealt with the pain, fear, and shock of sexual abuse through splitting or dissociation. The memory of abuse was always there but contained in another personality or many personality fragments—“alters” who sprang into being to contend with the trauma. Therapists could contact these alters through hypnosis, using the Inner Self-Helper, an alter who mediates between the various fragments. Then they could reach a child alter, who might testify to sexual abuse as well as to other suppressed or repressed aspects of the host personality. (1997, p. 159)

  Professional associations spread these ideas in the absence of any research evidence that this theory was correct. And these interrelated sets of theories linking recovered memory with multiple personality disorder replicated quite quickly throughout various therapeutic communities. Prior to 1970 there had been fewer than a dozen cases of multiple personality disorder reported in the United States in the previous fifty years. The disorder did not even become an official diagnosis of the American Psychiatric Association until 1980. Yet by the 1990s thousands of such cases had been identified.3

  As this so-called recovered memory phenomenon gained steam, the claims made by the patients in therapy sessions became more and more bizarre. Some patients began to report that they were not only sexually abused as children but that they had been victims of satanic ritual abuse. Showalter describes the case of a woman, SRB, in her forties, who had a degree in biochemistry from Yale. Subsequent to therapy sessions, she began to believe that her parents had belonged to a ring of child pornographers who used children in rituals with satanic overtones. She recalled having been sold into prostitution and being tortured with electricity and drugs. She also thought that she had become pregnant as a seventh grader and had been compelled to have an abortion.

  The literature contains dozens of such examples, many of them more lurid than this one, and virtually all share a problematic aspect of SRB’s case—there is no independent evidence that any of the events occurred. The patients involved had had no memories of this abuse prior to entering therapy. This was true in SRB’s case. She had endured many years of unsuccessful therapies for various phobias, but prior to 1986 had reported no memories of any sexual abuse. In 1986 she attended a workshop for survivors of child abuse and in therapy began to present three different personalities. It was there that her stories of sexual abuse and satanic rituals began to emerge. No one questioned the accuracy of SRB’s stories—even though there was no independent evidence to corroborate them. This was because the belief systems of the therapists had been shaped so as not to ask for independent evidence (“if the patient thinks she was abused then she was”). The mindware represented by this belief system requires only that the patient and therapist believe in the coherence of the narrative. But these narratives were not so innocent. People were convicted of abuse charges because of them.

  Both the patients and the therapists in the recovered memory epidemic of the 1980s and 1990s were the victims of contaminated mindware—mindware that leads to maladaptive actions but that resists evaluation. This and the Ponzi scheme example illustrate that not all mindware is helpful. When discussing mindware gaps, one can easily get the impression that more mindware is always better. The examples of Ponzi schemes and the recovered memory epidemic illustrate that people can acquire mindware that not only fails to prevent irrational action, but is actually the cause of the irrational action.

  “If That Man Had Two Brains He’d Be Twice as Stupid”

  The title of this section is the punch line to an Irish joke told to me by Desmond Ryan. I cannot remember the rest of the joke, but this funny line summarizes the implication of the existence of dysrationalia—that having more brainpower in the form of intelligence is no guarantee against foolish behavior. This is especially true of irrational action caused by contaminated mindware.

 
