unSpun
REPUTATION OF THE AUTHORITY
When a medical study appears in The New England Journal of Medicine, we know it has gone through a systematic screening process called peer review, in which other knowledgeable scientists are asked to comment or point out possible flaws. The original author may then respond with clarifications or additional data. By contrast, we should always be skeptical of “scientific breakthroughs” that are announced at a news conference without any independent review by other experts. For example, when a news conference in 2002 proclaimed the birth of the first cloned human being (supposedly named “Eve”), it created a brief sensation. But good reporters were quick to point out that the man behind the announcement, a French former journalist named Claude Vorilhon, had renamed himself Rael, claimed to be a direct descendant of extraterrestrials who created human life on earth, and founded a cult. Neither “Eve” nor the mother of the supposedly cloned baby ever appeared publicly. Reasonable people gave the unsupported announcement zero weight and quickly dismissed it as a silly fraud.
TRANSPARENCY
Look for transparency whenever a claim is made. Is the publisher of a poll telling you the statistical margin of error and exactly how the poll takers asked the question? If not, don’t give much weight to the result. Political candidates who are challenging entrenched incumbents like to release polls showing that they are “closing the gap” or even have a lead, in order to convince potential donors they can win. But such polls can be tailored to produce a positive result by including loaded questions. The challenger might ask, “Did you know the incumbent is a wife-beater?” These so-called push questions nudge the respondent toward the desired answer, and a poll containing them is called a push poll. Questions can also be worded in ways that bias the result. One survey conducted by the Annenberg Public Policy Center found a dramatic difference in support for school vouchers depending on whether such phrases as “taxpayers’ money” or “private schools” were included in the question. And polls asking about support for public financing of political campaigns come out one way if the poll taker describes it as “banning special-interest contributions from elections” and quite another if the question is framed as “giving tax money to politicians.”
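For readers who want to see what that statistical margin of error actually measures, here is a minimal sketch in Python, using made-up numbers rather than any real poll, of how the familiar plus-or-minus figure is computed for a simple yes-or-no question asked of a random sample:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95 percent margin of error for a simple random sample.

    p: proportion giving a particular answer (0.52 means 52 percent)
    n: number of respondents
    z: z-score for the confidence level (1.96 corresponds to 95 percent)
    """
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll: 52 percent support among 800 respondents.
moe = margin_of_error(0.52, 800)
print(f"52% plus or minus {moe * 100:.1f} points")  # about 3.5 points
```

Notice that this number reflects only random sampling error. It says nothing about how the question was worded, which is exactly why a push question can produce a misleading result that still arrives with a respectable-looking margin of error attached.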
When reading a news story or article, ask whether the reporter or author is telling you where the information came from. We supply footnotes at FactCheck.org, with links to the sources we are using if they are available free on the Internet, so that readers may find more information or check that we’re getting it right. When you see somebody cite “a study” to back up a claim, ask how it was conducted, how many people participated and under what conditions, and whether it really supports what’s being said.
PRECISION
Sometimes evidence isn’t nearly as precise as portrayed. A good example is a pair of studies that produced shocking headlines about deaths in Iraq, studies that have since been widely questioned and disparaged. Both studies were published in the British medical journal The Lancet, and both were produced by a team from Johns Hopkins University in Baltimore. The first was released five days before the 2004 presidential election, and estimated that 98,000 Iraqis had died as a result of the invasion ordered by President George W. Bush in March 2003. The second was released less than a month before the 2006 midterm House and Senate elections, and estimated that the Iraqi death toll had reached 654,965 from the invasion and the violent aftermath. Both were several times higher than other generally accepted estimates.
However, neither estimate was an exact count, just the midpoint of an exceptionally broad range of possibilities. For the first estimate, the authors calculated that their “confidence interval” was somewhere between 8,000 deaths and 194,000 deaths. In the language of statistics, that means a 95 percent probability that the actual figure fell somewhere within that huge range. Put another way, there was 1 chance in 40 that the actual number was less than 8,000 and an equal chance that it was greater than 194,000. As the critic Fred Kaplan put it in an article for the online magazine Slate, “This isn’t an estimate. It’s a dart board.” For the second estimate the dart board was larger, between 393,000 and 943,000 deaths. Such wide ranges of uncertainty are much larger than the plus or minus 2 or 3 percent we are used to seeing in U.S. public opinion polls, and should tell us to beware.
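To see how a single headline number can sit atop such a wide range, here is a minimal back-of-the-envelope sketch. It treats the published range of 8,000 to 194,000 as a symmetric 95 percent confidence interval under a normal bell-curve approximation, which is a simplification, not the study’s actual cluster-survey model:

```python
# Rough look at the first Lancet estimate, treating the published range
# of 8,000 to 194,000 as a symmetric 95 percent confidence interval
# under a normal (bell-curve) approximation.
low, high = 8_000, 194_000
z95 = 1.96  # z-score bounding the middle 95 percent of a normal curve

midpoint = (low + high) / 2         # 101,000, close to the 98,000 the study reported
std_err = (high - low) / (2 * z95)  # implied standard error, roughly 47,000 deaths

print(f"midpoint: {midpoint:,.0f}, implied standard error: {std_err:,.0f}")

# The 5 percent left outside the interval splits evenly between the tails:
# 1 chance in 40 that the true toll is below 8,000, and 1 chance in 40
# that it is above 194,000.
tail_odds = round(1 / ((1 - 0.95) / 2))
print(f"each tail: 1 chance in {tail_odds}")
```

The point is not the exact arithmetic but the scale: an implied standard error of roughly 47,000 deaths is what turns a precise-sounding headline figure into Kaplan’s dart board.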
The exceptionally imprecise estimates of the Lancet studies stem from the relatively small sample used to produce them. The estimates came from interviews in 33 clusters for the first study, 47 for the second. Using such randomly chosen “clusters” is a statistical method commonly used when it isn’t practical to draw a random sample of individuals from an entire population. But other experts criticized the Lancet authors for using too few. “I wouldn’t survey a junior high school, no less an entire country, using only 47 cluster points,” said Steven Moore, a Republican consultant who had conducted polling in Iraq for Coalition forces. One of the Lancet authors, Gilbert Burnham, replied that “surveying more clusters would have also meant more risk to the survey team.” He said, “Had we used 470 clusters, our range of plausible values would have been about 3 times narrower.” It is also possible that the results would have been far different.
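Burnham’s “about 3 times narrower” figure follows from a basic property of sampling: other things being equal, the width of a confidence interval shrinks in proportion to the square root of the number of sampling units. Here is a minimal sketch of that arithmetic, treating the cluster count as if it alone determined precision, a simplification of how real cluster surveys behave:

```python
import math

def relative_width(n_clusters, baseline=47):
    """Width of a confidence interval relative to a 47-cluster survey,
    assuming precision improves with the square root of the cluster count."""
    return math.sqrt(baseline / n_clusters)

for n in (33, 47, 470):
    print(f"{n:>4} clusters -> interval about {relative_width(n):.2f} times the 47-cluster width")

# 470 clusters gives about 0.32, i.e. roughly 3 times narrower, which is
# the back-of-the-envelope improvement Burnham described.
```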
Indeed, a survey of Iraq by the United Nations Development Programme used 2,200 cluster points, compared with only 33 in the first Lancet study, which was conducted four months later. And that UN survey, the Iraq Living Conditions Survey 2004, estimated only 24,000 war-related deaths, roughly one quarter as many as The Lancet estimated at the time.
CONVERGENCE
In Chapter 6 we mentioned the notion of convergent evidence, and said that when different methods arrive at similar estimates, those estimates are more credible. The reverse is also true: when results diverge, we should be more cautious. To be sure, the Lancet studies seem to support each other, but both produced results that are far higher than those of others. The Iraq Body Count project, for example, tabulated in November 2006 that between 47,016 and 52,142 deaths had been reported in Iraqi and international news media as a result of the 2003 invasion and the continuing violence. That’s just 7 to 8 percent of The Lancet’s 654,965 figure published the previous month. It’s true that the IBC estimates almost certainly missed some deaths that weren’t reported, but we judge it unlikely that they could miss so many.
The 2004 Lancet study was inconsistent both with the Iraq Body Count tabulations and with the United Nations survey. Shortly after The Lancet had estimated 98,000 war deaths, the Iraq Body Count put the count between 14,619 and 16,804 as of December 7, 2004. The United Nations survey estimated war deaths at between 18,000 and 29,000, with the midpoint of that range at 24,000.
After the second Lancet study, Iraq Body Count officials issued a “reality check,” disputing it and pointing out inconsistencies with other data. Delving into the details, they said that if the Lancet study was valid it would mean, among other improbabilities, that an average of 1,000 Iraqis had been killed by violence every single day in the first half of 2006, but that only one in ten of those killings had been noticed by any public surveillance mechanism. They said it would also mean that 800,000 Iraqis had suffered blast wounds or other serious conflict-related injuries over the preceding two years and that 90 percent of them went unnoticed by hospitals.
We can’t say the Lancet studies are wrong. Unlike Mitch Snyder’s “meaningless” estimate of 3 million homeless persons, which we discussed in Chapter 6, the Lancet estimates both were derived using scientifically accepted methods and were published in a reputable, peer-reviewed journal. The findings also are stoutly defended not only by the authors but by some independent experts as well. Nevertheless, given both the extraordinary imprecision of the figures and their failure to square with other observations, we can’t accept them as accurate until and unless validated by other researchers using a much larger sample.
FINAL RULE: Be Skeptical, but Not Cynical
THE SKEPTIC DEMANDS EVIDENCE, AND RIGHTLY SO. THE CYNIC ASSUMES that what he or she is being told is false. Throughout this book we’ve been urging you to be skeptical of factual claims, to demand and weigh the evidence and to keep your mind open. But too many people mistake cynicism for skepticism. Cynicism is a form of gullibility—the cynic rejects facts without evidence, just as the naïve person accepts facts without evidence. And deception born of cynicism can be just as costly or potentially as dangerous to health and well-being as any other form of deception.
To understand this notion, consider Kevin Trudeau, the author of a book that topped the New York Times best-seller list for a time in the summer of 2005: Natural Cures “They” Don’t Want You to Know About. Trudeau pumped up sales with a massive campaign of late-night infomercials in which he claimed that “there are in fact natural, non-drug, and non-surgical cures for virtually every disease.” His basic claim—an appeal to cynicism that he repeated over and over—was that “they” were conspiring to suppress information about known cures: “The drug companies don’t want you to know the truth, the Food and Drug Administration, the U.S. government does not want you to know the truth. Why? Because it would cost them too much money in profits if you knew the inexpensive, natural remedies.”
Trudeau’s pitch is that “they” are lying but he will tell you the truth. Just buy his $14.95 book ($39.95 on audio CDs), or subscribe to his $71.40-a-year newsletter, or become a $999 “lifetime member.” And, oh, yes, buy his $19.95 weight-loss CD, whose title is, he claims, “censored by the Federal Trade Commission.” Did you know the FTC is in on the conspiracy, too? As is the food industry, which, Trudeau claims, is putting unspecified ingredients in “diet” products that actually make people fat. No wonder we can’t lose weight! The “censored” title: How to Lose 30 Pounds in 30 Days, just the sort of extravagant and unsupported claim the FTC often cites as misleading advertising.
We call Trudeau’s pitch an appeal to cynicism because he is trading on the public’s belief that the federal government can’t be trusted, and that big corporations—especially pharmaceutical companies—pursue profit so blindly that they are capable of almost any villainy. We might agree that some of the practices of drug companies justify criticism, but Trudeau is hoping you will automatically—cynically—accept his claim that big companies and the government are secretly conspiring to make you fat. This tactic has made untold millions of dollars for him, unless he’s conning the public about that, too. His company once claimed to have sold 4 million copies of his book alone. His marketing plan must still be working, because in May 2006 he came out with a sequel: More Natural Cures Revealed: Previously Censored Brand Name Products That Cure Disease.
But here’s why you should be skeptical—of Trudeau. Just do a little research using any Internet search engine and you will quickly discover a few facts about this master salesman:
• Trudeau has a criminal past. He served nearly two years in federal prison after a 1991 guilty plea to credit card fraud in which he bilked American Express of $122,735.68. In 1990, he served twenty-one days in jail and got a three-year suspended sentence on a Massachusetts state conviction for larceny after depositing $80,000 of worthless checks. At the time, he was posing as a doctor.
• Trudeau has been repeatedly cited for false advertising. In 1998, he agreed to pay $500,000 to settle FTC charges that he appeared in a string of infomercials that claimed, among other things, that his “Mega Memory System” could enable anyone to achieve a photographic memory. In 2003, the FTC and FDA charged him with falsely claiming in infomercials that a dietary supplement called Coral Calcium Supreme could cure cancer. He agreed to stop making such claims but continued anyway, leading a federal judge in Chicago to find him in contempt of court. Later, in 2004, Trudeau agreed to pay $2 million to the FTC and to cease making infomercials selling any product at all, except for “informational” material, which is protected by the First Amendment. That was when he switched from selling pills to selling books, CDs, and newsletters.
The $2.5 million that Trudeau has paid to settle earlier false-advertising cases is probably chump change compared to what he’s taking in from a public made gullible by its own cynicism. It is easy to see why Trudeau’s appeal works so well. Politicians love to blame “corporate greed” whenever prices go up, and Hollywood loves to cast corporate executives as villains in movies and TV crime shows. Since the Watergate scandals and the Vietnam War, a large majority of Americans who once trusted government to do the right thing now say they believe it is controlled by special interests and not run for the common benefit. Public trust of drug companies is particularly low. But that shouldn’t be a reason to fall for the unsupported claims of a convicted felon and incorrigible huckster who’s making millions selling books about bogus “natural cures.” And anyone who tries using those “cures” instead of seeking competent medical advice is putting his or her health and even life at risk.
So we say cynicism can kill you. But you can save money, and maybe your life, if you are skeptical about claims like those made by Trudeau and the many others like him. Always look for real evidence.
Conclusion
Staying unSpun
STAYING UNSPUN REALLY BOILS DOWN TO FOLLOWING A FEW PRINCIPLES that we’ve been talking about throughout this book. When confronted with a claim, keep an open mind, ask questions, cross-check, look for the best information, and then weigh the evidence.
CASE STUDY: Hoodia Hoodoo
TO SHOW HOW TO PUT THESE SIMPLE BUT POWERFUL MENTAL HABITS into practice, let’s walk through a quick fact-checking of a real-life claim you may already have encountered. Let’s say you have seen on the CBS News website a snippet of a 60 Minutes program in which correspondent Lesley Stahl is telling you about the next big thing in dieting: a rare South African cactus called Gordon’s Hoodia (or Hoodia gordonii). Stahl is in the Kalahari Desert, where she says the native San tribespeople eat Hoodia to suppress their appetite on hunting trips. “Scientists say it fools the brain by making you think you’re full, even if you’ve just eaten a morsel.” A weight-loss pill may soon be on the market. The Web version of the story carries the headline “African Plant May Help Fight Fat: Lesley Stahl Reports on Newest Weapon in War on Obesity.”
[Photo caption: Hoodia gordonii, the rare South African cactus purported to suppress appetite naturally]
This is no late-night infomercial huckster talking; this is a tough reporter who once covered President Richard Nixon’s Watergate scandal. Stahl reports that after eating a piece of the plant she went all day without feeling hungry, and that she experienced no aftereffects either. “I’d have to say it did work,” she says.
Wow! Where can I get this stuff? You quickly search the Internet for “hoodia,” and find a cyber-bazaar of merchants hawking “Pure Hoodia,” “Pure Hoodia Plus,” “Hoodia Supreme,” “Desert Burn” Hoodia, and any number of other brands. You also see websites offering advice on finding the “best” Hoodia products among all the clamoring competitors. Several provide a link to a video clip from the 60 Minutes program on the CBS News website. They feature testimonials—for example, one from “Sarah” of Los Angeles, who says, “I used to always have cravings at night, but those cravings went away.” This is sounding better and better.
But before you send off $149.95 for a five-month supply of this magical substance, take a few minutes to ask questions. How do I know this works, and is it safe? Just because Lesley Stahl swears by the freshly cut cactus she nibbled in the Kalahari Desert doesn’t mean the capsules you buy from an Internet merchant will have the same effect, or even come from the same plant. And we had better dismiss those Internet testimonials: they’re anecdotes at best, and they could be fabricated for all we know. Where are the scientific test results?
A bit of cross-checking turns up more information. Our Internet search has also brought up a 2003 story from a BBC reporter, Tom Mangold, who sampled the “Kalahari diet” even before Stahl. After eating a piece of cactus about half the size of a banana, Mangold reported that he and his cameraman “did not even think about food” for the four-hour drive back to Cape Town. “Dinnertime came and went. We reached our hotel at about midnight and went to bed without food. And the next day, neither of us wanted nor ate breakfast.” But read on.
The BBC story also warns us that the stuff we’ve been seeing advertised may be just another weight-loss scam. The rights to develop a diet drug from Hoodia are owned by a British company named Phytopharm, and clinical trials still have several years to run. The reporter adds: “And beware Internet sites offering Hoodia ‘pills’ from the U.S., as we tested the leading brand and discovered it has no discernible Hoodia in it.” Oddly, several Hoodia hucksters actually post a link to this BBC story on their websites, probably figuring that few will actually read it and most will just assume it’s an endorsement.
To be fair to CBS, Stahl’s full report also warned against the claims of Internet marketers of Hoodia products. It mentioned that the wild cactus is so rare that Phytopharm has established a plantation in an attempt to grow it in the huge quantities that would be required to meet demand should tests prove that the product is safe and effective. But the Internet Hoodia merchants who link to the CBS report probably figure you won’t notice that part. They just post a link to the story, with introductions such as “Leslie [sic] Stahl…Hoodia works!!”