A Short History of Stupid


by Bernard Keane and Helen Razer


  In that sense, the sheer ineptitude of so many conspiracies, by any credible measure, is irrelevant. Fluoridation has been a spectacular failure as a mass-poisoning scheme. False flag gun massacres intended to provide the basis for greater gun regulation in the United States have likewise failed. The death of JFK ushered into the White House the most aggressively liberal president on domestic issues in US history. If the Jews or the Rothschilds or the Masons or an international drug cartel headed by the Queen of England controlled the world’s financial system, they did a terrible job in 2008 with the destruction of trillions in wealth in the global financial crisis. But none of that matters. What matters is that someone, somewhere is in control—in secret, and with evil designs (if plainly incompetent), but at least there’s someone in charge. This is a more appealing idea than that there might be something profoundly wrong with your society, or that US intelligence and law enforcement agencies are inept, or that a beloved figure could be killed in an entirely meaningless accident or act of violence. More appealing than the idea that things, even big, dramatic, epoch-making Things, do not, necessarily, happen for a reason.

  Conspiracy theories are also a readily obtainable marker of status, at least in the eyes of the theorists themselves. To believe in a conspiracy is to understand the way the world really works, to have privileged access to information that the rest of society doesn’t have or refuses to acknowledge, to be an insider, part of an information elite self-selected because of their intelligence and scepticism. This information snobbery enables the theorist to look down on the rest of us who are too dumb or too sheep-like to recognise reality, who’ve been gulled by the cover stories of the conspirators: ‘That’s what they want you to think.’

  This can lead to a certain exasperation if a conspiracy turns out—as some do—to be correct. ‘There’s nothing new there; everyone knew that’ is the annoyed reaction of many experts to the revelations about National Security Agency surveillance, seemingly angered that everyone else has now acquired what some previously used as a personal marker of distinction. Conspiracy theories are a club that becomes unappealing if too many people join, their members epistemological hipsters who knew 9/11 was a con, that Big Brother was watching us, that the moon landings were Stanley Kubrick’s finest work and that LBJ killed JFK before it was popular to believe it.

  A decision to believe in conspiracy theories, and what conspiracy theories you choose to believe, is thus similar to a consumer decision to buy one particular product or another based on its advertising and what the product says about you. In Australia, affluent inner-urban families are significantly less likely to vaccinate their children than families in lower-income areas, and a similar phenomenon has emerged among affluent, well-educated mothers in California and the UK. Deliberately refusing to vaccinate one’s children has thus become a marker of social status, a lifestyle accessory that marks one off from the broader herd, the immunity of which is so important. For older right-wing males, belief in a giant UN-controlled global-warmist conspiracy demonstrates their individualistic, capitalistic mindset. More traditional-minded conspiracy types might prefer the Old Faithfuls of the conspiracy world, JFK and fluoridation, while scholarly theorists embrace the Illuminati, the Knights Templar and plots involving the Catholic Church, ancient texts and historic figures—although, God help us all, Dan Brown might have come dangerously close to popularising those.

  But while conspiracy theories notionally give the theorist a greater, more authentic understanding of the world by explaining what is really going on, they’re often predicated on ignorance. At the heart of many conspiracy theories are howling errors of fact or absurd non-logic. The blood libel, for example, is said to have derived from apocryphal tales of the behaviour of Jewish families during the Crusades; in the face of forced conversion to Christianity, they preferred to die or kill themselves and their families. What would, in a Christian context, be interpreted as martyrdom was considered evidence that Jews were disposed to murder children. The Knights Templar were disbanded because the King of France saw it as a way of evading the huge debts he owed the order. Rather than being an incipient dove, JFK was every bit as signed up to escalating the war in Vietnam as his successor.* The claims of 9/11 truthers about air interception and controlled demolitions of the towers have been repeatedly debunked. The Protocols of the Elders of Zion is a fabrication. The lawyer for drug trafficker Schapelle Corby admitted simply inventing his claim that corrupt baggage handlers planted the cannabis she tried to import to Bali. Andrew Wakefield falsified data in the paper he used to claim a link between autism and vaccination and planned to make money from scaring people off vaccination; he’s been struck off the medical register in the UK and a US judge threw his libel suit out of court.

  But conspiracy theory factoids are the cockroaches of epistemology, capable of surviving even a nuclear blast of contrary evidence. Indeed, the act of disproving them often reinforces their validity in the eyes of adherents. After all, if you’re bothering to try to discredit their theories, they must be on to something, especially if you work in the media and are therefore in on all the conspiracies because you need to cover them up.

  Another core theme of most major modern conspiracy theories is a belief in the unity and competence of governments. Most of the challenges confronting governments around the world—keeping economies growing, delivering services effectively while struggling to convince voters of the desirability of paying tax, the looming impact of ageing populations—look a doddle compared to the superhuman feats of organisation required to poison the entire population without anyone knowing, tightly control thousands of climate scientists around the world or use swine flu to declare martial law as a prelude to a New World Order. Disillusioned voters the world over can only dream of governments with the organisational genius required to pull off the conspiracies with which they are so often charged.

  The sordid truth is that governments aren’t especially competent even in the areas they directly control, and nowhere near as unified as conspiracy theorists make out. If multiple US government agencies had conspired to kill JFK, they almost certainly would have had problems of interoperability and demarcation. Bureaucrats everywhere protect their turf. Some like to build empires. Some like to avoid responsibility of any kind. All look with scepticism at the activities of other agencies. Tribalism is still tribalism despite the PowerPoint presentations and bureaucratese. And information is harder to control than conspiracy theorists realise. There’s always someone who gets caught and spills, or a whistleblower, or an agency that won’t cooperate, or even if a secret is kept close, the passage of time tends to out it. Climate change denialists like to compare global warming to Lysenkoism, the absurd agricultural and genetic theories that became legally enforced orthodoxy in the Soviet Union in the 1930s and 1940s. But in fact Lysenkoism demonstrates how hard it is to enforce quackery even with a state apparatus dedicated to surveillance and thuggery—it took full-blown Stalinism to suppress opposition to Lysenkoism, and only in the Soviet Union itself; in other Eastern Bloc countries, it was criticised and rejected. After Stalin’s death, criticism of Lysenko re-emerged despite the Communist Party’s control mechanisms and within a decade he was denounced and his theories abandoned.

  But in the minds of conspiracy theorists, democratic governments operate with Stalinist brutality and remarkable efficiency, moving at top speed to execute their plans flawlessly, like in a Hollywood film: the black helicopters materialise at a moment’s notice, the men in suits all act as though part of a hive mind, decisions are made in a split second, plenty of resources are always available. But as anyone who has worked in government knows, bureaucratic reality is messy, and laborious, and frustrating: half the black helicopters are being refitted because of a poor procurement decision, there aren’t enough pilots on duty to fly the rest, and poor intelligence caused by agencies refusing to share information has sent them to the wrong location anyway.

  This can be seen in the best recent example of a conspiracy theory that turned out to be true: that a US government agency, in league with its counterparts in the UK, Canada, Australia and New Zealand, has a giant internet and phone surveillance and computer-hacking system that it uses to monitor everyone in the world with a phone or internet connection. This was an actual, real-world conspiracy of exactly the kind portrayed in movies, in which the US government, from the president down, implemented a vast, secret plot to spy on their own people and the rest of us. But the core problem was the vast nature of it—it was so huge that the plot extended beyond politicians and government officials and men and women in uniform to private contractors, cleared by a privatised former government security vetting agency. It only took one contractor among hundreds of thousands to decide that the illegal, secret mass surveillance being conducted by the NSA needed to be exposed for the conspiracy to fall apart. Even in the America of Barack Obama, in which whistleblowers are jailed as spies and journalists are regularly spied on, it proved impossible to keep secret a giant government plot to turn the planet into a panopticon.

  And as it turns out, secrecy is a highly inefficient way of conducting affairs. Julian Assange has written of a ‘secrecy tax’ that makes organisations trying to operate in secret less efficient, less internally communicative and less adaptable. The response of the US government to the Edward Snowden revelations has been a sublime demonstration of the secrecy tax in operation: despite knowing that more revelations were to come, and thus being able to prepare for them, or even to be proactive in pursuing a debate about surveillance that was going to happen anyway, the Obama Administration and its agencies remained, for months, painfully reactive and relied for an extended period on denial, evasion and casuistry that has left officials, Congressional figures and the president himself embarrassed.

  Another logical fallacy at the heart of conspiracy theories is the straightforward post hoc ergo propter hoc (‘after this, therefore because of this’). For many theorists, what comes after a major event, and who benefits from it, provide an insight into who caused that event. But merely because Western governments have exploited 9/11 to justify a significant reduction in their citizens’ basic rights and funnel money to defence contractors does not mean 9/11 was an inside job; merely because the Vietnam War rapidly accelerated after 1964 does not mean JFK’s death was a factor. The proper question isn’t ‘Who benefited?’ but ‘Who best exploited it?’

  The problem is, when governments behave as if they are engaged in conspiracies, they enable conspiracy theorists. The War on Terror has encouraged conspiracy theories because governments have given themselves more power, decreased accountability, engaged in extra-judicial killing, kidnapping, torture and unjustified imprisonment, and reduced transparency. A government that by its own admission abducts people and transports them to ‘black sites’ for torture, taps the entire internet or breaks into the systems of major internet companies even after those companies have given it access to their data, can easily be assumed to be doing far worse things we don’t yet know about; indeed, it may be sensible to assume exactly that. Even a government like Australia’s secretly approved its citizens being ferried about for torture while publicly denying any knowledge of them, and used its intelligence services to bug the Cabinet rooms of a vulnerable micro-state to benefit a resources company.

  This furtive behaviour of governments, their continual arrogation of power, their attacks on whistleblowers and their treatment of their own citizenry as (to use the National Security Agency’s own term) ‘adversaries’ to be constantly monitored seem designed to confirm the worst biases of the paranoid and make belief in conspiracy theories look like a sensible precaution rather than the delusions of tinfoil-hat wearers. Real-world governments behaving like governments in movies do worse than blur the line between fact and fiction: they undermine the basic compact of trust between electors and those who, at least notionally, serve them as elected leaders. It becomes much harder to argue that governments are not conspiring against their own citizens when, in fact, that’s exactly what they’re doing.

  BK

  * In its Cold War form, this theme produced three of the greatest science fiction films: the original Invasion of the Body Snatchers, Howard Hawks’s The Thing From Another World and then John Carpenter’s remake, in which the most faithful human friend of all, the humble dog, turns out to be—spoiler alert—One Of Them.

  * Gore Vidal long claimed that FDR was complicit in Pearl Harbor, and was a 9/11 truther in his last years, demonstrating that Vidal was less America’s Biographer than its Dream Diarist.

  * Proper Kubrick conspiracy theorists, of course, can see that the director left any number of clues pointing to the earlier conspiracy in The Shining, thereby giving us a conspiracy theory about a conspiracy theory inside a movie.

  * The term began life as a description of a form of psychosis, but has since spread more widely—quite rightly given what a cinematic term it is.

  * And for the same reasons—JFK had poor advice from both the State Department and the military about Vietnamese nationalism, and the Democrats couldn’t stomach a reprise of ‘who lost China’.

  7

  Reason and unreason: How we’ve all gone Stupid-mad in an age of absolute sanity

  To think of Dr Freud as that guy who wrote about dicks is to think of Bruce Springsteen as that guy who helped Courteney Cox out with her career. If you care to listen to The Boss or Freud, you’ll find they both told us with great intelligence how we’re just dancing in the dark. Although, of course, many people think we’re dancing in the sun of reason. We should all be made to listen to Springsteen and read Freud to remind us that this age, in many respects, is as dark an age as any and, in fact, quite a bit darker than most. Mostly because we think it is so illuminated.

  This is a feature of every age. Generally, we tend to presume that this time is the most enlightened time and even if we romanticise the past—and I fully intend to glorify Freud or at least redeem his reputation as That Penis Guy—we do so through the enlightened filter of the present. We know that nostalgia is never a longing for a real thing lost; it’s more of a romantic way of avoiding the present. It is probably worth mentioning that in the Good Old Days, nostalgia was actually a mental disorder.

  Nostalgia. It ain’t what it used to be. What it first was, in fact, was a medical disorder recognised in soldiers deeply troubled by their longing for the past. During the Thirty Years’ War, some men were discharged from the Spanish Army of Flanders with the condition that was first described in a 1688 medical text. Up until the American Civil War, this illness was treated.

  Of course, now we are so much more enlightened and we know that soldiers were just suffering the effects of war. They had post-traumatic stress disorder, silly! Thank goodness we know that.

  And thank goodness we can now recognise depression. Did you know that Willy Loman from Death of a Salesman was depressed? Well, Arthur Miller, who wrote the play, didn’t, and he was a bit, um, depressed by the news that for a 1999 Broadway revival, director Robert Falls gave the script to two psychiatrists. Willy was diagnosed as depressed.

  ‘Willy Loman is not a depressive,’ Mr Miller said to the New York Times. ‘He is weighed down by life. There are social reasons for why he is where he is.’

  Willy Loman was weighed down by life. Soldiers were weighed down by death. Diagnosing any of these men, either in the terms of their own era or by ours, just seems silly. External forces impacted on them and it would be, surely, as Stupid to get Willy some Prozac as it would be to take men forced to kill other men into family counselling. Or Lady Macbeth into a treatment program for obsessive-compulsive disorder.

  But. We are very enlightened. To remove Willy Loman from his broad social context and to give him a narrow psychological diagnosis is the work of our age. But it wasn’t Freud’s work, now discredited and darkened by the false memory of a big penis. Psychiatry as it is now—organised and immense and as detached from the world that produced it as Willy is from life—is very often very Stupid. And not only is this iffy branch of medicine itself Stupid, but it has begun to endorse our individual Stupid in a way that extends well beyond patient care. It is, after all, the science of the self. Psychiatry has not only itself become a Stupid science decried by some of its own most respected practitioners; it is now very useful in enhancing our everyday Stupid.

  It seems stupid to say that a soldier is suffering the internal illness of ‘nostalgia’ and not the external impacts of war. But what we say about Loman, or what we might say about ourselves, is just as stupid. This is an era that makes its influence felt. Which is to say, there has never before been an instant when we have been so intimate with the institutions of the state and the market. Opting out is not a realistic option in a time that demands our participation in and consumption of social and electronic media, goods, organised labour and all of those exchanges that create a ‘normal’ and socially viable person.

  We tend to think of our time as one that celebrates the uniqueness of the individual. And, in some ways, there are greater freedoms to behave in ways that are not orthodox; for example, living in a homosexual relationship will not make you a pariah. Neither will failing to show up on every Sabbath to your temple. But what will mark you indelibly is, for example, a refusal to engage in social media. In 2012, both Forbes Magazine and Time ran articles proposing that an individual without a searchable social media history was far less likely to gain employment in white-collar industries. Drawing on the experiences of a former Facebook employee, Katherine Losse, who left the company and cashed in her options, the articles concluded that non-participation in the profit-seeking social media sector not only made life difficult but was itself almost impossible. In a piece in the Washington Post that same year, Losse, who deactivated all her own social media profiles for a time, explained how Facebook kept ‘dark’ profiles of those who had not yet joined the big blue giant. ‘The moment we’re in now is about trying to deal with all this technology rather than rejecting it, because obviously we can’t reject it entirely’, Losse told the Post.

 
