Stranger Than We Can Imagine
Neoliberalism produced a significant retreat in the role of the state, in comparison to the postwar Golden Age. When the world of empires was replaced by a world of nation states after the First World War, those nation states justified their new position by taking over the duty of protection which emperors had traditionally offered their subjects. This new form of noblesse oblige consisted of social and welfare programmes, as well as standing defence and police forces. This resulted in an increase in the size of the state. Total government spending in the US was under 10 per cent of GDP before the First World War, and typically between 30 and 35 per cent of GDP during the second half of the century. This pattern was repeated in most of the Western world. The neoliberals thought that they could reduce the size of the state because corporations, as they grew to become more powerful than nations, were not beholden to any concept of noblesse oblige. It was simply not their job to protect ordinary people from psychopaths. Which was lucky, given the clinical assessment of their own personalities.
By the end of the twentieth century neoliberalism had become orthodoxy. As corporate power grew, its influence over politicians and media companies increased, in no small part because of their need for corporate money. Protests about corporate power occurred only outside of the political and cultural mainstream. The idea that a Western democratic politician from a mainstream political party could gain office with a platform that aimed to reduce corporate power, or increase corporate responsibility, became increasingly implausible. This was despite how popular a policy such as making corporate executives legally responsible for their decisions would have been with the electorate. There may have been widespread concern that a profit-led society was fundamentally inhuman, as well as depressing and unimaginative, but there was no way to express that opinion at the ballot box.
The years that followed the Great Divergence also produced the environmental movement. To the environmentalists, the neoliberal pursuit of perpetual growth was delusional and deeply troubling. They had had their perspective on the earth radically changed by the photos brought back by the Apollo programme. While the planet had previously been imagined as an endless horizon and a bountiful, unexplored frontier, ripe for plunder, environmentalists now knew that it was a finite, closed system. The earth was limited, and in these circumstances the pursuit of perpetual growth was dangerous.
An old Indian legend illustrates the problem. An Indian king named Sharim was presented with an exquisitely made chessboard, and he was so delighted with the gift that he asked the craftsman to name the reward he would like in return. The craftsman asked for one grain of rice on the first square of the chessboard, two grains on the second, four grains on the third and so on, with the amount of rice doubling for each square on the board. King Sharim was surprised that the craftsman asked for such a trivial gift, and readily agreed. But the amount of rice increased rapidly from square to square. By the time they reached the twenty-first square it had reached over a million grains. By the last square the king was required to supply more rice than existed in the entire world. The amount of rice had not increased in the containable, linear fashion that the king had expected, but had instead increased geometrically. Geometric or exponential growth is like compound interest, in that the increases towards the end massively overshadow the increases at the start.
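The legend’s arithmetic is easy to verify. As a quick illustrative check (a sketch added here, not part of the original story), the grains on each square and the running total can be computed directly:

```python
# Back-of-the-envelope check of the chessboard legend: one grain on the first
# square, doubling on each of the 64 squares, so square k holds 2**(k-1) grains.
grains_per_square = [2 ** k for k in range(64)]

print(f"{grains_per_square[20]:,}")   # square 21: 1,048,576 -- "over a million grains"
print(f"{sum(grains_per_square):,}")  # all 64 squares: 18,446,744,073,709,551,615 grains
```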
The doubling of rice is an extreme example, because those who believe in perpetual economic growth do not expect the economy to actually double each year. Yet even a seemingly manageable rate of growth, such as 2 per cent a year, requires the economy to double in size every thirty-five years. That means that twice as much real-world trade and economic activity has to occur in the space of roughly one human generation. But that is not the end of the problem, because growth then continues to compound beyond that point. It does not take long for the required size of the economy to become absurd.
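The thirty-five-year figure follows from compound growth. As a quick worked step (added here for clarity, not in the original text), the doubling time of an economy growing at 2 per cent a year is

$$1.02^{t} = 2 \;\Longrightarrow\; t = \frac{\ln 2}{\ln 1.02} \approx \frac{0.693}{0.0198} \approx 35 \text{ years},$$

which is the familiar ‘rule of seventy’: roughly 70 divided by the percentage growth rate.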
All of this raises the question of when a global economic system based on perpetual growth will collide with the physical reality of a finite planet.
This was the question asked by the global think tank the Club of Rome, who in 1972 produced an influential book called The Limits to Growth. The Limits to Growth examined the implications of exponential growth in a number of categories, from human population to food production and resource depletion, and from that it generated a number of potential future scenarios. In one of those scenarios, the world stabilised in the mid- to late twenty-first century and settled into a sustainable system. In the other two scenarios, it didn’t. The result was societal and economic collapse.
The Limits to Growth stressed that it was not attempting to make definitive predictions and was instead attempting to understand the behavioural tendencies of the global system. That said, a number of follow-up studies undertaken more than thirty years after its publication have reported that the current data is roughly in line with its projections. Unfortunately those are the projections that point to overshoot and collapse, rather than the one that points to stabilisation. Increasing inequality of wealth seems to make the situation worse. It is the rich and powerful who are most able to change the system, but they are the last to be affected by collapse and have a greater investment in maintaining the status quo.
The reaction to The Limits to Growth was telling. It was rejected out of hand not by those who engaged with its data or arguments, but by those who were ideologically invested in the neoliberal project. It threatened constraints on individual behaviour and was dismissed for that reason. It was not necessary to study the research into deforestation, depletion of topsoil, over-fishing or increasing water salinisation. The environmental perspective had to be wrong, because it was incompatible with individualism. Less than a century after our understanding of ourselves had been dominated by the top-down, hierarchical framework of masters and subjects, deference to the desires of the individual had firmly cemented itself as our unshakeable new omphalos.
Environmentalism, from this perspective, was anti-human scaremongering which failed to take into account mankind’s ingenuity. Imagination was a non-limited resource, and humans could adapt and solve problems as they developed. The Limits to Growth, it was argued, was no different to An Essay on the Principle of Population by the English clergyman Thomas Malthus. Writing at the end of the eighteenth century, Malthus had argued that population growth would lead to mass starvation. This scenario failed to materialise, in Western countries at least, thanks in part to the development of pesticides and fertilisers. But in a race against exponential growth it does not follow that, just because you keep up at the start, you will be able to keep up permanently. Exponential growth is like a video game that becomes increasingly difficult the longer you play. The fact that you can complete level one doesn’t mean that you will make it through level twenty-three in one piece.
The most significant, if unspoken, question about the alarms raised by environmentalists was this: would the point where the system collapses occur after my lifetime? For many of the baby-boomer generation, then comfortably into middle age, environmentalism didn’t seem worth rejecting individualism for because the economic system looked like it should be able to keep going for at least another three or four decades. That baby boomers would think this with no regard for their children and grandchildren is a particularly damning indictment of individualism.
The clash between individualism and environmentalists is perhaps best illustrated by the global reaction to climate change. By the late 1980s it had become clear that the release of greenhouse gases on an industrial scale was affecting the climate in a manner which, if it continued, would be catastrophic. Crucially, there was still time to prevent this. The issue quickly reached the global stage thanks in part to influential speeches by Margaret Thatcher, most notably her 1989 address to the United Nations General Assembly. Thatcher was a trained chemist with a good grasp of the underlying science. ‘The problem of global climate change is one that affects us all and action will only be effective if it is taken at
the international level,’ she said. ‘It is no good squabbling over who is responsible or who should pay. Whole areas of our planet could be subject to drought and starvation if the pattern of rains and monsoons were to change as a result of the destruction of forests and the accumulation of greenhouse gases. We have to look forward not backward and we shall only succeed in dealing with the problems through a vast international, cooperative effort.’
This did not sit well with oil corporations. Selling hydrocarbons was a far easier way to achieve short-term profitability than a long-term research programme into alternative energy infrastructure. The technical challenges involved in producing carbon-free energy at a price and quantity that rivalled oil were, as scientists would say, ‘non-trivial’.
The oil corporations and free-market think tanks began exercising their influence, in both government and the media, in an effort to prevent the international action on climate change that Thatcher spoke of. Their main tactic was a stalling approach which promoted a fictitious sense of doubt about the scientific consensus. This was an approach borrowed from the tobacco industry, which had used a similar disinformation campaign to cast doubt on the links between smoking and lung cancer. Those links were first discovered in 1950, but the tobacco industry was able to pretend otherwise for over four decades. Their campaign was highly successful in corporate terms because, even though hundreds of thousands of people died in one of the most unpleasant ways possible, they made loads of money and nobody went to jail.
In a similar way, the disinformation campaign of the oil industry was able to postpone action on climate change. It made it politically impossible for the United States to ratify the 1997 Kyoto Protocol, which aimed to set binding obligations on the reduction of greenhouse gas emissions from industrialised nations. After every typhoon, drought or flood, news programmes could be relied on to broadcast politicians angry at the suggestion that the extreme weather events now occurring could be linked to science which says that extreme weather events will increasingly occur. Even Margaret Thatcher had to amend her views after it became clear how much they offended her political allies. While her speeches of the late 1980s displayed a clear scientific understanding of the situation, her 2003 book, Statecraft, fell back on the political talking points that cause climate scientists to bang their heads on their desks in despair. Curbing climate change, she now argued, was a front for a political viewpoint that she disagreed with, and for that reason no efforts to curb climate change should be made. Ideology beat science. Individualism beat environmentalism. So carbon continued to be emitted, topsoil continued to decrease and the polar ice sheets continued to melt. The debt which funded the consumer activity that caused all this continued to grow. As a result, the window in which runaway climate change could have been prevented now appears to have closed.
And in the background, the Sixth Extinction continued. What chance did the Golden Toads have in a century such as that?
A screenshot from the 1985 Nintendo computer game Super Mario Bros. (ilbusca/iStock)
FOURTEEN: POSTMODERNISM
I happen to have Mr McLuhan right here
If you want to understand postmodernism you should spend a few hours playing Super Mario Bros., a 1985 video game designed by Japan’s Shigeru Miyamoto for the Nintendo Entertainment System.
In Super Mario Bros. the player takes control of a mustachioed Italian plumber named Mario. Mario’s job is to travel across the Mushroom Kingdom in order to rescue Princess Peach, who has been kidnapped by Bowser, the monster-king of the turtle-like Koopa people. None of that, it is worth stressing, makes any sense.
Super Mario Bros. is a combination of elements that don’t fit together under any system of categorisation, other than the game’s own logic. Fantasy kingdoms are all well and good, but they are not usually the playground of Italian plumbers. Likewise the mix of elements Mario encounters in the game, from giant bullets to fire-spitting pot plants, does not lend itself to logical scrutiny. There is no need to look for hidden meaning in the symbolism of Super Mario Bros., because it isn’t there. The character of Bowser, for example, was originally intended to be an ox, but he became a turtle-beast simply because Miyamoto’s original drawing looked more like a turtle than an ox. Mario himself was also something of an accident. He originally appeared in the arcade game Donkey Kong and was known as Jumpman, because he was a man who could jump. He was later christened Mario as an in-joke, in honour of the landlord who owned the warehouse that was being rented by Nintendo of America. Princess Peach was rechristened Princess Toadstool for the American version of the game, for no reason of any importance.
None of these things affected the success of the game. What mattered was that each element was fun in itself. This is probably the most recognisable aspect of postmodernism, a collision of unrelated forms that are put together and expected to work on their own terms. The idea that an outside opinion or authority can declare that some elements belong together while others do not has been firmly rejected.
A related aspect of postmodernism is what theorists call jouissance. Jouissance refers to a sense of playfulness. The French word is used over its closest English translation, ‘enjoyment,’ because it has a more transgressive and sexualised edge that the English word lacks. Postmodern art is delighted, rather than ashamed, by the fact that it has thrown together a bunch of disparate unconnected elements. It takes genuine pleasure in the fact that it has done something that it is not supposed to do. A good example of postmodern jouissance can be found in the British dance records from the late 1980s, such as MARRS’ ‘Pump Up The Volume’ or ‘Whitney Joins The JAMs’ by The Justified Ancients of Mu Mu. These were records made by musicians who had just gained access to samplers and were exploring what they could do. They were having a whale of a time playing around and putting together all sorts of unconnected audio.
A third postmodern element can be seen in the mass-produced nature of the game. Super Mario Bros. is made from code, and that code is copied to create every instance of the game. It is not the case that there is one ‘real’ version of the game, while the rest are inferior imitations. The code that ran on Shigeru Miyamoto’s development system, at the moment he signed the game off as complete, does not have some quality of authenticity that a battered second-hand copy found in a market in Utrecht lacks. The status of identical copies of a work of art had been a hot topic in the art world ever since the German critic Walter Benjamin’s 1936 essay The Work of Art in the Age of Mechanical Reproduction. As far as postmodernists were concerned, that debate was over. Every mass-produced copy of Super Mario Bros. was intrinsically as good as all the others, and no amount of hoping to find some magical aura imbued in an artist’s own copy could change that.
A fourth important factor is that the game is well aware that it is a game. Super Mario Bros. makes no attempts to hide the conventions of the form, and will regularly highlight them in a way that games such as chess or tennis do not. Should the player find and collect a green and orange ‘1-Up’ mushroom, they will be rewarded with an extra life and hence extend their playing time. In a similar way, the game is littered with rewards, power-ups and other gameplay factors that affect the structure of play, and which only make sense in the context of a video game.
This self-aware element of postmodernism is sometimes associated with film, television or theatre, such as the 1977 Woody Allen movie Annie Hall. Allen’s character was able to win an argument with a self-righteous bore in a cinema queue by producing the media critic Marshall McLuhan from off-screen. At this point Allen turned to the camera and said, directly to the audience, ‘Boy, if life were only like this!’ In doing so he acknowledged the artificial nature of the situation: that he was a character in a movie, talking to a camera, in order to address a future audience of cinemagoers.
Postmodern moments like this are rare in the narrative arts, because those arts rely on the suspension of disbelief for their power. They are more common in the genre of comedy, such as the work of the British comedy troupe Monty Python. The final sequence in the ‘Spanish Inquisition’ episode of their second television series involved three members of the Spanish Inquisition being late for a sketch that they were supposed to appear in. Once this was realised they hurried off and caught a bus in order to get to the sketch. They knew that they were running out of time because the end credits had started rolling over them. They finally arrived in the sketch at the moment the programme ended.
Another postmodern aspect of Super Mario Bros. is that each time the game is played, it is different. There is no one true version of the game, and hence no true ‘authorial intent’ to provide the correct understanding of Miyamoto’s work. Some users even go so far as to alter the code in order to create different versions of the game, known as mods. For gamers, this is entirely valid.
Postmodernists have firmly internalised Duchamp’s insight that when different people read a book or watch a movie, they perceive it differently. There are many interpretations of a work, and it cannot justifiably be argued that one particular perspective is the ‘true’ one, even when that perspective is the author’s. People can find value in a work by interpreting it in a way that the author had never thought of.
Finally, the game itself transcends the categories of highbrow and lowbrow, being simultaneously high art and populist fluff. When Super Mario Bros. was released in 1985 cultural critics would have dismissed it as lowbrow, had they been aware of it at all. Video games were then seen as dumb, noisy things for kids, and it took a number of decades before claims for their cultural validity were heard. Yet Super Mario Bros. was named as the best game of all time by IGN in 2005. It becomes difficult to classify a dumb bit of kids’ entertainment, which is hailed as the pinnacle of a recognised art form, as being either highbrow or lowbrow.