Stranger Than We Can Imagine
There are, unfortunately, a great number of similar examples. The actions of the Swiss corporation Nestlé highlight the failure of the legal system to prevent corporations from knowingly causing deaths. In 1974 Nestlé was accused of encouraging malnutrition and infant deaths through its active promotion of formula baby milk to impoverished mothers in developing countries. Nestlé fought back by launching a libel trial over a War on Want report entitled, in its Swiss German translation, Nestlé tötet Babys, or Nestlé Kills Babies. A two-year trial followed. Although the judge accepted the unethical nature of Nestlé’s actions, he found in the corporation’s favour on the grounds that it could not be held responsible for the deaths ‘in terms of criminal law’. The actions that led to those deaths were the legal responsibility of the corporate person of Nestlé itself, which could not be jailed, while the executives who devised and implemented that behaviour could not be held personally liable.
Examples like these generate a great deal of anti-corporate sentiment. In their 2003 documentary The Corporation, the filmmakers Mark Achbar and Jennifer Abbott examined the behaviour of corporations and asked: if a corporation really was a person, what sort of human personality did it exhibit? After cataloguing a lengthy list of unethical behaviour, including an inability to feel guilt, a failure to conform to social norms with respect to lawful behaviour and a callous unconcern for the feelings of others, they came to the conclusion that corporations should properly be classed as psychopaths.
Corporations may not actually be individuals, but their pursuit of their desires, and their rejection of related responsibility, is highly reminiscent of teenagers from the 1950s onwards. To paraphrase Peter Fonda, corporations just wanted to ride on their machines without being hassled by the man, and that’s what they were going to do.
There are organisms in nature which depend on externalities in order to grow and survive, such as parasites or cancer cells. But these are typically found at the smaller scales of nature and, while the cost of their externalities may be catastrophic for their hosts, they can still be absorbed by the wider ecosystem. Behaviour that can be absorbed on a small scale can have very different results when it plays out on a larger stage.
The fact that corporations have grown to become such a large and significant part of the world, while still being hardwired to pursue perpetual growth, is deeply unnatural. Nature, in contrast, waxes and wanes. It grows wild at times, but self-limiting feedback loops always rein it back in again. And it is not just the natural world that follows those rhythms. The development of passenger aeroplanes in the twentieth century saw them increase in speed until the Anglo-French supersonic Concorde was flying from London to New York in three hours and twenty minutes. But Concorde was ultimately retired because it was expensive and inefficient, and the same route now typically takes over seven hours.
The speed of commercial flight could not keep increasing indefinitely. It was subject to economic and engineering factors, which kept it within the boundaries required by a stable, functioning transport system. Feedback loops rein in both the natural and manmade world, keeping everything from the speed of planes to animal populations within reasonable limits. But in a corporate economy which actively promotes the pursuit of externalities, the natural feedback loops that would normally impede constant growth are severed, or simply ignored. Infinite economic growth can only exist by becoming divorced from reality.
For a typical Western individual in the middle of the twentieth century, all this was brilliant.
Economically, the first half of the twentieth century had been grim. It had been home to financial horror stories such as the hyperinflation of the Weimar Republic, where a glass of beer cost 4 billion marks in 1923, and the Great Depression itself. But the period from the end of the Second World War to the 1970s looks, from the perspective of the early twenty-first century, like something of a Golden Age. The immediate situation of the postwar world was bleak, and yet untold millions were lifted out of poverty over the decades that followed. Malnutrition and starvation in the Western world were mostly confined to the pages of history. Wages rose and people became acquainted with the term ‘disposable income’. Regular working people gained access to everything from motor vehicles to central heating. Healthcare provision improved enormously, and life expectancy rose. The average man in England and Wales was expected to live to forty-six in 1900, but by 1990 that figure had risen to seventy-three, an increase of well over half. For women, the same figure rose from fifty to seventy-nine. In 1957 the British Prime Minister Harold Macmillan said, ‘Let us be frank about it – most of our people have never had it so good. Go around the country, go to the industrial towns, go to the farms and you will see a state of prosperity such as we have never had in my lifetime – nor indeed in the history of this country.’ To his contemporaries the statement sounded complacent, but from a historical viewpoint it was a fair comment.
The growth of corporations was a major factor in the rise in living standards. The President of General Motors, Charles Erwin Wilson, was appointed to the position of Secretary of Defense by President Eisenhower in 1953. This was, with hindsight, a clear example of corporate influence on government. When he was asked if his two roles represented a conflict of interest, he replied that he could not imagine a scenario that created a conflict ‘because for years I thought what was good for the country was good for General Motors, and vice versa’. General Motors was then one of the largest employers in the world and would become the first American corporation to pay a tax bill of over a billion dollars. Its well-paid employees had disposable income which they spent on consumer goods, creating growth in other industries. This in turn created an affluent society full of ready customers who wanted to buy GM cars. Corporate growth had produced a virtuous circle, and society as a whole benefited.
This was a golden period for American industrial design. It would no longer be enough for consumer goods like chairs or toasters simply to be functional, when they could be both functional and aesthetically beautiful. A leading figure in this move was the Michigan-born designer Norman Bel Geddes, who started as a theatrical designer but then began to apply streamlining and aerodynamic principles to everyday objects. His Futurama exhibit at the 1939 New York World’s Fair was immensely influential on the postwar generation of designers, such as the Paris-born Raymond Loewy, whose work included everything from oil company logos to Greyhound buses and Coca-Cola vending machines. Designers including Bel Geddes and Loewy dreamt up a visual language of America that was far superior to the visual language of the communist East. They exploited new materials like chrome and vinyl, and new methods of production such as moulding and stamping. Americans who watched Flash Gordon B-movies as children would grow up and buy cars with fantastic tailfins that deliberately echoed the rocket ships of optimistic science fiction. Consumers were encouraged to keep spending through ideas like planned obsolescence, where products were designed to break early and need replacing. An example of this was the light bulb, whose life expectancy was reduced from around 2,500 hours to less than 1,000 by an illegal organisation known as the Phoebus Cartel, whose members included General Electric, Philips and Osram.
Advertising was no longer a matter of informing rational consumers that products existed. It focused instead on defining those goods as necessary elements of a consumer’s personal identity. Consumer goods aren’t integral elements of personal identities, needless to say, but the advertising industry was not one to let truth interfere with persuasion.
We might forget names and faces and birthdays, but once jingles and slogans were in our heads they were there for good: Guinness is Good For You, Finger Lickin’ Good, I Want My MTV, It’s the Real Thing, Just Do It. Advertising was a form of black magic. It used the glamour of lifestyle, and an understanding of subconscious psychology, to take money from people in exchange for products they did not previously want or need. In his 2002 documentary The Century of the Self, the British filmmaker Adam Curtis highlighted the connections between the work of Sigmund Freud and the work of his great-grandson Matthew Freud, founder of the public relations firm Freud Communications. From this perspective, branding, marketing and public relations were arts that manipulated people’s subconscious for financial gain, while at the same time convincing those being manipulated that not only were they in complete control, they were proudly expressing their own individualism.
The growth of individualism was, clearly, of immense benefit to corporations, which needed their products to be consumed in large numbers. They did all they could to promote it.
It was an exciting time to be alive. A rising tide of affluence benefited entire populations and suggested that the future could only get better. The American Dream was the American reality. The mix of individualism, advertising and corporate growth was a potent cocktail indeed.
But then, at some point in the 1970s, things changed.
The strange attractor-like shift that occurred in the years leading up to 1980 was not apparent at the time. Economic growth continued as expected, but its impact on society began to change. The Princeton economist Paul Krugman calls the shift in American inequality that began at the end of the 1970s the ‘Great Divergence’, while the New Yorker journalist George Packer refers to the years after 1978 as the ‘unwinding’. The benefits of economic growth slowly removed themselves from the broader middle class, and headed instead for the pockets of the very richest. Good jobs evaporated, social mobility declined and the ‘millennial’ generation born after 1980 is expected to be worse off than the postwar ‘baby boomers’. Life expectancy rates have started to fall, in certain demographic groups at least. At the same time, inequality has increased to the point where, in 2015, the combined wealth of the eighty richest people in the world was equal to that of the poorest half of the world’s population, some 3.5 billion people. Even those who believe that those eighty people work hard for their money would have difficulty arguing that they work 45 million times harder than everyone else.
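As a rough check of that multiple, using only the figures quoted above (eighty people holding as much wealth as 3.5 billion people), the per-head comparison works out as:

\[
\frac{3.5 \times 10^{9}}{80} \approx 4.4 \times 10^{7} \approx 44 \text{ million},
\]

which is the order of magnitude behind the figure of roughly 45 million quoted above.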
This retreat of the American Dream, which had promised a future better than the past, is the result of a number of complicated and chaotically linked events from the 1970s. One of these was the rise of Deng Xiaoping to the position of paramount leader of China in December 1978, in the aftermath of the death of Mao. Deng began the process of introducing a managed form of capitalism into China. The impacts of this would not be felt immediately, but the availability of cheaper Chinese labour for Western corporations would lead to the disappearance of well-paid Western manufacturing jobs, as well as destabilising trade imbalances. This process of globalisation also led to the dwindling of corporate tax revenues on government balance sheets. Corporations increasingly embraced globalisation and reimagined themselves as stateless entities in no way beholden to the nations that formed them.
A second factor was the collapse of the Bretton Woods Agreement in August 1971. This was a framework for international monetary management that had been agreed in the small New Hampshire town of Bretton Woods towards the end of the Second World War. The pre-war, every-man-for-himself approach to currency valuation had been responsible for some of the instability that led to war, so Bretton Woods was an attempt to create a more stable environment for international finance. It tied the value of international currencies to the US dollar, which was in turn tied to the value of gold reserves.
This link between the dollar and gold was important. When the Bank of England first issued banknotes in the currency of ‘pounds sterling’ in the late seventeenth century, the note itself was a convenient stand-in for a certain weight of sterling silver. The paper, in other words, was really worth something. For this reason, currencies have historically been backed by something that was both physical and in limited supply, such as gold or silver.
This link between money supply and actual physical wealth created confidence in a currency, but it was a constant frustration for those who dreamt of perpetual, continuous economic growth. The gold standard, as the link between paper money and precious metals was known, tended to produce periods of growth interspersed with periods of deflation and even depression. This may have been a natural and sustainable system, but it was not the sort of thing that democratic societies voted for. President Nixon’s response to a period of economic pain was to suspend the gold standard, cutting the link between the dollar and physical wealth and ultimately bringing Bretton Woods to an end. The value of the dollar could then float free, worth whatever the markets said it was worth.
Thanks to this neat trick of divorcing money from physical reality, the perpetual-growth economy continued in what would otherwise have been a period of recession. It also transpired that the ever-increasing amount of consumption needed for perpetual growth could be financed by creative bookkeeping, and the creation of debt. Debt mountains began to grow. When that approach ran into difficulty, in the early twenty-first century, taxpayer-funded bailouts kept the dream of perpetual growth alive.
Financial traders were able to create wealth out of thin air using wild and psychedelic financial instruments such as those traded on the derivatives market. This involved the trading not of actual things, but of changes to how the market would value things over time. It is no oversimplification to describe the derivatives market as pretty much incomprehensible, which is a problem for those who wish to regulate it. It was recently estimated to have a notional value of $700 trillion, or about ten times that of the entire global economy. In the opinion of the billionaire philanthropist Warren Buffett, ‘derivatives are financial weapons of mass destruction, carrying dangers that, while now latent, are potentially lethal.’
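For a rough sense of that scale: the text does not give a figure for the size of the global economy, but assuming annual world output of roughly \$75 trillion (approximately its level in the mid-2010s), the comparison is:

\[
\frac{\$700 \text{ trillion (notional value of derivatives)}}{\$75 \text{ trillion (assumed world GDP)}} \approx 9.3,
\]

which is consistent with the ‘about ten times’ claim above.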
The fact that markets such as these created wealth on paper, while not actually doing anything of value or creating anything tangible, did not unduly trouble those who profited from them. But when Adam Smith defined wealth in his 1776 book The Wealth of Nations, he said it was ‘the annual produce of the land and labour of the society’. Economics was supposed to be a mathematical model of what happened in the real world.
Another factor in the emergence of the Great Divergence was the peak in US conventional oil production at the start of the 1970s, and the rise in the price of a barrel of oil from roughly $4 in 1970 to a price north of $100 in 2008. The price of oil has a proven impact upon national GDP, so even allowing for inflation, this was a significant extra cost which became a drag on economic growth.
This meant that corporations had to work harder to maintain the same rates of growth as before. As energy costs increased, so other costs needed to be reduced, and payroll and taxes were the most likely candidates. This in turn encouraged the move away from the virtuous circle that existed during the period of postwar economic growth. What was good for corporations, increasingly, was not what was good for countries.
The impact of the price of oil on the US economy became apparent in October 1973, when Arab members of the Organisation of Petroleum Exporting Countries began a six-month embargo on sales to the US. This was a protest against the American rearmament of Israel during the Yom Kippur War. It caused an unprecedented spike in prices and shortages at the pumps for motorists. This did initially spur research into renewable energy, and resulted in the shift in car design towards more aerodynamic, but visually boring, vehicles. Yet the influence of oil corporations on American politics meant that the country ultimately committed itself to a hydrocarbon-based energy policy, regardless of the future cost. This was apparent when Reagan became President in 1981; the solar panels that President Carter had installed on the White House roof were removed during his administration.
The intellectual justifications for the policies that led to the Great Divergence are commonly referred to by the term neoliberalism. Neoliberalism was a school of economic thought that dated back to the 1930s, but it only became the orthodox belief system for politicians and corporations following the election of Margaret Thatcher as British prime minister and the arrival of the economist Paul Volcker as Chairman of the US Federal Reserve, both in 1979.
Neoliberalism, at its heart, argued that the state was just too dumb to be left in charge of people’s wellbeing. It just didn’t understand the nature of people in the way that the markets understood them. It had only a fraction of the information needed to make good decisions, and it was too slow, inept and politically motivated to put even that knowledge to good use.
As the neoliberals saw it, the role of the state was to put in place a legal system that protected property rights and allowed for free trade and free markets, and to defend this system with military and police forces. State-owned industries needed to be placed in private ownership and run for profit. At that point the state had to step away and not interfere. Private enterprise would take care of the rest.
Neoliberalism was always going to create inequality, but its proponents claimed that this was for the greater good. If a small elite became immensely wealthy, then this wealth would ‘trickle down’ to the rest of society. Or, in a phrase which came to symbolise the 1980s, ‘greed is good’. Wealth did not trickle down, needless to say. It passed upwards from the middle class to the very top. Few economists now take the idea of the trickle-down effect seriously, but the thinking behind it still colours much of the discussion about global economics. It is still common to hear the very rich described as ‘wealth creators’, rather than the more accurate ‘wealth accumulators’.
The belief that a combination of free markets and property rights would solve all problems meant that sometimes it was necessary to create markets where they had not previously existed. It was blind faith in this logic that led, in 1997, to the World Bank pressuring Bolivia to grant foreign corporations ownership of all Bolivian water. This included rainwater that people had traditionally collected from their own roofs. According to the theory of neoliberalism, privately owned property rights such as these were the best way to give Bolivians access to water. The Bolivian people did not see it this way, especially after the corporations exploited their monopoly and immediately raised water prices by 35 per cent. The protests that followed led to martial law being declared, and one death, before the Bolivian people got their water back.