This simple model accounts for many aspects of our society. The ‘s’-shaped ‘civilisation curves’ we encountered in the eighteenth, nineteenth and twentieth centuries, reflecting the slow start, rapid acceleration and eventual levelling-off of so many changes discussed in this book, reach that final stage because the new behaviour becomes universal. When 100 per cent of the adult population has the vote, no further increase is possible. And when something has become firmly established, it is very difficult to change it. Every newly elected politician must wonder at how little power he or she has in reality, being restricted by so many conventions. You can see this crystallising force in everything from the adoption of units of measurement to laws and professional standards. Over the course of time, certain patterns of behaviour become enshrined in tradition so that alternatives become less familiar, less attractive and even threatening to the established order. A tribe of hunter-gatherers may well have to face disruption and upheaval when the herd of wild animals on which they rely moves to a new grazing ground thirty miles away, but their lack of permanent structures allows them to adapt relatively easily. If the herd moves, the tribe moves too. In a modern town, however, a lack of food in all the shops within thirty miles would be a far more serious problem. The most significant changes are experienced when society is forced to deviate from its entrenched patterns of behaviour. When New Holland was renamed Australia and New Amsterdam became New York, the process in each case was straightforward. Imagine trying to rename Australia and New York today: you’d face a logistical nightmare, political upheaval and communications mayhem. The more firmly established our patterns of behaviour, the more difficult it is to give them up. The lighter our footfall on the planet, the less significant the shifts in our behaviour and the smaller the change.
Why, then, has change not ground to a halt over the centuries? It stands to reason that if there is a tendency for our patterns of behaviour to crystallise, then generally things should be changing less and less. The explanation of this paradox is another paradox: the more things are set in stone, the more things change. Stability itself is a destabilising factor. In economic terms, as Hyman Minsky pointed out, stability leads to complacency, over-lending and a boom-and-bust cycle of prosperity and depression. And as regards population, as Malthus explained two hundred years ago, stability leads to population growth, which in turn puts pressure on food supplies. In addition, the systematic exploitation of any finite asset or resource is liable to lead to depletion, thus eventually forcing change. Traditional fishing grounds become overfished. Continual farming of the same piece of ground removes the nitrogen from the soil, making it infertile. When the rich seams of ore run out, a mine becomes redundant. On top of these factors, many people earn their salaries from effecting change. Builders, architects and town-planners alter the landscape as part of their jobs. Scientists, inventors and entrepreneurs are similarly required to develop the ways in which we live. Then you need to consider cultural clashes. A steady influx of immigrants to a small island may initially be welcome but attitudes are likely to change when their overwhelming numbers start to erode the island’s culture. Even deliberate attempts to resist change tend to result in new patterns of behaviour. Whereas old buildings were once regularly pulled down to make way for new ones, now the old structures might be preserved and new systems put in place to make sure they are not altered. The only way for a community not to experience constant social change is for it to be isolated and self-sufficient, with enough resources to satisfy all its requirements, no risk of it exhausting them, no need to defend them, no need to avail itself of technological advancement, and a death rate that corresponds with the birth rate. It is doubtful that any such community exists today – although it is possible that some tribes in the Amazonian rainforest still live according to ancient patterns.
Having said this, the fact that we can hypothesise what it would take for a society to undergo little or no change means we can also hypothesise the criteria that do affect change. The key word is ‘need’. If a society has no need to do something it is not currently doing, then the chances of it changing are greatly reduced. If we concentrate on that essential point we can measure change over a series of centuries by its consequences – that is, by the extent to which it met society’s most important needs. We therefore have to ask what those needs might be.
A scale of needs
What causes a significant social development? It is not that someone has a great idea and everyone else follows suit; it is never as straightforward as that. The social context has to be right for a good idea to take root. The compass was known for centuries before it was regularly used for crossing the world’s oceans; many people had questioned the practices of the Roman Catholic Church long before Martin Luther; Francis Ronalds’s telegraphic system was rejected by the Admiralty – and so on. As we have seen so often in this book, it is not the invention that results in a major change so much as the adoption of that invention by a significant proportion of the population. There has to be sufficient demand for the change in question in order for the invention to take off. That said, the ‘demand’ is not always consciously expressed. Few people demanded to fly long distances at high speeds in 1900. However, the advantages of airborne transportation were immediately obvious. Military commanders, for example, could attack an enemy’s capital city without a full-scale invasion. People could travel around the world for business or pleasure. The potential was always there for a series of rapid developments to follow the invention of a suitable engine to propel an aircraft. Had the internal combustion engine been around just sixty years earlier, in 1800, the passenger railway might never have been invented: there would have been no demand for it.
What creates a level of demand sufficient for a single invention to change the world? Looking back over the last thousand years there seems to be a fundamental shift in the thirteenth century. The Four Horsemen of the Apocalypse – Conquest, War, Famine and Disease – have caused changes throughout human history but society was particularly vulnerable to these threats in the first two centuries we examined. In the eleventh century the development of castles, the resistance to Viking invasions, and the spread of the influence of the Church were all closely connected with the threat of conquest and war. In the twelfth century, the population expansion was associated with the provision of food, and the changes in medicine and the rule of law sought to address disease and ‘war’ (in the sense of social disorder). But in the thirteenth century, money entered the picture. People now did all they could to avoid financial disadvantage (unless they were friars). Some strove to make themselves wealthy, the most successful city merchants rivalling the power and status of the old aristocracy. People started to reject the old adage that God had created three estates (‘those who fought’, ‘those who prayed’ and ‘those who worked’) as Europe shifted significantly to a form of international dialogue that was not solely driven by the dictates of kings or noblemen but took account of merchants and markets. The emerging desire for personal enrichment has been an underlying factor for change ever since. Sixteenth-century explorers, seventeenth-century bourgeois, eighteenth-century agrarian reformers and nineteenth-century industrialists were all motivated by dreams of riches. The twentieth century saw businessmen and women turn self-enrichment into an art form, as they played ‘real-life Monopoly’ with the world’s assets. As a result, I would argue that the primary forces underlying change over the last millennium were: the weather and its effect on the food supply; the need for security; the fear of ill-health; and the desire for personal enrichment.
This set of four primary forces doesn’t guide us directly to the century that saw the most change but it does give us some starting points. All four loosely correlate with the hierarchy of needs drawn up by the American psychologist Abraham Maslow in 1943.1 He defined these needs as follows: the physiological necessities for life (i.e. food, water, air, warmth); safety, including health; love; personal esteem; and self-actualisation. The order is important: if a man has insufficient food, it doesn’t matter whether his contemporaries are producing great art or travelling by train. As Maslow states,
For our chronically and extremely hungry man, Utopia can be defined very simply as a place where there is plenty of food. He tends to think that, if only he is guaranteed food for the rest of his life, he will be perfectly happy and will never want anything more. Life itself tends to be defined in terms of eating. Anything else will be defined as unimportant. Freedom, love, community feeling, respect, philosophy, may all be waved aside as fripperies which are useless since they fail to fill the stomach.
If a man is fed and watered he will be most concerned with security next; only if he is safe and healthy will his mind turn to love, emotional support and personal esteem. Finally, if every other need is met, his preoccupation will be ‘self-actualisation’. This Maslow explained in a number of ways – as the pursuit of truth, beauty, satisfaction and meaningfulness, among other things – but for our purposes it can be summed up in his phrase ‘a musician must make music’.
Maslow’s hierarchy was very much a product of its time and it doesn’t wholly correlate with what we find when we look at earlier centuries. Many of our forebears placed religious matters before security or food – those sixteenth-century people who chose to be burnt to death for their beliefs rather than recant, for example, or the medieval lord who opted to fight on a crusade rather than live in peace on his estates. In their cases, self-actualisation took priority over everything else. And while Maslow regards ‘freedom from prejudice’ as an aspect of self-actualisation, before the rise of liberalism in the seventeenth century people believed that there was virtue in enacting their prejudices, and so self-actualisation was quite different. Having said all this, Maslow’s work clearly shows that certain needs take priority over others. It doesn’t matter much whether or not you have the latest mobile phone if you are suffering from plague. We have to give greater weight to the need to eat and drink, and to be warm, safe and healthy, than to changes of luxury and convenience. The relative importance of ideological factors is more difficult to assess. For those on hunger strike for their political beliefs, ideology matters more than the need for food; for those who stand up against racial prejudice, their belief can be more important than their personal safety. With this caveat about the variable position of ideology in the hierarchy, we can determine a more historically representative scale of needs by which to evaluate changes in society:
1. Physiological needs: whether the members of a community had enough food, heat and shelter to sustain life, or not;
2. Security: whether the community was free from war, or not;
3. Law and order: whether members of the community were safe in peacetime, or not;
4. Health: whether they were free from debilitating illnesses, or not;
5. Ideology: whether the members of a community were free from moral requirements and social or religious prejudices that prevented them from satisfying any of the needs below, or made them forgo any of the needs above, or not;
6. Community support: whether they had sufficient companionship within the community in which they lived, including emotional fulfilment, or not;
7. Personal enrichment: whether they were personally enriched and able to realise their ambitions, or otherwise personally fulfilled, or not;
8. Community enrichment: whether they were able to help other members of the community with regard to any of the above.
Generally speaking, if the answer to one of the above is ‘no’ for an individual or a section of the community, then the progression stops there (bearing in mind the caveat regarding the variable position of ideology). If the answer is ‘yes’, the next criterion is the one that defines their need. Obviously not every individual in society found themselves facing the same needs at the same moment in time. In the Middle Ages, if a nobleman was healthy and his country was at peace, he might see all eight of his needs met while not even the first would have been satisfied for the peasants who tilled his soil. Nevertheless, the whole scale applied to everyone, wherever they found themselves on it. It thus defines the collective needs of a society and allows us simultaneously to evaluate a large number of significant changes that otherwise would be impossible to quantify collectively. Measuring the ability to meet physiological needs, for example, allows us to measure the effects of agricultural change and transport at the same time, as well as an element of social reform. Changes in law and order permit us to gauge developments in morality as well as the efficacy of justice. If a change does not correlate with one of these needs, then it is, to use Maslow’s word, a ‘frippery’, and can be disregarded.
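The stepwise logic of this scale is easy to make concrete. Below is a minimal sketch in Python, offered purely as an illustration: the function name defining_need and the yes/no dictionary are illustrative devices, not drawn from the text, and the caveat about the variable position of ideology is deliberately omitted for simplicity.

```python
# A minimal sketch of the scale-of-needs test described above:
# the defining need is the first criterion on the scale to which
# the answer is 'no'. (Illustrative only; the caveat about the
# variable position of ideology is left out.)

NEEDS = [
    "physiological needs",
    "security",
    "law and order",
    "health",
    "ideology",
    "community support",
    "personal enrichment",
    "community enrichment",
]

def defining_need(satisfied: dict) -> str | None:
    """Return the first need on the scale that is not met,
    or None if every criterion is satisfied."""
    for need in NEEDS:
        if not satisfied.get(need, False):
            return need  # the progression stops here
    return None

# A healthy medieval nobleman in peacetime might meet every criterion,
# while the peasants tilling his soil might fail at the very first.
nobleman = {need: True for need in NEEDS}
peasant = {need: False for need in NEEDS}

assert defining_need(nobleman) is None
assert defining_need(peasant) == "physiological needs"
```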
Social change in relation to the scale of needs
PHYSIOLOGICAL NEEDS
The best test of whether the members of a community had enough food, heat and shelter to sustain life (or not) is to examine whether the population was expanding. Very simply, if the population was growing, they did. If it was contracting, that does not necessarily mean they did not have enough food – contraception, emigration, disease or war might have been the cause – but a population facing significant food shortages for long periods of time could not expand. Increases in the food supply should thus be relatively easy to quantify.2
The data in the Appendix (page 347) point firmly to the nineteenth century as seeing the greatest change in Europe (116 per cent), with the twentieth century being in second place (73 per cent), followed by the eighteenth (56 per cent), the twelfth (49 per cent) and the thirteenth (48 per cent). Not every European country followed the same pattern. In England, the nineteenth century saw by far the greatest population growth (247 per cent), followed by the sixteenth century (89 per cent) and then the twelfth (83 per cent). In France, the thirteenth century was the period of greatest growth (71 per cent), followed by the twelfth (48 per cent). But in terms of changes in the overall access to food across the continent, the nineteenth century was pre-eminent.3
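For clarity, the percentages quoted here and in the Appendix can be read as simple century-on-century growth rates – an assumption about the convention used, illustrated below with round numbers rather than the actual data:

\[
\text{growth} = \frac{P_{\text{end}} - P_{\text{start}}}{P_{\text{start}}} \times 100\%
\]

On this definition, a population rising from 100 million to 216 million over the course of a century has grown by 116 per cent.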
What about those periods when there were downturns in the food supply? Famines have occurred in all centuries: even the plentiful nineteenth century saw millions of Irish people starve during the potato blight of 1845–9. But dearth was more commonly experienced in the early centuries, when communication links were worse. While we cannot quantify the seriousness of the problem before about 1200, after that date the worst food shortages were experienced during the multiple famines of 1290–1322 and 1590–1710. However, because famine reduces interest in other aspects of life – as Maslow says, to the starving man, Utopia is defined by food – it limits changes in society. You don’t start painting like Leonardo to while away the hours while you are starving. The periods of famine were tragic but they were of short duration and of minimal long-term social impact. The alleviation of hunger was the principal shift in respect of physiological needs, and thus the nineteenth century saw the most significant changes.
SECURITY
A comparison of the military dangers that European communities faced is more problematic. We could simply total up the number of years that each country was at war but this would not provide an accurate picture for the earlier centuries, when war consisted of a series of short, bloody campaigns and a wary peace thereafter. In 1001, fighting was endemic in many regions. Later on, long wars such as the Hundred Years War and the Eighty Years War were more clearly defined, in that declarations of war remained in place for decades, but these conflicts only saw intermittent fighting. The length of these wars reflects the absence of a permanent peace settlement, not continuous fighting. Alternatively, we could limit our test to those wars fought on home soil but that would rule out both world wars as far as the British were concerned – apart from southern England being the target of bombing raids – which would result in an even more unrealistic view of the impact of war.
What we really want to measure is the change in the sense of security – the vulnerability to military force as well as the length of a war. To this end it is instructive to draw on the work of the sociologist Pitirim Sorokin.4 In 1943 he tried to measure the relative impact of war in several ways. In one exercise he calculated the number of casualties in all the wars he could identify in a sample of four countries and related them to the total population for those countries, arriving at the following comparison:
Century | Population (millions) | Military casualties | Military casualties per million of population | Percentage change
12th | 13 | 29,940 | 2,303 | –
13th | 18 | 68,440 | 3,802 | 65%
14th | 25 | 166,729 | 6,669 | 75%
15th | 35 | 285,000 | 8,143 | 22%
16th | 45 | 573,020 | 12,734 | 56%
17th | 55 | 2,497,170 | 45,403 | 257%
18th | 90 | 3,622,140 | 40,246 | -11%
19th | 171 | 2,912,771 | 17,034 | -58%
1901–25 | 305 | 16,147,500 | 52,943 | 211%

Pitirim Sorokin’s estimate of military casualties for England, France, Russia and Austria-Hungary
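To make the table’s derived columns explicit: the per-million figure divides total military casualties by the population in millions, and the final column is the change in that ratio from the previous century. Taking the seventeenth-century row as a worked example, using the table’s own numbers:

\[
\frac{2{,}497{,}170}{55} \approx 45{,}403 \ \text{per million}, \qquad \frac{45{,}403 - 12{,}734}{12{,}734} \times 100\% \approx 257\%.
\]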
In using Sorokin’s figures, however, we need to be aware of a few problems. His casualty estimates for the earlier centuries are based on chronicles, which are very patchy, and his population figures for these early centuries are certainly too low. His figure for the twentieth century only includes military casualties for the first 25 years – he was writing in 1943 – and did not take into consideration the catastrophic loss of life in the Second World War. Another issue for us to consider is that his figures relate only to soldiers, not civilians. However, despite these problems, Sorokin’s quantitative assessment is a good place to start to think through the issues.
To begin with Sorokin’s low population estimates for the early centuries: if he had had access to more accurate population data, the ratio of military casualties to population would have been even lower than the figures in the above table. With regard to the partial coverage of the twentieth century, we can make allowances for that – in respect of both the very high figures in the first half of the century and the smaller number of deaths in the second half. As for civilian casualties, it is reasonable to suppose that high military casualties reflect many civilian deaths: with a few exceptions (such as the Napoleonic Wars), military engagements before 1950 did not normally try to spare civilians, and effective methods of killing large numbers of soldiers were equally effective at killing large numbers of non-combatants. If we were to add the deaths of the Second World War to Sorokin’s figures, then there is no doubt that the twentieth century saw the greatest impact of war on society. In terms of increased vulnerability – the very opposite of meeting our need for security – the introduction of total war accentuates this conclusion. Second place goes to the seventeenth century, not only on the strength of the above figures but also because the total number of deaths, including civilians, resulting from the Thirty Years War was in the order of 7.5 to 8 million. Most German states saw more than 20 per cent of their populations wiped out, and several saw more than 50 per cent mortality in this hitherto unprecedented conflict.5