by Ian Mortimer
War
It goes without saying that the conduct of war in the twentieth century altered beyond recognition. At the start it still lived up to its traditional definition of soldiers ‘advancing in tidy ranks and dying in untidy heaps’. Tanks, trenches, poison gas and barbed wire soon changed the shapes and sizes of those heaps. The importance of fighter aircraft, bombers, submarines and the atomic bomb in the Second World War further changed them, often leaving no heap at all. In the second half of the century, the fear was that, at best, whole cities would be vaporised, or at worst, that everyone in the world would slowly die of starvation or radiation poisoning brought on by a nuclear holocaust. By 2000, however, it was clear that many traditional elements of war remained, and that in some civil wars, instant vaporisation might be a blessing. The atrocities perpetrated during the fighting in former Yugoslavia, in particular the rape and sexual torture of civilians and children, made it clear that the extreme brutality of previous centuries had not abated. As we saw when considering the decline of private violence in the sixteenth century: if you take away the potential violence of a higher power – the force that stops people attacking each other – they are liable to revert to a more aggressive, uncivilised state. Mankind remains potentially as cruel and inhumane as ever.
The point here, however, is not how warfare itself changed in the twentieth century but how war changed Western society. Here we need to look at many changes collectively, as we did in considering the impact of the Black Death. There were technological outcomes from warfare that had a significant bearing on civilian life. The Second World War in particular contributed directly to technologies as diverse as computing, the jet engine, cardboard milk containers and radar, and the development of penicillin and pesticides. The technology behind the German V-2 rockets led not only to modern military missiles but to the ability to fire a rocket into space and the advances of modern astronomy. It is arguable that any or all of these developments might have happened in the fullness of time, and that war was not so much a cause as an accelerator. Nevertheless, it hastened a great many developments simultaneously and contributed hugely to the transformation of life in the late twentieth century. There were of course social and economic consequences to the major wars, and global political repercussions too, including the institutionalisation of international relations. But the point at which to begin is the most important change of all, namely the fundamental transformation of the relationship between war and society.
Warfare before 1900 had normally only affected civilians where they either worked in munitions factories or lived in close proximity to the front line or in places through which armies and their supplies passed. The countries engaged in the First World War saw the advent of what historians have termed ‘total war’, in which the resources of an entire nation are devoted to the war effort. Human resources were deployed where needed for the pursuit of military ends. Social barriers were suspended to maximise the production of the war-orientated industries on the home front. Rationing was introduced, and transport systems were redirected to increase the efficiency of military supply lines. In reality, the ‘totality’ of warfare in the twentieth century went much further than these socio-economic arrangements suggest. Advances in aviation made everyone a potential target. Over a thousand Londoners were killed by aerial bombardment from Zeppelins in the First World War, and more than 28,000 died during air attacks on the city in the Second World War. Bombing raids over Warsaw, Rotterdam and many towns in Germany destroyed hundreds of thousands of lives and millions of livelihoods. They created rubble-strewn wastelands in the centres of towns and in some instances ignited wind-blown infernos from which there was no respite for any person, animal or national treasure. The RAF raids on Hamburg and Dresden were particularly horrific, killing nearly half as many civilians in those two cities as were wiped out in Hiroshima and Nagasaki in August 1945. The use of atomic bombs was itself another ‘Columbus moment’. Two years later, atomic scientists in Chicago set the ‘Doomsday Clock’ at seven minutes to midnight in an attempt to draw wider public attention to the likelihood of mankind bringing about the extinction of human life on the planet through technological means. In 1953 the clock was set at two minutes to midnight. 
By the time of the Cuban Missile Crisis in October 1962, world leaders were openly discussing the ‘abyss’ of nuclear warfare. In the event, the red button remained unpressed, but the potential violence was still there. Everyone on the planet was at risk of becoming a victim of war, even if he or she did not live in or near one of the belligerent nations.
The increasing deadliness of warfare is surely the greatest irony of human civilisation. Previous concepts of the end of the world had been inspired by Biblical stories of the Flood and the Last Judgement. It is hugely ironic that science – which had gradually replaced religious explanations of the world and was frequently used to undermine religious teaching – found practical ways to bring about the mass extermination and horrors that the Bible foretold. Even more ironic is the fact that scientists had devised this potential Armageddon deliberately, in response to the requirements of democratically elected leaders. Over the centuries the combination of absolute monarchical power, social hierarchy and religious doctrine had resulted in many wars and atrocities but it had never threatened to wipe out human life completely, as the alliance of democracy and science did in the late twentieth century. And yet the greatest irony of all is that most people in the West benefited from the escalating scale of warfare because it placed a new value on the individual. Indeed, total war – particularly in the first half of the century – brought with it many social and economic reforms that massively increased the political power, equality of opportunity and standard of living of the West’s citizens.
Equality of opportunity was particularly pertinent at the start of the century for women, who still had some way to go to achieve social and economic parity with men. Labouring in munitions factories in the First World War permitted women to enjoy a great deal more freedom than they had previously known. Many found themselves employed for the first time, in sole charge of their household, and able to travel freely without a male companion. No doubt it led to a million and one private little battles when their husbands returned from the war – if they did return – but it was generally accepted that women had earned their greater freedom. The social pressure for the enfranchisement of women became overwhelming. In the aftermath of each world war there was a small rush of national Acts extending voting rights to women and any men who did not yet have them. The United Kingdom extended the franchise in 1918 to all men over 21 (19 if they had fought in the war) and women over 30 who were married, owners of property, or university graduates. Poland, Czechoslovakia, Austria and Hungary enfranchised their women the same year; the Netherlands followed in 1919, and the USA and Canada did likewise in 1920. The British government finally gave women the vote on the same basis as men in 1928. In 1944–5 the contribution made by women in the Second World War led to their enfranchisement in France, Bulgaria, Italy and Japan; Belgium followed in 1948. That greater political and economic independence of men and women heralded the decline of the servant class. Large houses that once had been staffed by cohorts of servants were closed up, often never to reopen. Just as the shortage of labour after the Black Death had accentuated the value of every labourer, so total war forcibly reminded society that every adult was useful, and led to the universal freedom to work and earn, vote, and (in the case of women) live more independent lives than before 1900.
Another social consequence of war was a turning of the tide against imperialism, monarchy and hereditary power generally. At the start of the century half a dozen empires ruled most of the globe. The largest was the British Empire, which included Canada, Australia, New Zealand, about two fifths of Africa, India, British Guiana and a number of Pacific islands. The French Empire consisted of huge swathes of North and West Africa, Vietnam and Cambodia, as well as colonies in India, China and the Pacific. The Russian Empire stretched from the Pacific Ocean to the Black Sea. Although far smaller in scale, the German Empire included territories in Africa and the Pacific. The Austro-Hungarian Empire encompassed not only the heartlands of Austria and Hungary but Bohemia, Slovenia, Bosnia and Herzegovina, Croatia, Slovakia and portions of Poland, the Ukraine, Romania and Serbia. The Ottoman Empire included Turkey, the Holy Land, Macedonia, northern Greece and Albania. One way or another, all these empires came to an end. The Russian Empire was overthrown by revolutionaries in 1917. The German, Austro-Hungarian and Ottoman empires were broken up in the aftermath of the First World War. Both the British and French empires underwent a process of decolonisation, hastened by the economic stresses suffered by each nation in the Second World War. As for the monarchies, at the start of the century, France, Switzerland and America were the only large Western countries that were not ruled over by a hereditary sovereign. Although many were constitutional monarchies, most kings and regnant queens still exercised a huge influence over their governments. By 2000 there were few kings left. The British, Belgian, Danish, Dutch, Norwegian and Swedish royal families still clung to their thrones, and the Spanish royal family had been restored after the demise of General Franco in 1975. But even in these nations they were subservient to democratically elected governments. Aristocratic power had similarly been almost entirely eradicated: even the British House of Lords was predominantly composed of appointed members after March 2000.9 The long process of holding hereditary rulers to account, which had begun back in the thirteenth century, had finally resulted in the near-extinction of the species.
The massive scale and horrific nature of modern warfare contributed directly to a series of attempts to develop international law and multinational organisations to limit the possibility of future conflicts. Even before the end of the First World War, philanthropists and politicians in England, France and the USA were proposing ways to limit conflict through international arbitration and the imposition of sanctions on aggressive states. The League of Nations was set up as part of the peace negotiations at Versailles in 1919. It proved a failure for a variety of reasons. It excluded the newly communist Russia, and failed to attract many other states, including the emerging economic superpower of the USA. It had no army and very little authority, as all countries on the council had a veto and were unwilling to take action against their potential allies. Its complete inability to do the most important thing it was set up to do – to prevent another world war – was demonstrated in 1939. However, its 20-year existence and modest successes along the way did create the sense that international relations could be institutionalised. The League’s successor, the United Nations, was established in October 1945, also primarily in order to prevent a recurrence of international conflict. The actual activities of the UN have gone much further, of course, and it has involved itself in the social and economic well-being of people throughout the world. It also maintains the International Court of Justice in The Hague. Whereas the League of Nations never managed to bring together more than a quarter of the world’s nations, the UN includes almost every sovereign state. War, you could say, brought the world closer together in the twentieth century. It also resulted in the revision of the Geneva Conventions in 1949 to protect civilians, medical staff and other non-combatants in war zones, as well as wounded and sick soldiers and shipwrecked mariners. 
These conventions, originally drawn up in 1864, represented the first attempts to limit the cruelty of war through an international moral code since the Peace of God and the Truce of God movements in the eleventh century. It is interesting to reflect on what this says about mankind’s desire for peace and our seemingly uncontrollable propensity for violence.
Life expectancy
As we have just seen, the relationship between modern warfare and society is laden with irony. A particularly clear example is the fact that war had a positively beneficial impact on health. Obviously the people who were shot, starved, gassed, shelled, burnt or blown to smithereens would not see it quite like that, but the truth of the matter is that wars require healthy, fit populations to fight in the front line and to operate the munitions factories, trains and sources of food production back home. The First World War saw a rapid escalation in government care for the health of the workforce. Occupational health and safety became a significant issue, with lead poisoning, mercury poisoning and anthrax being the first ailments to be regulated, followed by illnesses such as silicosis and skin cancer. The same war saw the advancement of blood transfusions on account of the discovery of the first anti-coagulants, sodium citrate and heparin, and the mass production of chlorinating Lister packs with which to purify water. It saw the production of ammonia on an industrial scale for the first time, allowing for the manufacture of artificial fertilisers – originally to fuel the German war effort but later to feed the world. The psychological health of men suffering from shell shock forced the authorities to invest in research in mental health and care for sufferers. As for the Second World War, it saw the first antibiotic remedy, penicillin, mass-produced in readiness for the D-Day invasion of France in 1944. Today we take such medical innovations for granted but it is important to remember that before the introduction of antibiotics, simply grazing your elbow or knee could lead to blood poisoning. From 1944 a raft of diseases from meningitis to gonorrhoea were suddenly treatable. That little mould noticed by Alexander Fleming in September 1928, from which he developed penicillin, turned out to be one of the most important lifesavers of the modern world.
The medical discoveries made during the two world wars were hugely significant but so were the medical and social advances in peacetime. Countries in the West developed national healthcare systems with large public subsidies. National systems of pensions for the aged and infirm were organised, reducing the deleterious effect of advancing years on the poor in particular. Social reforms such as unemployment and disability benefits raised the standard of living for the needy far above their equivalents in the nineteenth century. Nursing, midwifery and obstetric surgery improved universally, resulting in a dramatic drop in infant mortality. In Britain, this amounted to a decline in stillbirths from 14 per cent of all confinements in 1900 to 6.3 per cent by 1930; and to 0.58 per cent in 1997. Neonatal mortality dropped from 3.2 per cent in 1931 to 0.39 per cent in 2000.10 At the same time, maternal mortality plummeted. In 1900 about 42 mothers died per 10,000 confinements in Britain and about 80 per 10,000 died in the United States. By 2000 across the developed world the mother died in just two confinements per 10,000.
Life expectancy at birth in England, France, Italy and Spain
The above graph could be replicated for almost any country in the developed world. In 2000 male life expectancy at birth was 75 or more in Australia, Canada, France, Greece, Iceland, Italy, Japan, New Zealand, Norway, Singapore, Spain, Sweden and Switzerland. In the UK it was 74.8 and in the USA 73.9. Female life expectancy at birth was 80 or more in all the above countries as well as Austria, Belgium, Finland and Germany. In the UK it was 79.9 and in the USA 79.5.11 Babies throughout the developed world could expect to live almost twice as long in 2000 as they had done in 1900. Of course, the lives cut short in infancy in the early part of the century are a distorting factor in gauging how much longer an adult might live, but even so, active life increased significantly. In 1900 the average American 20-year-old could expect to live another 42.8 years; in 2000 he or she could look forward to another 57.8 years. That meant 15 years more output: 15 years more experience for every scientist, physician, clergyman, politician and academic. This amounted to a massive expansion in the return on training. People lived into retirement, so that the less productive years – roughly the last 10 per cent of an individual’s life – did not detract from a career.12 A doctor who qualified at the age of 25 in 1900 who enjoyed the average life expectancy of 62.8 years and who did not work or only worked part time for the last 10 per cent of his life because of ill health would have had an active career of 31.5 years. In 2000 this man could have continued working full time for an extra 14 years, until he reached 70. But arguably, relieving people of the fear of dying young was an even more significant change. Who at the age of 56 would not welcome another 15 years of good health? You could quite reasonably argue that this prolongation of active life is one of the most significant changes we have covered.
The media
Pick up a copy of a newspaper such as The Times from 1901 and you will immediately notice the lack of a banner headline: the front page is given over almost entirely to small-print advertisements. This will probably strike you even more forcibly than the lack of photographs. A popular British newspaper like the Daily Mail had larger and more varied text, but its front page also lacked photographs and was dominated by advertisements. By 1914 the situation was changing. While the front of the Daily Telegraph was still riddled with small ads, other papers were prioritising the big news. To our eyes, the Glasgow Evening Times for 2 August 1914 looks almost modern, with its headline: THE WAR CLOUD BURSTS. The header text beneath reads: THE DECLARATION OF WAR. GERMANY’S GRAVE RESPONSIBILITY. WILL BRITAIN FIGHT? FATEFUL CONFERENCE TODAY. American newspapers by this time had not only banner headlines but half-tone photographs. The shift to these more eye-catching and opinion-forming models forced politicians during the First World War to pay attention to the newspapers’ messages and their ways of selecting and presenting the news. Journalists increasingly had access to political leaders, who not only needed the support of the newspapers to get themselves re-elected, but also wanted their policies and decisions to be reported in a certain way. It was the beginning of an intimate yet uneasy relationship between power and the press.