by Ian Mortimer
Alongside the growth of mass-market newspapers, the film industry provided both entertainment and news. The first films had been shown in Britain in the final years of the nineteenth century and their popularity had spread rapidly as they were screened at fairs and in music halls. In 1906 the daily viewing figures were about 4,000 in London alone. The first purpose-built cinemas in Britain opened in 1907, a year in which 467 British films were released. As a result of the 1909 Cinematograph Act, cinemas had to be licensed by local authorities, but this did not lessen the rapidity with which venues opened. By the end of 1911 the town of Plymouth (population: 112,030) had at least a dozen cinemas, most of which seated more than 300 people.13 In 1910 the showing of the first newsreels brought images of current events to towns up and down the country. In the 1930s people were exposed to a welter of current political, social and moral messages, as voiceovers could deliver information far faster than captions. By 1939 the UK saw 19 million weekly visits to the cinema. Thirty-one per cent of cinema-goers went every week, 13 per cent twice a week, 3 per cent three times a week and 2 per cent four times; only 12 per cent of the population never set foot in a cinema.14 If that seems impressive, consider this: attendance increased by two thirds over the course of the Second World War, reaching a peak of 31.5 million weekly cinema visits in 1946. That is the equivalent of every adult in the country going to the cinema at least once a week. With newsreels shown before many feature films, people from one end of the country to the other were simultaneously fed with news. But the cinema was not just a means of spreading information more widely and rapidly; it also created international stars whose faces were instantly recognisable by millions and whose opinions were taken seriously by their admirers. Together, mass-circulation newspapers, magazines and films focused national attention on key moral and political themes and bound the country together in a series of national debates. If nations were more thoroughly integrated in the twentieth century than ever before, the media were an important part of this process.
The inventions that took that integration a stage further were radio and television. Starting in the USA with local broadcasts relaying the results of elections in 1920, radio stations began to deliver the news faster than ever before. The BBC, the world’s oldest national broadcasting organisation, was founded in 1922. In 1936 it added regular television broadcasts to its radio output – although television was suspended during the war (as it was in France and Russia). In 1946 there were more than 10 million households in the UK with a radio licence. In 2000, 23.3 million households were licensed to receive TV and radio programmes.
[Chart: UK radio and television licences purchased annually (thousands)15]
By 1960 every country in western Europe had regular television broadcasts and many Latin American and eastern European countries were starting their own services. Although the delivery of news was not in real time or continuous, the ability to interrupt programmes with news flashes meant that important information could be broadcast to the nation almost on the spur of the moment. National events were sometimes broadcast on several channels at once. Millions watched the same soap operas and discussed the moral dilemmas arising from the storylines. TV programmes raised issues that the entire viewing public felt obliged to consider. Strikes, marches and social protests came to national attention through regular bulletins. Ethical matters such as bullying in schools, homophobia and the inequality of pay between the sexes became matters of national debate through drama as well as reporting. Even if you happened to grow up in a remote, all-white community, the message was forcefully circulated that racism was deeply damaging, divisive and morally wrong. Trends set in the capital one day were seen as the latest thing across the nation the next. The media thus gradually wove people’s awareness together in every country. In 2000 the rich financier in the city had many more points of reference in common with the poor farmer in a rural area than their respective forebears had had in 1900.
Finally, at the end of the century, a growing number of people started to look to the Internet as their prime source of information, education and entertainment. We have to remember, however, that the digital age arrived very late in our scheme – so late that many people in 2000 had barely been touched by it. The Internet was only formed in 1969 when four American universities linked their specially designed computers. Even though dozens of research establishments soon joined them it was not until the growth of Tim Berners-Lee’s World Wide Web, which went live in August 1991, that its potential to become a public media channel was recognised. The fact that it was a royalty-free system led to its extraordinary growth. Just as with the railways in the nineteenth century, there was an immediate rush of excitement that failed to yield profits, yet by the end of 1995, over a million websites were active worldwide. By December 2000, 361 million people were using the Internet: 5.8 per cent of the world’s population. In the United Kingdom, 28 per cent of adults had access to the Internet at home.16 In its first nine years the World Wide Web had a significant impact, drawing the world together in communication just as newspapers, films, radio and television had done on a nation-by-nation basis earlier in the century. By 2000 it had not yet resulted in sufficient online shopping for people to discuss the demise of the high street; nor had it had the political networking impact that would result in the Arab Spring of 2010–11. Nevertheless, people could see how it would change the world in the not-too-distant future.
Electrical and electronic appliances
While I was writing this book, my family and I stayed briefly in an old cottage in Suffolk. One night, storms brought the power cables down and over the next few days the electricity companies struggled in vain to restore the supply. It provided a salutary reminder of how dependent we are on electricity. The cooker in the house was electric, so there was no means of heating anything up – not even the water for a cup of tea. The kitchen utensils were rendered useless. We were deprived of all forms of entertainment and communication as the television and radio fell silent, and after a short while, our laptops lost their charge. The vacuum cleaner became a glorified dust box. The fridge freezer lost its cool – in both senses of the word. We could not take a shower or a bath. The dishwasher, washing machine and tumble dryer were all out of service. Most distressingly, so was the coffee machine. I don’t use an electric shaver, and I sadly have no need for a hairdryer, but these too would have been denied me. Of course, there were no lights either. As I sat jotting down ideas for this book by candlelight, I reflected on the whole electrification of our lives.
At the start of the twentieth century, there was only one common domestic electrical appliance – the light bulb. But even that was only to be found in a minority of homes, as many people still used gas lights. After the First World War, however, companies started to advertise a rapidly growing stream of electrical gadgets for the home. The electric kettle, which was first produced by Crompton & Co. in 1891, came into its own in 1922, when the Swan Company manufactured a device with an internal heating element.17 As mentioned in the previous chapter, gas cookers had existed since the nineteenth century but had not sold in large numbers due to the need to be connected to a supply. However, as the electrical grid spread through towns in the early twentieth century, the occupants of new flats and houses switched to electric cookers. Housing estates were built in the 1930s with electricity already connected and a pre-installed electric cooker for the proud new homeowner. The first commercially successful fridge went on the market in 1927, allowing fresh food to be preserved much longer than previously. By this time, the marketing of electrical appliances was firmly directed at women, who were expected to do the cooking, cleaning and housework. A sales catalogue printed in January 1935 shows a young woman on the front with the caption ‘Every housewife wants Magnet household labour-saving electrical appliances’. The inside of the catalogue offers two types of kettle, a toaster, an iron, a hairdryer, an upright vacuum cleaner, an upright floor polisher, six types of electric fire, a cooker, a ‘wash boiler’, a ‘stimulator’ (a sort of exercise machine) and a car engine radiator (for putting under the bonnet in cold weather, to prevent damage).18 A flood of other electrical appliances followed. By 1970, almost every home was packed with such gadgets and many more besides: audio and television equipment, electric drills and other power tools, electric blankets, juicers, alarm clocks, Teasmades, lawnmowers and so on.
Most of the appliances in daily use had been refined into a reliable form by 1970, only changing gradually thereafter. However, the 1970s saw a shift in the sort of gadgets that you might buy as the microchip crept into our consumer goods. The first ones I encountered as a boy were in pocket calculators. Less than ten years after I saw my first computer in the late 1970s, their use in education had become obligatory: all my undergraduate essays from 1986 had to be word-processed. By 2000 microchips were in everything from car dashboards to children’s toys. They also hugely increased the reliance on electricity in the workplace. In the office of the 1960s you might have found a telex machine and banks of electric typewriters. In the 1970s the photocopier became common, as did cassette-storage dictation machines, the fax machine, shredders, pocket calculators and, at the end of the decade, computers. By 2000 desktops, printers and scanners were de rigueur. Governments and businesses had mostly ditched their antiquated paper-based filing systems. And with the advent of the Internet, there arrived new systems for the storage, manipulation and distribution of data.
You could say that all this made relatively little difference – that electrical gadgets, computers and fax machines didn’t change the nature of what we do; they just allowed us to do the same thing more quickly. Switching on an electric heater was considerably faster than lighting a coal fire in a hearth every morning but not very different in its effect. Sending an email was much the same as writing a letter and having it delivered immediately, rather than the next day. Where was the change in that? But speed mattered. Labour-saving devices in the home and the office allowed more time to be spent working or producing. Information could be transferred almost instantly; it did not have to be copied out prior to being sent. Large datasets could be consulted in a fraction of the time it took to peruse a card index, especially if you had to read the awkward handwriting of a predecessor. The twentieth century saw a significant change in how much we could accomplish and the types of intellectual tasks we could tackle, largely due to electronic devices.
Sitting in that cottage in Suffolk in the glow of a candle made me think of other aspects of our increasing dependence on electricity in the twentieth century. Domestically we underwent a similar process to the de-skilling that the working class experienced in factories in the nineteenth century. Before the Industrial Revolution anyone working in a workshop made their own tools as well as their own products – it formed part of a young man’s apprenticeship. A wheelwright would know how to instruct the blacksmith to produce everything from the type of plane he needed to the iron tyre to fit the rims of his wheels. The majority of men learnt the carpentry skills they needed to mend the doors and shutters of their houses, or to make new furniture for their growing families. But when factory owners introduced production lines, each worker was required to fulfil only the function of the machine he or she operated. The use of the machine required no experience in making tools, and the ability to use the machine was not a transferable skill. Machine work therefore had the effect of de-skilling the workers and keeping them in an unskilled state. A similar process affected all of us in our homes in the twentieth century. Any housewife who could manage a kitchen in 1900 knew how to bake bread in an oven heated with hot coals or burning furze or faggots. She also knew how to sieve, mix and blend ingredients. Have you ever tried making a consommé without electricity? Or a jelly, starting with the fruit and the shavings of the soft antlers of deer? Over the twentieth century we lost vast amounts of domestic knowledge, much of it practical and basic – such as how to build a fireplace for boiling a large amount of water quickly (which is different from a fireplace for cooking), how to iron clothes cleanly without an electric iron, and how to store food for months without a fridge. A major consequence of our growing dependence on electricity was our shrinking ability to get by without it.
It goes without saying that much the same thing applied to our workplaces. The shift from card indexes to data storage seems, on the face of it, a process that could easily be reversed. Card indexes are hardly the world’s most difficult technology, after all. But the change was more complex than that. As the year 2000 drew near, professional advisers warned that many computer systems would not survive the transition from the two-digit date ‘99’ to ‘00’. People began to realise just how vulnerable their electronic systems might be. It was then that the complexity of computerisation became apparent: not only was the data now stored in a potentially less robust system, but it would also not be possible to return to a non-electronic system if computers did indeed prove vulnerable. To do so, you’d have to begin writing out your card index all over again. Computerisation was a one-way street.
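To see why that transition worried the professional advisers, consider the arithmetic involved. The sketch below is a generic illustration in Python – the function names are invented for the purpose and are not drawn from any real system of the period. A program that stores only the last two digits of the year concludes that ‘00’ comes before ‘99’, so any calculation spanning the rollover – an age, an interest period, a renewal date – goes wrong unless the dates are reinterpreted.

```python
# Illustrative sketch of the two-digit year problem; not taken from any real system.

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Naive difference between two-digit years, as many legacy systems computed it."""
    return end_yy - start_yy

print(years_elapsed(99, 0))    # gives -99, not the expected 1, once '99' rolls over to '00'

# One common remedy was 'windowing': interpret low two-digit years as belonging to the 2000s.
def to_four_digits(yy: int, pivot: int = 30) -> int:
    """Assume years below the pivot are 20xx and the rest 19xx (pivot chosen arbitrarily here)."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(to_four_digits(0) - to_four_digits(99))    # gives 1, as intended
```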
It is difficult to appreciate the significance of a change that is ubiquitous. As we saw with regard to clocks in the fifteenth century, we take inventions for granted very soon after they come into our lives. However, one way to assess the significance of a particular change is to ask yourself how easy it would be to undo it. After several days without power in that Suffolk cottage, I ended up thinking that it would probably be easier to reverse all the nineteenth-century changes mentioned in the previous chapter – to dig up the railways, reintroduce slavery and the subjugation of women, and take away the vote from all but the rich – than it would be to give up our dependency on electricity. All our record-keeping relies on it. We need it for the systems that allow society to operate, from our bank accounts and payments by debit and credit card to the records kept by doctors, dentists and the police. Without electricity, modern trains would not run – because of the signalling as well as their own power – and planes would collide. Stock markets would cease to operate. The logistics of our food supply would collapse. The gadgets with which we entertain and amuse ourselves would no longer work, and nor would many essential domestic tools. And yet the entire electrical and electronic system has an inbuilt vulnerability. If we were to experience a solar storm as powerful as the Carrington Event of 1859 – which knocked out the nascent telegraph system and wrapped the world in an aurora akin to the Northern Lights – it might well destroy the functionality of all the satellites, communications systems, computers, hairdryers and coffee machines in its path. Then the significance of the twentieth-century shift to electrical dependency would be fully appreciated.19
The invention of the future
You might recall the opening line of the fourteenth-century chapter, where I explained that medieval people did not understand social history. It goes without saying that they had even less idea of the future. The astounding Roger Bacon might have deduced in his thirteenth-century friary that it was possible to build cars, flying machines, suspension bridges and diving suits, but he did not have any vision of the future as such. His reasoning was simply that these engineering projects were not beyond the bounds of possibility. The future and the past did not occur to the medieval mind, which was utterly consumed in the ever-continuing present. Gradually, however, the sixteenth century ushered in an awareness of the past. By the eighteenth century, the sense of Western society constantly changing had developed into a concept of progress as outlined by Turgot and Condorcet, and that led to people imagining the future. Hegel theorised that liberal values would continue to prevail, leading to an ‘end of history’ as everyone in the world adopted the same, most beneficial form of government. For Karl Marx, of course, this was socialism, and he was by no means alone in thinking that a socialist state was the desired end product of mankind’s development. At the end of the twentieth century the political scientist Francis Fukuyama looked at the trajectory of the West up to the fall of the Berlin Wall and saw the rest of the world gradually buying into the values of liberal democracy.
The future was not found only in the pages of political analysis and utopian ideology. Science fiction introduced it to people who had no interest in Marx or Hegel. In the 1880s several novels dealt with what the future might hold through the conceit of a central character falling asleep and waking up in the future. The best known were Edward Bellamy’s Looking Backward: 2000–1887 (1888), in which America is envisaged as a socialist state in 2000, and William Morris’s News from Nowhere (1890), which presents the author’s own socialist vision of a future society. Such works were set in the real world; they were idealistic expressions of what their authors hoped would come true in their own societies. Many commentators drawing on the ‘progress’ of the West also saw the future through rose-tinted spectacles. One John Elfreth Watkins Jr, writing in The Ladies’ Home Journal in 1900, made a number of predictions about life in the year 2000. He declared that trains would travel at 150 m.p.h.; automobiles ‘would be cheaper than horses’; farmers would own ‘automobile hay-wagons’; photographs would be ‘telegraphed around the world’; a university education would be free to every man and woman; there would be ‘aerial warships and forts on wheels’; people would buy ‘ready-cooked meals’ from stores in the same way they bought bread from bakeries; and food would not be sold exposed to the air. Less successfully, he predicted that hydroelectric power would have replaced coal in the home; mosquitoes and flies would have been eradicated; there would be no wild animals; medicinal drugs would no longer be swallowed; and strawberries would be grown as large as apples. Politicians who entered the uncertain business of predicting the future also tended to be optimistic. In 1930 the earl of Birkenhead wrote that over the course of the next hundred years, ‘warfare will not increase in savagery. The civilised world is rapidly becoming a single economic unit . . . the disaster of one nation involves all nations.’20