How, if at all, might this translate into peacetime economic policy? Relatively early in the war, the great economist John Maynard Keynes had more or less won the battle within the Treasury to persuade that deeply conservative institution to accept at least a substantial measure of demand management as the principal way of regulating the economy in order to keep the level of unemployment down. Thereafter, the real intellectual conflict among radically minded ‘activators’ was between Keynesians and those whose ideal was wartime-style (and Soviet-style) direct physical planning. For the former, there was still a significant role – at least in theory – to be played by the price mechanism of the market; for the latter, that role was largely surplus to requirements. By the end of the war, it seemed that the force was with the out-and-out planners, with their emphasis on investment planning and, through direct controls over labour, manpower planning.
Indeed, such was the temper of the times that even most Keynesians had, in a visceral sense, little real faith in, or any great intellectual curiosity about, the possible economic merits of the market or of supply-side reforms. Hence the largely stony academic-cum-intellectual reception accorded in 1944 to The Road to Serfdom (dedicated ‘To the Socialists of All Parties’) by the Austrian economist F. A. Hayek, who was based at the London School of Economics (LSE). ‘His central argument was that a modern economy was a vast system of information flows which signal to everyone indispensable facts about scarcity and opportunity,’ a latter-day follower, Kenneth Minogue, has helpfully summarised. ‘The vitality of modern Western economies, and the best use of scarce resources, rested upon the workers and entrepreneurs having these signals available to them. No planning committee could possibly plug into them. Central direction could lead only to poverty and oppression.’5. Such was the loss of confidence among economic liberals following the events of the previous 20 years – the inter-war slump, the lessons of the war (including the apparent Russian lessons) – that it would be a long time before a critical mass of politicians began to make a full-bloodedly coherent or attractive case on Hayek’s behalf.
Unsurprisingly, then, the inescapable necessity of a substantial portion of the economy being in public ownership was hardly questioned for many years after 1945. Indeed, such had arguably become the prevailing activator consensus from well before the war. The BBC (1922), Central Electricity Board (1926) and BOAC (British Overseas Airways Corporation, 1939) were all examples of important new organisations being set up on a public rather than private basis, while Harold Macmillan, the rising force on the Tory left, called in The Middle Way (1938) for a programme of nationalisation at least as ambitious as that then being advocated by the Labour Party. To many, the arguments seemed unanswerable: not only were there the examples of major, palpably enfeebled industries like coal mining and the railways as clear proof that private enterprise had failed, but in economies of scale, especially as applied to utilities (the so-called natural monopolies), there was an even more powerful siren call, very much reflecting what the political economist John Vaizey would term the prevailing ‘cult of giganticism’. During the last year of war, a quite sharp leftwards shift in the Labour Party – identifying public ownership with both economic efficiency and, in an ominously fundamentalist way, socialist purity – resulted in a fairly ambitious shopping list in Let Us Face the Future, featuring the Bank of England, fuel and power, inland transport, and (most contentiously) iron and steel, though with the high-street banks, heavy industry and building all excluded.
What sort of nationalisation would it be? The key text was the 1933 treatise Socialisation and Transport by the leading Labour politician Herbert Morrison, creator of the London Passenger Transport Board and, in due course, grandfather of Peter Mandelson. Notably short of hard economic analysis, Morrison’s book nevertheless put forward a plausible enough public-corporation model that envisaged publicly appointed managers running monopoly industries in the public interest, though in a more or less autonomous way. Morrison did not have any truck with the notion of democratic control over these nationalised industries – certainly not democratic control as exercised from the shop floor. ‘The majority of workmen are,’ he insisted, ‘more interested in the organisation, conditions, and life of their own workshop than in those finer balances of financial and commercial policy which are discussed in the Board room.’6. The assumption was that the managers of these public corporations would be exemplars of scrupulous, objective professionalism – and that the workers in them should know their place.
A similar faith in the beneficent, public-minded expert underlay the creation of the modern welfare state. There was in December 1942 no greater expert than Beveridge himself, who summarised his Report as ‘first and foremost, a plan of insurance – of giving in return for contributions benefits up to subsistence level, as of right and without means test’. This last point was crucial, given the widespread detestation of means testing, in its many forms, that had developed between the wars. And this in practice meant that the social insurance provided – essentially against loss or interruption of earnings due to unemployment, sickness or old age – would be universal. Beveridge’s proposals engendered serious consternation on the part of Churchill, most Conservative MPs and some top Whitehall officials. But by March 1943 it was clear, following a clutch of by-elections, that there was an unignorable head of steam behind them. That month, Churchill – in a broadcast called ‘After the War’ – solemnly promised ‘national compulsory insurance for all classes for all purposes from the cradle to the grave’ – not the first use of that striking phrase but the one that made it famous. There were still plenty of debates and committees to go through, but by the time the war in Europe ended, family allowances – the first of the Beveridge-inspired pieces of legislation, providing 5s a week (more than 5 per cent of the average male wage) for each child from the second onwards – were virtually on the statute book.
From the perspective of more than half a century later, three of Beveridge’s central assumptions are especially striking, starting with what one might call the ‘Nissen hut’ assumption. Beveridge’s insistence that contributions be levied at a flat rate, rather than in the earnings-related way that tended to be adopted in other advanced industrial economies, was perhaps appropriate in an age of austerity; it would prove far less so in an age of affluence, with its inflationary implications and, above all, its financially onerous concept of relative poverty. Secondly, there was Beveridge’s assumption that married women would – following their wartime experience – return to and stay at home, given that their prime task was to ‘ensure the continuation of the British race’, which at ‘its present rate of reproduction . . . cannot continue’. In administrative terms this meant that a married woman would be subordinate to her husband, with benefits to her coming only as a result of his insurance. Beveridge’s third, equally Victorian assumption, befitting a Liberal who was already in his teens when Gladstone had been Prime Minister in the 1890s, was that in the post-war world enhanced rights would be matched by enhanced responsibilities. Not only did he insist that his social-security system be contribution-based rather than tax-based, but he was also determined that his ultimate safety net of means-tested national assistance would be pitched at such an unattractively minimalist level that it would ‘leave the person assisted with an effective motive to avoid the need for assistance and to rely on earnings or insurance’. And he added sternly that ‘an assistance scheme which makes those assisted unamenable to economic rewards and punishments while treating them as free citizens is inconsistent with the principles of a free community’.7. Beveridge’s welfare state – a term not yet invented but one that he would come to loathe – was not, in short, to be a soft touch.
Integral to the Beveridge vision of the future was a free and comprehensive national health service. The key propagandist, in terms of preparing the intellectual ground for such a development, was undoubtedly Richard Titmuss – a remarkable person who would become (in Edmund Leach’s words) the ‘high priest of the welfare state’. Titmuss was still a young man, the son of a failed farmer-turned-haulier, when he researched and wrote Poverty and Population (1938), which he somehow managed to do while holding down a full-time job as an insurance actuary. In it he examined the depressed areas of industrial Britain and showed in irrefutable detail the appalling human wastage resulting there from poverty and inequality. Other books followed, including (soon after Beveridge) Birth, Poverty and Wealth (1943), which put infant mortality under the microscope of social class and found that each week almost 2,000 lives were lost unnecessarily. ‘The writings of Titmuss set a new standard,’ the historian of the NHS has written. ‘Their influence was extensive and immediate. His method of demonstrating inequalities found its way into popularisations aimed at various classes of reader.’
In February 1944 the Conservative Minister of Health in Churchill’s coalition government, Henry Willink, issued a White Paper that spoke of ‘the need to bring the country’s full resources to bear upon reducing ill-health and promoting good health in all its citizens’ – in effect making it clear that a post-war Conservative administration would bow to Beveridge’s wishes and introduce a national health service. Nevertheless, ‘there is a certain danger in making personal health the subject of a national service at all,’ the document added. ‘It is the danger of over-organisation.’ One way in which Willink intended to minimise that danger was through combining free, universal access on the one hand with diversity of provision on the other – above all through not nationalising the hospital stock as a whole, maintaining instead a mixture of voluntary and municipally run hospitals.
The attitude of the medical profession to all this was ambivalent. It broadly accepted the case for a free and universal health service, but it was understandably reluctant to abandon its profitable private work, feared political interference (whether at a local or at a national level) and – on the part of GPs, who usually operated solo – saw in the increasingly fashionable nostrum of the health centre a dastardly socialist plot. ‘We have entered a new era of social consciousness,’ the Spectator – hardly noted for left-wing views – observed in the spring of 1944. ‘Some of the doctors seem not to have realised that fully, and it is desirable in everyone’s interest that they should.’8. A year later there was still a significant degree of consciousness-raising to be done.
If in health there was still much to play for by 1945, the same was rather less true in education, where in outline anyway the post-war settlement had already taken shape. In a flurry of wartime action, it had three main elements: the Norwood Report of 1943, which examined what should be emphasised in the curriculum at secondary schools and (to the private satisfaction of the President of the Board of Education, Rab Butler, in theory a reforming Conservative) plumped for the time-honoured virtues of PE, ‘character’ and the English language, as opposed to anything more technical or modern; the Butler Act of 1944, which vastly expanded access to free secondary education; and, from the same year, the Fleming Report on the public schools, which in retrospect represented the spurning of a realistic chance to seek the abolition of the independent sector.
Relatively few people at the time appreciated the negative significance of Norwood and Fleming, amid a general preference for concentrating on provision and numbers, whereas even at its outline stage the Butler legislation was widely seen as historic. ‘A landmark has been set up in English education,’ the Times Educational Supplement declared. ‘The Government’s White Paper promises the greatest and grandest educational advance since 1870.’ The paper’s editor, the progressive-minded Harold Dent, claimed that the government now accepted two key principles – ‘that there shall be equality of opportunity, and diversity of provision without impairment of the social unity’ – and boldly prophesied that ‘the throwing open of secondary education, of various types, to all’ would ‘result in a prodigious freeing of creative ability, and ensure to an extent yet incalculable that every child shall be prepared for the life he is best fitted to lead and the service he is best fitted to give’.
Did that innocuous phrase ‘of various types’ catch some eyes? Quite possibly, for although Butler’s subsequent legislation would have nothing specific to say about different types of secondary school within the state sector, the fact was that at the very time of his White Paper the Norwood Report was not only enshrining as orthodoxy a tripartite system of grammar schools, technical schools and secondary moderns but explicitly avowing that ‘in the Grammar School the pupil is offered, because he is capable of reaching towards it, a conception of knowledge which is different from that which can be and should be envisaged in other types of school’. A former headmaster of Bristol Grammar School, Marlborough College and Harrow School, Sir Cyril Norwood had no qualms about pecking orders. In fact, there was an incipient movement under way in favour of the comprehensive school (or the ‘multilateral’, as it was then usually called), a movement in which Dent cautiously participated; yet even in one of English society’s more egalitarian phases, such a concept was far removed from practical politics. Significantly, when Dent in early 1944 wrote a pamphlet entitled The New Educational Bill, he neither questioned tripartism nor mentioned the comprehensive alternative.
There seems, moreover, to have been a similar lack of concern about the inevitable selection implications of a tripartite structure. ‘The Government hold that there is nothing to be said in favour of a system which subjects children at the age of 11 to the strain of a competitive examination on which not only their future schooling but their future careers may depend,’ wrote Dent about the White Paper in wholly sanguine mode. ‘In the future, children at the age of 11 should be classified, not on the results of a competitive test, but on assessment of their individual aptitudes largely by such means as school records, supplemented, if necessary, by intelligence tests, due regard being had to their parents’ wishes and the careers they have in mind.’ Just in case anyone was worried, he added that there would be arrangements for children to transfer at 13 in the unlikely event of a mistake having been made two years earlier.9.
If for Keynesians, social reformers and educationalists the war provided unimagined opportunities for influencing the shape of the future, this was even more true for architects and town planners and their cheerleaders. In their case a momentum for fundamental change had been building inexorably between the wars, and now the heady mixture of destruction and reconstruction gave them their chance. That gathering impetus was perfectly encapsulated as early as 1934 by a young architectural writer answering the question ‘What Would Wren Have Built Today?’ After diagnosing the City of London as overcrowded, badly lit and generally impossible to work in either efficiently or pleasantly, he went on:
We must give up the building rule which restricts the height of buildings, and we must not only do that, but we must build office blocks twice as high as St Paul’s, and have green spaces and wide roads in between the blocks . . . Two dozen skyscrapers, though they would obviously dwarf St Paul’s, would not take away from its beauty if they were beautiful themselves. They would alter the skyline, certainly, yet we should not sacrifice health, time, and comfort to one skyline because we have not the courage to create another.
The author of this confident, uncompromising clarion call? John Betjeman, that future doughty conservationist.
Crucially, this rapidly swelling appetite for the new embraced not only the horrors (real and perceived) of the unplanned Victorian city – above all, understandably enough, the horrors of the industrial slums. It also addressed the much more recent blight, as received ‘activator’ opinion had it, of the suburbs, sprawling outwards through the 1920s and 1930s, especially around London, in a spectacular and apparently unplanned way. They were, declared the Welsh architect Sir Clough Williams-Ellis in 1928, full of ‘mean and perky little houses that surely none but mean and perky little souls should inhabit with satisfaction’, while ten years later, according to Osbert Lancaster (cartoonist, architectural writer and coiner of the derogatory term ‘Stockbroker Tudor’), the certainty that the streets and estates of the suburbs would ‘eventually become the slums of the future’ unless they were obliterated did much ‘to reconcile one to the prospect of aerial bombardment’. Even George Orwell could not see their point. In his last pre-war novel, Coming Up for Air, he wrote contemptuously of ‘long, long rows of little semi-detached houses’, of ‘the stucco front, the creosoted gate, the privet hedge, the green front door’, of ‘the Laurels, the Myrtles, the Hawthorns, Mon Abri, Mon Repos, Belle Vue’, and of the ‘respectable householders – that’s to say Tories, yes-men, and bum suckers who live in them’. To someone like Thomas Sharp, a planning consultant as well as a university lecturer in architecture and town planning, ‘suburbia’ – where by the end of the 1930s about a quarter of the population lived – was complete anathema; without compunction he condemned ‘its social sterility, its aesthetic emptiness, its economic wastefulness’. In short: ‘Suburbia is not a utility that can promote any proper measure of human happiness and fulfilment.’
Sharp had been implacably anti-suburb through the 1930s, but this particular broadside was published in Town Planning, an influential 1940 Pelican paperback. ‘However little can be done in wartime towards the achievement of the ideals I have tried to set out, it is essential that we should get our minds clear now as to what we are going to do when the war is over,’ he stressed. ‘The thing is there for us to do if we will. We can continue to live in stale and shameful slum-towns. Or in sterile and disorderly suburbs. Or we can build clean proud towns of living and light. The choice is entirely our own.’ Two years later, When We Build Again (a documentary focusing on Bournville Village in Birmingham) was even more idealistic. ‘There must be no uncontrolled building, no more ugly houses and straggling roads, no stinting of effort before we build again,’ declared the film’s narrator, Dylan Thomas, who also wrote the script. ‘Nothing is too good for the people.’ The Beveridge Report did not concern itself specifically with town planning, but in February 1943 – the same year that a bespoke Ministry of Town and Country Planning was set up – it was Beveridge who opened a notable exhibition, Rebuilding Britain, at the National Gallery. ‘How can the war on Squalor be won?’ asked the accompanying catalogue, referring to one of the five evil giants that Sir William’s report hoped to slay. The answer was sublime in its certainty: ‘The very first thing to win is the Battle of Planning. We shall need to have planning on a national scale, boldly overstepping the traditional boundaries of urban council, rural council, County Council. Boldly overstepping the interests described so often as vested.’