
The English: A Social History, 1066–1945 (Text Only)


by Christopher Hibbert


  For middle-class women the number of occupations available was growing – if growing slowly – all the time. The struggle to enter the professions had admittedly been a hard one. Elizabeth Garrett Anderson and Sophia Louisa Jex-Blake, two leading pioneers of medical education for women, had both experienced great difficulty in qualifying, since no medical school in England would accept them as pupils nor any examining body as candidates. Elizabeth Garrett Anderson eventually took the examinations of the Society of Apothecaries, having been advised that the Society’s charter made refusal impossible; but no sooner had the Apothecaries reluctantly granted her the necessary licence in 1865, than they changed their constitution. She was consequently for many years the only female member of the British Medical Association. Miss Jex-Blake, sister of the headmaster of Rugby, after studying medicine in Edinburgh, was thwarted in her attempt to sit for examination by the College of Surgeons by the prompt resignation of the entire body of examiners. It was not until 1877, three years after she had founded the London School of Medicine for Women, that clinical experience was at last made available for women by the London (afterwards the Royal) Free Hospital. It was not until 1919 that restrictions upon women entering the professions, including law as well as medicine, were removed by the Sex Disqualification (Removal) Act; not until 1922 that a woman was called to the English Bar; and not until 1929, ten years after Lady Astor had become the first woman to take her seat in the House of Commons as Conservative Member for the Sutton division of Plymouth – formerly represented by her husband – that a woman became a Privy Councillor and Cabinet Minister when Margaret Bondfield was appointed Minister of Labour. She held office for only two years, however; and it was many years before another woman held a Cabinet appointment.

  In the 1920s and 1930s there were women mayors and magistrates to be found in every county; but their numbers were very small. Twenty-four women were elected to Parliament in 1945; but this was still only a very small fraction of the number of men. Indeed, women were still generally far from sharing equal rights with men. A series of enactments had certainly removed the more obtrusive of women’s legal disabilities. They no longer had to prove grounds other than adultery for obtaining a divorce; and they could possess and sell property on the same terms as men. Their clothes also suggested a new freedom. Skirts became much shorter; waistlines, liberated from constriction, gradually disappeared altogether; silk stockings became commonplace, underclothes far lighter and more alluring, sports clothes more practical, swimming costumes more revealing, hairstyles much simpler. Before the First World War the suffragettes had been urging women to discard the long tresses and elaborate coiffures which were symbols of female bondage; and during the war long hair came close to being considered unpatriotic, since women in the services and nurses were not permitted it and women working with machinery would have found it dangerous. After the war short hair became fashionable: shingling, the fringe and the boyish style known as the Eton crop were all the rage. Lips were more vividly painted, for a time in the artificial shape of the cupid’s bow; nails were coloured, lacquered and varnished; eyebrows were plucked and replaced by pencilled lines. By the early 1930s about £500,000 a year was being spent on promoting beauty preparations in the rapidly expanding advertising business which, by the time the Second World War broke out, was spending in all nearly £10 million a month.24

  Yet, while many of the aims of the early feminists had been achieved by 1939, and while the war brought about further improvement in their lot, in 1945 women still laboured under numerous disadvantages. In many occupations, including the Civil Service, they received less than men for the same work; they were expected to abandon their appointments when they married; and, even when they were permitted to stay on, they were most unlikely to find their posts held for them if they left them to give birth.

  Their children were, at least, far more likely to survive infancy than the children of their great-grandparents had been; and they themselves were far less likely to die from puerperal fever or from any other of the dangers attending childbirth in the past. The treatment of sickness generally had dramatically improved since Florence Nightingale had observed in 1863 that ‘the very first requirement in a hospital is that it should do the sick no harm’.25 At that time few hospitals had met that requirement; and it was often as difficult to get into those that did as to survive treatment in those that did not. In 1859 the rules of Salop Infirmary had provided

  That no woman big with child, no child under seven years of age (except in extraordinary cases, such as fractures, stone, or where couching, trepanning or amputation is necessary), no persons disordered in their senses, suspected to have smallpox or other infectious distemper, having habitual ulcers, cancers not admitting to operation, epileptic or convulsive fits, consumptions, or dropsied in their last stage, in a dying condition, or judged incurable, be admitted as inpatients, or inadvertently admitted, be suffered to continue.26

  Patients at other hospitals were liable to be denied food or to be discharged for breaking such rules as those established by the governor of Guy’s Hospital who ran it ‘despotically’ for half a century and provided by Rule V that ‘if any patient curse or swear or use any prophane or lewd talking, and it was proved on them by two witnesses, such patient shall, for the first offence, lose their next day’s diet, for the second offence lose two days’ diet, and the third be discharged’.27 The loss of diet, however, was considered no great punishment. In most large London hospitals the food provided consisted of a pint of water gruel or porridge for breakfast, eight ounces of meat or six ounces of cheese for dinner, and broth for supper. Patients might also receive up to a pound of bread a day and two to three pints of beer, but no vegetables or fruit.28

  Those who were denied admission to hospital could frequently regard themselves as fortunate. Even in the 1890s there were numerous institutions like the workhouse in which the editor of the British Medical Journal found the sick ‘lying on plank beds with chaff mattresses about three inches thick between their weary bodies and the hard uneven planks… Some idiots and imbeciles share the wards with these patients. The infants occupy a dark, stone-paved room, bare of furniture, with no rug for the babies to crawl or lie upon and no responsible persons to see to their feeding and cleanliness.’

  Most hospitals, in fact, were extremely bleak and insanitary institutions where the standard of nursing was abysmal, where drunkenness and sexual escapades were common, and where the risk of cross-infection was acute. Standards of nursing gradually improved after reforms initiated by the Anglican Nursing Sisterhoods – in particular by Sister Mary Jones, whose nurses were responsible for tending the patients of King’s College Hospital – and by Florence Nightingale whose work in the Crimean War had brought her fame, influence and a decisive say in the disposal of large sums of money. The Nightingale School of Nursing was opened in 1860 at St Thomas’s Hospital which was rebuilt on a new site a few years later to designs approved by Miss Nightingale. The building cost £330,000 and provided beds for nearly 600 patients in seven detached blocks.

  Every ward on each floor has two hydraulic lifts: one small lift for food or medicines, one larger one for taking up patients or nurses [The Illustrated London News reported]. Every ward has its own bathrooms, lavatories and closets detached from others and its separate shoots for sending down dust and ashes… dressings and other things. Natural ventilation is as much as possible depended on… All the building is fireproof… The walls of each ward are coated with Parian cement which, while not as cold, is almost as hard and non-absorbent, and quite as smooth, as marble.

  When the new St Thomas’s Hospital was opened by Queen Victoria in 1871 there were far too few hospitals in the country to provide for the needs of the sick, even though the State’s responsibility for providing hospitals for the poor had been recognized by the Metropolitan Poor Act of 1867. Ten years before the opening of St Thomas’s, when the population of England and Wales was rather more than 20 million, there were less than 12,000 beds in 117 hospitals, compared with 250,000 beds in some 3000 hospitals in 1939; and as late as 1896 a census of the sick poor revealed that of 58,550 cases only just over 22,000 were in general infirmaries, the rest being looked after in workhouses.29

  Most people, therefore, even when seriously ill, had to be looked after at home, as they had been at the beginning of the century when there were only about 3000 patients in hospitals in the entire country.30

  The first district nurse did not appear until 1857; and, a generation later, the only doctors available to the destitute were the ‘parish doctors’, practitioners of scant competence employed, usually part-time, by the local Boards of Guardians. ‘The great majority of wage-earners, however,’ as F. F. Cartwright has written, ‘depended upon the sixpenny doctor, often an able but over-worked general practitioner who received patients in his surgery, gave them a cursory examination, and invariably supplied a bottle of more or less harmless medicine, bright colour and pungent flavour being the most desirable qualities.’31 He also provided his patient with a certificate enabling him to draw sick pay from any benefit society or club of which he happened to be a member.

  Towards the end of the nineteenth century death and sickness rates, particularly death rates of children and of adult deaths from tuberculosis, began to fall. At the same time surgical operations became less painful and less perilous. Chloroform had been first administered as an anaesthetic in 1847 by Sir James Simpson who had already used ether to relieve the pains of childbirth; and after 1884, when it was discovered that cocaine produced insensitivity to pain if applied to the eye, local anaesthesia was quickly developed. Antiseptic surgery with a carbolic spray was introduced by Joseph Lister in 1865, and was soon followed by the sterilization of surgical instruments.

  A fearful epidemic of influenza in 1918–20 was responsible for over 150,000 deaths, the highest in proportion to population since the cholera outbreak of 1849,32 and for a time this upset the encouraging trend of health statistics. But thereafter more nourishing diets and improvements in housing, sanitation, hygiene and water supply encouraged the rising standards of health and an increase in the average height and weight of children. So did improvements in medical skills and treatment. Insulin proved effective in the treatment of diabetes, salvarsan in that of syphilis, sulphonamides in cases of pneumonia. After the 1930s diphtheria was brought under control by immunization; and experience gained by doctors in World War II led to advances in both cosmetic surgery and blood transfusions.

  Hospital treatment, however, often proved extremely expensive. Free only for the very poor, it had to be paid for by most patients; and even those whose contributions to National Insurance were regularly maintained were not covered for certain specialist services including dentistry and ophthalmic treatment: tooth decay was consequently widespread and spectacles were on sale in large stores where customers tried on one pair after another until they found the one that suited them best. Harrods and the more expensive shops sent out sets of trial lenses ‘for the convenience of country customers’.

  The National Insurance Act, applying to some 13 million workers with wages of less than £160 a year, had come into force in 1913. But it was not until 1919 that a general health scheme began to be established by the creation of a Ministry of Health by Lloyd George’s Coalition Government, and not until the 1920s that a drastic reform of the Poor Law, under which the treatment of the sick had previously been administered, was undertaken by a Conservative Minister of Health, Neville Chamberlain. By 1938, although the National Insurance Act had been extended to cover nearly 20 million people, there were many, including the dependants of the insured, the self-employed and most of the middle classes, who were still not covered by its provisions.

  The government’s reaction to the outbreak of war in 1914 had been slow. The Defence of the Realm Act had given it extensive powers; but it was not until the end of 1916 that the Ministries of Labour, Shipping and Food were established; not until 1917 that the railways and mines came under government control; and not until February 1918 that rationing was introduced. The reaction to the threat of the Second World War was far quicker: two Emergency Powers Acts were introduced within months; new Ministries were quickly established; petrol rationing was imposed; men up to the age of forty-one were made liable to conscription; a fair distribution of clothes and certain foods was ensured by a system of rationing by ‘points’; rent controls were extended to nearly all unfurnished houses; food subsidies were introduced in December 1939; not long afterwards cheap ‘utility clothing’ appeared in the shops; air-raid shelters were made, and gas-masks distributed.

  A quite new and horrible type of warfare was expected, and immense numbers of civilian casualties were forecast. In the First World War there had been occasions when the fighting at the front seemed far removed from the relative comforts of Blighty, when the soldiers were wholly out of sympathy with the denizens of Civvy Street. Siegfried Sassoon, a brave young officer in the Royal Welch Fusiliers who, after being wounded, came on convalescent leave to London, wrote of his desire to see the jolly complacency of a theatre audience violently disrupted:

  I’d like to see a tank come down the stalls

  Lurching to rag-time tunes of Home Sweet Home

  And there’d be no more jokes in music-halls

  To mock the riddled corpses of Bapaume

  There was no such widespread feeling in the Second World War, during which 264,000 fighting men were killed, less than a third of the 908,000 who died in 1914–18. Nothing much happened on the home front at first and, when the raids did begin, casualties were not on the terrible scale that had been predicted. Yet by the beginning of January 1941 over 13,000 civilians had been killed in London and nearly 18,000 badly injured. And on the night of 10–11 May that year there were over 3000 casualties, 1436 of them fatal. In all, 60,000 civilians and 35,000 merchant seamen were killed in the war.

  As in the First World War, women played a vital role in victory. Conscription of women was introduced – as it had not been in the earlier war – at first for those between twenty and thirty years of age, then, after 1943, for those from eighteen and a half to fifty. But in practice only women between the ages of nineteen and twenty-four were called up, and these could choose between serving in the various auxiliary services – in which they would not be called upon to use a gun or other weapon unless they agreed to do so in writing – joining the Civil Defence or taking up some kind of specified civilian employment. Married women living with their husbands were exempt; so were the mothers of children under fourteen, whether married or not. In 1944 there were almost half a million women in the Women’s Royal Navy Service, the Auxiliary Territorial Service and the Women’s Auxiliary Air Force, mostly volunteers. There were only 1072 female conscientious objectors compared with 60,000 men.

  A further 200,000 women served in the Women’s Land Army, while 260,000 worked in government ordnance factories and very nearly half the posts in the Civil Service were held by women.

  It had been expected [so Arthur Marwick has written] that where women were substituted for men it would require three women to do the work of two men, but in fact a one to one substitution proved perfectly possible; this was partly because modern technology had rendered sheer strength less essential to industrial work. In a previous generation the symbol of female rights which had been gained at the end of a previous war was the vote; now, though there had been no organized movement comparable with that of the suffragists and suffragettes, the symbol was equal pay.33

  A Royal Commission on Equal Pay was accordingly appointed in the last year of the war. The members of this commission suggested that ‘conventions and prejudices’ had been ‘crumbling fairly fast in recent years’ and that the record of what women had been able to achieve in war had exerted and [would] continue to exert ‘a lasting influence in breaking down whatever elements of the old-fashioned or irrational remain in the public’s estimation of the capabilities of women’.34


  Despite these remarks the commission, with three of its women members dissenting, did not wholeheartedly recommend equal pay on the grounds that it would not be in the interests of women for them to receive it, since it would have adverse effects on the expansion of women’s employment. Nevertheless the movement for equal pay ‘had received an enormous accession of strength from the war experience’.35

  The war, by submitting men and women, rich and poor, to many of the same dangers and deprivations, had also helped to forge a national unity which had seemed unattainable in 1926. ‘Hitler,’ the London correspondent of the New York Herald reported, ‘is doing what centuries of English history have not accomplished – he is breaking down the class structure of England.’ At the same time the government’s measures to keep down the cost of living by food subsidies, rent controls and other means proved successful. Between the outbreak of war and 1945 – a period of full employment – wages increased by more than half, but the cost of living during that time rose only by 30 per cent. Most foods, except bread and potatoes, remained rationed, yet vegetables, grown everywhere, including Hyde Park and Windsor Castle, were usually in ready supply. The Ministry of Food, under the direction of Lord Woolton, a former chairman of Lewis’s, the department store, was highly effective; the meals supplied in schools, works canteens and the so-called British Restaurants were high in nutritional value. Babies were provided with concentrated orange juice.

  Although the prime minister himself was principally concerned with winning the war, others were already planning the better society which it was hoped would be created when the fighting was over. Already in 1942 Sir William Beveridge, Director of the London School of Economics, had issued a Report on Social Security in which was proposed not only a social insurance scheme, national health service and child allowances, but also measures to ensure full employment and widespread reforms in housing and education. In 1945 a Labour government under Clement Attlee was returned with a large majority to carry out a programme of limited nationalization.
