
The People's Republic of Walmart


by Leigh Phillips


  A high-priority program aiming to increase penicillin yields was placed under the direction of the Fermentation Division of the Department of Agriculture’s Northern Regional Research Laboratory (NRRL) in Peoria, Illinois, a move that proved vital to the innovations that made large-scale production of penicillin possible. Howard Florey, the Australian pharmacologist—who, along with German-born British biochemist Ernst Chain and Alexander Fleming, would go on to win the 1945 Nobel Prize for Medicine for the development of penicillin—visited a number of pharmaceutical companies to try to interest them in the drug, but he was disappointed in the results. The Committee on Medical Research (CMR) of the Office of Scientific Research and Development (OSRD)—created in June 1941 to ensure that as war approached, the appropriate amount of attention was directed toward scientific and medical research relating to national defense—convened a meeting with the heads of four drug firms to impress upon them the urgency of their involvement and assure them of government assistance. The response, however, was pessimistic. It was only during the second such conference, ten days after the attack on Pearl Harbor, that the argument was won. Crucially, the government obtained agreement for the sharing of research between the different actors through the CMR—a cooperative development that proved decisive in the scaling-up of production as each company solved different aspects of the overall problem, each in itself a problem from hell. As Pfizer’s John L. 
Smith characterized it, “The mold is as temperamental as an opera singer, the yields are low, the isolation is difficult, the extraction is murder, the purification invites disaster, and the assay is unsatisfactory.” Despite the successes of initial production under OSRD auspices, the manifest utility of this wonder drug to the war effort, ahead of the invasion of occupied Europe, prompted the War Production Board in 1943 to take over direct responsibility for cranking up production. The board directed twenty-one companies to participate in its aggressive expansion of penicillin production, each of which received priority on construction materials and supplies. In time of war, government leaders did not trust the private sector to be up to the task: the supply of all penicillin that was produced was controlled by the WPB, which distributed it to the armed forces and the US Public Health Service. Production soared from 21 billion units in 1943 to 1.7 trillion units in 1944 (in time for the D-Day landings at Normandy), to some 6.8 trillion units at war’s end.

  With the war’s conclusion in 1945, planning was rapidly abandoned, departments were shuttered and government plants were sold off to private industry. Paradoxically, however, US corporations ended the war stronger than they began it. Elephantine contracts from government, price supports and relaxed anti-trust laws all worked to boost profits and grow corporations. The wartime planning regime needed to get business onboard, so throughout the war, while government bureaucrats made some of the top-level decisions, business still controlled production. The war ultimately enabled a capital-friendly version of planning: production was still mainly carried out by large firms belonging to even larger cartels, albeit with a significant dose of government rationing. At the same time, the scope of economic planning carried out inside corporations increased.

  The combination of bigger government and bigger corporations that emerged from World War II led even those on the right to question whether capitalism would give way to some form of economy-wide planning. Hayek’s fellow traveler Joseph Schumpeter famously thought that the replacement of capitalism by some form of collectivist planning was unavoidable. A fervent anti-socialist, Schumpeter nevertheless saw how the capitalism of his time was aggregating production and creating ever-larger institutions—not just firms but also government agencies—that planned internally on ever-larger scales. He thought it was only a matter of time before bureaucratic planning overtook, through its sheer weight, the dynamism of the market. The rise of Keynesian economic management and the experience of wartime planning convinced Schumpeter that a transition to the socialism he despised was inevitable, if not imminent.

  Instead, the onset of the Cold War after 1945 produced a fervent official anti-Communism, alongside a narrow, technocratic vision of economic management. The government saw good in increasing productivity, and even in coordination between businesses; but any move to extend democracy to the economy was bad. Elite concern about a growing militancy, both among rank-and-file soldiers still in Europe and workers in the United States, meant that even as official rhetoric extolled loudly the virtues of free market capitalism, in practice, the American welfare state expanded. As with Western Europe’s emerging welfare state, elites grudgingly accepted social reform as the lesser evil to the immediate threat of social revolution. Business compromised: government would play a larger role in the economy, supporting basic innovation and ensuring that the final products and services produced by business found markets, while at the same time professing unwavering support for the free market.

  The central hotbed of publicly planned innovation was the postwar Pentagon, coordinating government agencies that would prove responsible for the initial development of computers, jet aircraft, nuclear energy, lasers, and, more recently, much of biotechnology. Its approach built upon the method of partnership between government and science for basic and applied research that was pioneered by the Manhattan Project of the United States, the UK and Canada during the Second World War. With the Soviet launch of Sputnik in 1957, as Mariana Mazzucato argues, senior figures in Washington were petrified that they were falling behind technologically. Their immediate response was the creation, the following year, of the Defense Advanced Research Projects Agency (DARPA), an agency that—along with allied agencies that the Pentagon viewed as vital to national security (including the Atomic Energy Commission and NASA)—would support blue-sky research, some of which might not produce results for decades. DARPA oversaw the creation of computer science departments throughout the 1960s, and in the following decade, it covered the high costs of computer chip prototype manufacture at a lab at the University of Southern California.

  Mazzucato lists twelve crucial technologies that make smartphones “smart”: (1) microprocessors; (2) memory chips; (3) solid state hard drives; (4) liquid crystal displays; (5) lithium-based batteries; (6) fast Fourier transform algorithms; (7) the internet; (8) HTTP and HTML protocols; (9) cellular networks; (10) Global Positioning Systems (GPS); (11) touchscreens; and (12) voice recognition. Every last one was supported by the public sector at key stages of development.

  We see a similar phenomenon within the pharmaceutical sector, but this time with respect to the crucial role played by government labs and public universities in the development of radical new drugs, known as “new molecular entities” (NMEs)—particularly those given a “priority” (P) rating—as opposed to the cheap-to-develop and therefore more profitable “me too” drugs (existing treatments with the formulas tweaked slightly, which are favored by Big Pharma). Mazzucato quotes Marcia Angell, former editor of the New England Journal of Medicine, who argued in 2004 that while the large pharmaceutical companies blame high drug prices on exorbitant research and development costs, the reality is that it was government-funded labs that were responsible for some two-thirds of the NMEs discovered in the previous decade. One must go beyond the concession that private pharmaceutical companies have been unproductive and declare that in the war against disease, they have been absent without leave for decades.

  It is all reminiscent of Karl Marx’s simultaneous admiration and condemnation of the capitalism of the nineteenth century. How furious he was that such an incredible system, more productive than feudalism or slavery or any other previous economic structure, could also be so inexorably restricted, so bounded, so lazy with respect to what it could produce. All these possible things (whether knowns, known unknowns, or Rumsfeldian unknown unknowns) that could so benefit humanity would never be manufactured so long as they were unprofitable, or even just insufficiently profitable! This was what Marx meant when he raged against the “fettering of production.” Human progress, the expansion of our freedom, has thus far been held back by this irrational system.

  6

  NATIONALIZATION IS NOT ENOUGH

  On July 5, 1948, the National Health Service Act, establishing the world’s first universal, public and free healthcare system, came into effect in the UK. Despite the Labour government’s passage of the act two years previously, the formal creation of the NHS remained deeply uncertain and a source of fractious debate until the moment of its arrival. In a speech to Parliament on February 9, 1948, Aneurin Bevan, the Labour minister for health, exhorted his colleagues:

  I think it is a sad reflection that this great act, to which every party has made its contribution, in which every section of the community is vitally interested, should have so stormy a birth … We ought to take pride in the fact that, despite our financial and economic anxieties, we are still able to do the most civilized thing in the world—put the welfare of the sick in front of every other consideration.

  The story of the British NHS is, however, much more than a story about caring for the sick. It is a century-long saga of the struggle for some form of democratically controlled planning under capitalism—a major reason for the tempestuousness of its birth and the conflicts it continues to engender. Radical enough, but not revolutionary, the NHS signaled the potential for a slow erosion of the market in a major sphere of life. It raised the possibility of a democratic planning that initially coexists with capitalism—an embryo of the new world developing within the confines of our old, tired one. But just as we’ve already seen how the simple act of planning—even on the vast scales undertaken by the likes of Walmart or Amazon—is not enough, it turns out that simply placing planning in the hands of the state is likewise insufficient for this embryo to really flourish.

  “Nye” Bevan, as supporters affectionately called the charismatic leader of Labour’s left wing who was tasked with establishing the NHS following Labour’s landslide election victory in July 1945, famously said that “the NHS is socialism.” Before its creation and throughout its history, many of the NHS’s opponents have seen it that way as well and have acted accordingly. While the NHS is today the fourth-largest employer in the world, directly employing 1.4 million staff and outpolling every other institution—including the monarchy—in popularity among Britons, it is also, sadly, living proof of how a dream of a universal, publicly run service has been compromised, reduced to a hobbled mess of public and private institutions crisscrossed by markets. It is an example of far-reaching potential stymied.

  Yet even at its best—and for all the compassion it embodies, and lives it has improved and saved—the NHS has fallen short of the horizon of democratic possibility. A short history of how this imperfect institutional expression of human decency, this real yet incomplete democracy came to be, and how it planned, offers much more than an abstract badminton match of ideas between libertarian and statist versions of socialism.

  The story of the NHS begins not in the halls of the British Parliament at Westminster, but in the mining villages and industrial towns born of the human sweat that powered the Industrial Revolution. Before the NHS, healthcare was largely a luxury. The wealthy hired personal doctors; the rest simply did without or depended on the modicum of relief provided by churches or the state. Local governments set up rudimentary hospitals for the poor, but they were at best insufficient, at worst more akin to prisons. They often kept the sick and the infirm separated from the rest of society, rather than cure them—sweeping the unemployed and unemployable under a squalid, fetid rug and calling it charity.

  As a counter to this injustice, working-class organizations of all kinds began to experiment with mutual aid. Workers formed “friendly societies,” pooling together small monthly dues from individual workers to pay doctors and run occasional free clinics. As they grew, some societies could hire full-time doctors and even build their own clinics, offering care to entire families, rather than just (mostly male) workers. This people’s healthcare was most advanced in the coal-mining valleys of South Wales, where working-class culture thrived. By the early twentieth century, even little cottage hospitals were springing up alongside the black pits.

  It was this spirit of mutual aid that allowed communities to survive economic downturns. Unemployed miners were put to work doing administrative tasks such as collecting fees—themselves reduced during such times—and doctors were also forced to take a pay cut in proportion to a society’s lower income. This simple solidarity kept services intact, even when money was short. Worker-run clinics in Wales and across the UK were among the first large-scale insurance schemes for healthcare, predating both national public insurance (as in Canada or France) and private insurance (as in the United States). The working class organized itself to deal collectively with a problem that affected every individual, but with which no individual could deal on their own. It was socialized medicine in embryo.

  As workers became more organized, these mutual aid–financed clinics grew still further in scale and number. Membership was opened up to entire communities, beyond just miners and their families. In turn, and through unions, workers made demands, not only on bosses for better working conditions, but also on government for radical redistribution of resources, including the establishment of healthcare as a right. In essence, this would be a public healthcare system: the same phenomenon of mutual aid extended to all of society and, crucially, requiring those with greater means to pay a greater share of the finance. Pushed to act to contain such broader demands and the spread of socialist ideas, the UK government created, in 1911, a limited national insurance scheme. This first attempt at publicly funded healthcare, however, was far from comprehensive: even after two decades, National Insurance covered just 43 percent of the population, the majority of them working-age men.

  Today, doctors can be some of the strongest defenders of public healthcare, helping us recognize, for example, that vaccinations will not deliver the crucial defense of herd immunity unless an entire community is vaccinated. But at the time, it was not only the wealthy, as one might presume, but also most doctors that opposed the establishment of public healthcare. The former did not want to shoulder new taxes to pay for universal services that would disproportionately benefit the poor and working majority; the latter feared that a national scheme would not only reduce their incomes but also challenge their managerial control over what medical care looked like.

  Both fears were warranted. As they expanded, worker-run schemes did indeed start to challenge the absolute power of doctors over medical care. Worker societies did not so much target individual clinical decisions—rather, they increasingly wanted a say in planning, in how resources were allocated. Would new money go into building clinics or hiring nurses—or into savings accounts held by doctors? The most forward-thinking societies advocated for doctors to become salaried workers rather than contractors—people thus invested in the expansion of medical practice, rather than that of personal fortunes. As with any other sector, medicine has its own logistic specificities. Decisions have to be made about where clinics are located, how to divide tasks between nurses and doctors, which afflictions should be prioritized, and so on. To have a say over these things goes beyond simple redistribution of resources; rather, British workers were demanding that an entire sector of the economy be democratized.

  Doctor Knows Best

  The barriers to change were formidable. Medical care was (and often remains) largely paternalistic: doctor knows best, and patients are to do as they are told. Doctors are also typically small-business people, and not just in the UK. They decide much more than which prescription to write; they have influence over where clinics are established, which medical technology to use, and what counts as a legitimate health need and what doesn’t. Of course, within the confines of the operating or examination room, doctors are legitimate experts. They have specialized skills and knowledge furnished by years of medical training. Contrary to the claims of modern-day charlatans, the advent of medical science unquestionably represented a qualitative leap beyond the magical thinking and credulity that preceded it. The medieval notion that four humors in imbalance cause illness cannot compete with the germ theory of disease. As the lyrics of “The Internationale,” the socialist hymn, famously command: “For reason in revolt now thunders, / And at last ends the age of cant! / Away with all your superstitions, / Servile masses arise, arise!”

  Even so, doctors are not the only medical experts. Although nurses were key to the provision of care in the early twentieth-century United Kingdom, nursing was seen as less valuable because it was associated with femininity and low skill. Subjugated in society as women, nurses long played a subordinate role in hospitals and had little input into the shape of a system that would quickly stall without them. At a bare minimum, democratization would have to encompass all the workers involved in producing healthcare.

  But health and disease stretch far beyond the four walls of a clinic or hospital, and beyond the medical knowledge of health practitioners; they are not a single, isolated compartment of our lives. For example, whether someone contracts lung disease may depend on pollution as much as it does on the responses of the health system, as epidemiologists will be the first to remind us. Chronic disease during old age depends on a whole life history, reaching back through quality of social integration as an adult to childhood nutrition and primary education. Work-related injuries are highly dependent on the kind of work we do and the kinds of safety protections we have—from rules against asbestos to unions’ willingness to fight for them. Health researchers today call these the “social determinants of health.” While medicine can be a narrow field of expertise, healthcare encompasses everything we do. It is not just an individual responsibility but is deeply impacted by what society looks like and the level of its collective decision making. What, for example, counts as a legitimate health concern, and what can be dismissed? Are you depressed because of who you are, or because you’re working two mind-numbing jobs at minimum wage? Is it you, or is it capitalism?

 
