A new politics of value was being advanced with the growth of American power and it was profoundly on display in the market’s influence on laboratory agendas. At Lexington, market and scientific value merged in the efforts to cultivate valuable people and to cultivate valuable drugs. A guiding principle of the research done at Lexington was to develop drugs that might be substitutes for other drugs—value, as with money, relied as much on a substance’s fungibility, on the capacity to replace existing drugs with similar, but more strategically valuable versions, as on any intrinsic health benefit. And value in this context was part of a complex diplomatic, economic, and legal calculus.
As the tenets of international drug control were being constructed around the policing of “addictive” substances, the National Research Council’s National Committee on Drug Addiction made it a priority to collaborate with the pharmaceutical industry in an effort to “systematically set out to review all compounds that promised to achieve analgesic effects without producing physiological symptoms of tolerance and withdrawal.” Science and pharmacology promised to relieve social conflict. “The committee maintained that through substitution, industrial production of alkaloids could be ‘reduced to a minimum,’ thus lessening the police authority necessary to control the situation.”60 Thus the research at Lexington was not so much designed to rehabilitate “addicts,” who were largely considered to be unredeemable, but rather to test out new synthetic substitutes being cranked out by American pharmaceutical laboratories. Experimental drug trials for new compounds being put out by companies including Merck, Eli Lilly, Parke-Davis, and many others benefited from the research conducted at Lexington. This research reverberated through the drug regulatory regime. In justifying annual appropriations to continue the work at Lexington, the FBN commissioner reported to Congress in 1947: “Not so long ago I went to get demerol, a new synthetic drug, under control, and I had to prove that it was a habit-forming drug and it was only because of the work at Lexington that I was able to convince the Ways and Means Committee that this drug should be under control.”61
Demerol was just one of a number of drugs that fell under the purview of the international 1948 Protocol based on research into addiction being conducted at the Lexington Narcotic Farm.62 The work at Lexington helped establish both national and international definitions of “addiction” that influenced the orientation of international drug control. For a country intent on cultivating mass consumption, the policing of addiction offers a striking window onto the political economy of US power. Not only was the category of “addiction” notoriously difficult to define when detached from social and cultural understandings of it, but as a legal category it structured enforcement according to the power of racial, national, cultural, and other biases to determine whose consumption practices were targeted as a menace to the larger community. Unsurprisingly at Lexington, “The researchers were almost entirely white, upper- and middle-class professional men who experimented on poor, lower- and working-class, ethnically and racially diverse addicts.”63
This hierarchy was a direct consequence of the seemingly neutral science that had been brought to bear on the definition of addiction. The medical director of USPHS at Lexington, Kentucky, described his research conclusions on the nature of addiction before an audience of the American Psychiatric Association in 1947: “The term ‘addiction’ need not be confined to the use of substances. Persons who pursue certain practices to their own or the public’s inconvenience, harm or peril are sometimes a greater problem than those who misuse a substance. It may well be that internal or external difficulties responsible for the unwise pursuit of a practice and those responsible for the misuse of a substance are similar.”64
The social and the biological were intimately linked, he suggested, and addiction was a manifestation not merely of a drug’s impact on the human body, but of a person who already exhibited socially dysfunctional behavior. It is striking how notions of social conformity rooted in a particular model of consumer capitalism were prominently on display. As historian Nancy Campbell explains, the test subjects at Lexington were deemed socially irredeemable and, as such, incredibly valuable as “research material”—a sobering refashioning of people as human raw material inputs into the chemical laboratories of US capitalism. “Drug addicts, who occupy the social category of unproductive or even antiproductive, were rendered ‘useful’ through the exercise of scientific discipline at the [Addiction Research Center].”65 Echoing this logic, the medical director at Lexington testified before Congress in 1948, “Narcotic addiction is a public-health menace inasmuch as without control addiction spreads and persons addicted become submissive, ambitionless and abject.” The “typical symptoms of drug addiction” are evident in the “loss of self-control.” The doctor went on to describe the promise of social transformation such research might bring about: “In addition to this unconditioning and as a substitute for old habits, new habits must be built up; and for this reason the addict under treatment should be kept busy in some useful way during all his waking hours.”66
And Lexington provided an experimental context to do just that. As its original title suggests, the Narcotic Farm as a penal-research institution was also operated as a labor farm. Its institutional name would be changed to “Public Health Service Hospital” when “people began to ask where the narcotics were grown,” although it never shook the nickname “Narco.”67 Arguably narcotics were indeed being grown, or at least tested on the premises; however, the confidence behind the entranceway’s dramatic inscription (“United States Narcotic Farm”) was based on the idea of the redemptive value of productive labor. Inmates at Lexington operated a clothing factory, a furniture factory, a farm, and a patient commissary.68 The farm was intended to be self-sustaining and the other capital industries produced products “utilized by government agencies.”69 In fact, during the war the Army received articles manufactured at Lexington of “value in excess of $100,000.”70
FIGURE 8. Federal Narcotic Farm, Lexington, Kentucky. [Photo by Arthur Rothstein.]
With the revolution in drug development underway, synthetic drugs offered the opportunity to replace drugs deemed dangerous (addictive) with nonaddictive substitutes and alternatives (of course, many of the drugs produced like methadone and others turned out to be just as addictive as the drugs, in this case heroin, they were intended to replace). Similarly people deemed “antiproductive” threats to the community (convicted felons) could be put to work and, if not completely transformed themselves, might contribute to the greater social good. The logic of the laboratory was intimately linked to a vision for policing drug production and consumption, and the way this new drug market incorporated people’s bodies reveals much about the cultural and social implications of capitalist-driven modernization.
ALCHEMY IN THE ANDES
Policing and regulation were not tied simply to limitation and repression, but also to the positive production of capitalist consumer habits. This was true in both the United States and the Andes, at both ends of the economic circuit through which coca commodities flowed. In particular the policing of drug production and consumption was guided largely by identifying “addiction” as the benchmark for designating select drug consumption “illegitimate.” The deployment of a language of “addiction” to attack certain contexts of drug consumption relied on identifying the bodies of people deemed socially threatening to become the test subjects for drug development and social engineering across the Americas. The research conclusions devised from human drug experimentation in the United States would be replicated in US-led Andean projects of social engineering. US national strategic priorities incorporated a commitment to US-manufactured drugs as a component of encouraging specific models of modernization and development, defining the rights of citizenship, and controlling the physical bodies of people targeted for necessary social and cultural transformation in the United States, Peru, and Bolivia. The alchemy of empire consisted of more than simply the control of plants and physical material. It was also a process of cultivating the necessary social order, modeling the laboring and consuming habits of populations to quite literally fuel and sustain the envisioned economic transformations. In this regard, drugs were not merely valued as “commodities of commerce” but were deemed critical tools of international diplomacy and even, for an array of public and private officials, the very basis for pursuing US national security.
American laboratories had an impact on both the logic and material practices of US development policy internationally. The possibilities of laboratory transformation were explored not only in penal institutions in the United States, but also in peasant communities across the Andes where fears of social upheaval were countered by US-oriented development projects. World War II profoundly transformed the pharmaceutical industry with massive government subsidies to this strategic materials sector, fueling new drug developments and the expansion of US pharmaceutical markets around the world. Traveling with troops and aided by agents from the US Board of Economic Warfare, US pharmaceuticals replaced those of their former, now defeated, primary competitors, the Germans and the Japanese. In the aftermath of the war, the task of familiarizing the “underdeveloped” world with American-manufactured drug products assumed an increased urgency in the context of burgeoning Cold War Soviet and US competition over spheres of influence. Many US-backed research institutions, businessmen, scientists, anthropologists, public health officials, police and military personnel, and diplomats aggressively embraced this challenge and grounded it in an ideological vision of US world leadership. As John T. Connor, president of pharmaceutical giant Merck & Co., Inc., warned an audience at the American Management Association in 1958: “The Soviet is at least as well equipped medically as it is economically to match us in underdeveloped countries. . . . And when this well-staffed army sallies forth from its borders—as it will—carrying the nostrums of Communism in its medical kit, it will have a proposal to make that could be quite appealing. Reorganize your state along our lines, the proposal would go, and you, too, can do what we did—make the fastest progress in health achieved by any large nation in modern times.”71
The fear of a spreading sympathy for communism combined advantageously with the capitalist ambition to profit from social reorganization. These US visionaries found welcome partners among the Andean political and economic elite who embraced US ties and “viewed economic development as a bulwark against communism.”72 In the context of competing Cold War economic initiatives, Merck’s president went on to explain what this development entailed: “the Bolshevik planners were right when they decided to pour enormous effort into their human capital on the theory that better health as well as better education would have to precede better output. . . . This concept of the relation between human capital and economic growth could turn out to be decisive as the Soviet sets forth to meet the rising expectations of Asia, Africa, the Middle East and even Latin America with a program of health, development and Communism.”73
US programs of health, development, and capitalism were already well underway. In fact, North American advisers had been involved in developing “human capital” in Peruvian and Bolivian “development” projects since the war, when US advisers were prompted to study and recommend economic development programs that continued to influence government initiatives throughout the 1940s and 1950s.74 The United States was the region’s primary export market, and US public and private capital was the largest source of foreign investment in Peru and Bolivia at the time, particularly in the mining industry—an industry of critical importance as a supplier of material to the US military.75 The US Office of Inter-American Affairs spearheaded US involvement, pursuing Andean development through defining and institutionalizing “public health.” Public health campaigns were launched specifically to prioritize and transform “critical economic sectors.”76 Maintaining healthy mine workers and promoting market-oriented rather than subsistence-based agriculture was the focus of US initiatives. The American representative from the International Cooperation Agency (predecessor to the US Agency for International Development [AID]), John J. Bloomfield, who helped establish national public health programs in the 1940s in both Peru and Bolivia, focused these efforts on promoting “occupational” health in the major export industries.77 Health and the economy then were becoming mutually constitutive categories of modernization and progress.
Following the coca commodity circuit back into the Andes provides a window onto the capitalist values that structured modernization and development schemes and the way in which these visions targeted indigenous peoples—much like the coca leaf itself or the prisoner-patients at Lexington—approaching them as raw material in need of transformation to generate greater value. The logic of coca leaf control was embedded in debates about the land, life, labor, and consumption habits of Andean Indians. While governments deployed law enforcement to establish a line between licit and illicit cocaine, regulating the coca leaf would prove more complicated. In many ways this reflected a much longer colonial and imperial history. Historian Kenneth Lehman has argued that the twin exploitation of “silver and Indians” drove Spanish colonial policy in the Andes and continued to characterize post-independence governments’ structural exclusion of Indians from “national life.”78 This marked only the beginnings of an ongoing imperial encounter where local European-descended elites joined with foreign and, by the mid-twentieth century, primarily US-based interests to approach the native population as “vital resources of revenue.”79
The UN focus on controlling the coca leaf (or “supply side” of the market) ensured that this aspect of international drug control constituted an intervention in local conflicts over the terms of national economic development and, in particular, indigenous peoples’ envisioned contribution to the larger society. Centuries-long debate had swirled around Indian coca leaf consumption that, until these UN initiatives, was accepted as a necessary, if vexing, aspect of assuring indigenous labor capacity.80 Neither national nor international elites particularly worried about transgressing Indian cultural traditions, yet there was considerable interest in maintaining control over the labor force by acknowledging both customary usage and the leaf’s central role in the wage-labor economy. As late as 1940, the Bolivian government decreed that coca was “an article of prime necessity” and ordered its compulsory sale in mining and railway companies.81 The UN Commission of Enquiry into the Coca Leaf’s report in 1951 (see the previous chapter), and the national and international experts with whom it consulted, successfully shifted the regulatory landscape to identify coca leaf chewing as an obstacle to national development.
Arguments for coca eradication stigmatized Indian practices while proselytizing a model of “civilization” based on liberal visions of land ownership, hard work, and consumer capitalism.82 The attack on coca entailed the transformation of individual habits, as well as a general restructuring of the national economy (to secure the export market and stabilize the region for foreign investment).83 Coca was a source of government revenue and a linchpin of the informal market. As noted earlier, Gutiérrez-Noriega estimated the “coca leaf [was] the single most important item of commerce in the Andes.”84 The UN commission supported these claims, finding that “except in some cattle markets, business is on a small scale and generally limited to the exchange of products between the Indians. An exception is coca leaf; it is, as a rule, paid for in cash. In such markets coca leaf is sold by the Indian who grows his own crop.”85 Coca eradication thus entailed the radical transformation of the domestic cash economy including the elimination of many people’s primary medium of exchange, subsistence, and access to money, outside of the wage-labor sector. As one of an array of international development missions warned in 1951, “it must be constantly remembered that from one-half to two thirds of [Bolivia’s] people still live practically outside the money economy on a more or less self-sustaining basis.”86 The drug control agenda called for the eradication of coca leaf chewing as central to the process of dispossession required to create a wage labor–based consumer economy in the Andes.
An array of development schemes sought to tackle the issue. One such example was the ambitious “Andean Indian Project” coordinated by the “Expanded Program of Technical Assistance of the UN and the specialized agencies.” Beginning in 1952 this broad effort brought scientific “experts” to Bolivia, Peru, and Ecuador, “to raise the health, nutrition, housing, education, working and social standards of the altiplano people and to integrate them into the social and economic life of their countries.”87 A number of dramatic initiatives were launched, including crop replacement and expansive resettlement programs designed to relocate Indians from their traditional lands in the high Andes to lower, more tropical regions where the conditions for large-scale commercial agriculture were imagined to pertain. As Enrique Sánchez de Lozada, the head of the International Labor Organization program, pointed out, these experts believed that solving the agrarian problem must take the Indian into account who “by the very weight of his number is the most important factor in the economy.”88
These programs fused visions of eradicating communities’ cultural habits with effecting modernization, “integrating” the native into national society, and coca was central to these concerns. Cornell University anthropologist Allan Holmberg, an important figure in experimental modernization schemes, articulated the ways in which coca in the Andes was fundamentally a marker of “Indianness”: “A person who speaks an Indian language, wears homespun dress, and chews coca will be classed as Indian. If the same person speaks Spanish, wears Western dress and does not chew coca, he may be classed—depending on other characteristics such as family name, occupation, education, and health—as either mestizo or white.” Interestingly he went on to explain, “In a biological sense, at least, Peru has no racial problem. Its so-called ‘racial’ problem is largely a cultural one.”89