by Sarah Jaffe
The first universities in what would become the United States were elite institutions, religious in nature, but the United States’ real contribution to higher education was the state university system. Beginning in North Carolina and Georgia in the 1780s, the state-funded institution helped to make higher education accessible to a broader swath of the country. City College of New York, now the CUNY system in which Katherine Wilson adjuncts, was founded in 1847 to educate, tuition-free, the children of the modest classes. Then the Morrill Act of 1862 created the “land grant” universities, paid for through the sale of public land granted to the states to fund “Colleges for the Benefit of Agriculture and the Mechanic Arts.” The sixty-nine schools funded through this act include the Massachusetts Institute of Technology, Cornell University, and the University of Wisconsin at Madison. (This was land, it’s important to note, that was seized from Indigenous nations and sold at a profit—a reminder once again that the new universities were never intended for everyone to access.) The American-style research university was a new kind of institution, better funded than universities had been and producing work, particularly in the fields of science and technology, that became a draw for scholars from around the world. Privately funded institutions and state-backed ones competed for students and research accolades. By the 1920s, the proportion of students in higher education in the States was five times higher than in Europe. Of course, those schools were still racially segregated—separate was certainly not equal—and women made up a much smaller proportion of the student body than men.9
Higher education was slowly becoming a path to upward mobility for a small but growing fraction of the working class. The post–World War II period brought more and more students into colleges and universities, across Europe and particularly in the United States, thanks to the act commonly known as the GI Bill, which provided military veterans with college funding. But despite the bill’s facially race-neutral language, in practice Black veterans were excluded, often formally rejected or forced into vocational programs rather than universities. Historically Black colleges and universities (HBCUs), which would have happily taken on more students and where Black faculty were welcomed, were underfunded and could not accommodate all of the would-be attendees, leaving many more out in the cold.10
The Cold War brought new funding into universities as the United States and the Soviet Union competed for scientific (and thus military) superiority. States and the federal government both committed substantial funds to higher education, including student loans and direct university subsidies, and most students attended public institutions. But as the university expanded and became less elite, its professors began to lose status. Institutions and the students in them were ranked in terms of prestige, and that prestige would largely define working conditions. Still, upward mobility through the university into what Barbara and John Ehrenreich dubbed the “professional-managerial class” (PMC) was a fact of twentieth-century life, and more and more people wanted in.11
The PMC, according to the Ehrenreichs, consisted of those service and management professionals whose jobs required some schooling and gave them some degree of power, usually over those further down the class ladder than themselves, and who retained some degree of autonomy on the job. Teachers, doctors, journalists, social workers, and of course college professors were part of the class. As opposed to those in the “managerial” part of the PMC, the professionals mostly considered themselves outside of the battle for profits and saw their work as having intrinsic social value. “Educational work,” the Ehrenreichs wrote, “was highly labor intensive, and there was no obvious way, at the time, to automate or streamline student-teacher interaction and make universities a profitable undertaking.” Perhaps because of their status as a temporary respite from profit-seeking, universities began to be a home for dissent and rebellion, as well as agitation for the university itself to open up further to those long excluded.12
Faculty fought for tenure protections, in particular, to preserve their job security and academic freedom. Tenure has been much lampooned in the years of right-wing budget-cutting and culture-war mania as a protection for “lazy” professors, a caricature like that lobbed at public school teachers, but it allows a modicum of independent thought in the university. Through the 1950s, Stanley Aronowitz wrote, most faculty existed on year-to-year contracts, keeping them toeing the line. The American Association of University Professors (AAUP) agitated for tenure not to protect the radicals but to make everyone’s job more secure; nonetheless, tenure has always been particularly valuable to academia’s rebels. “Well into the 1960s, for example, the number of public Marxists, open gays, blacks, and women with secure mainstream academic jobs could be counted on ten fingers,” Aronowitz archly noted. “The liberal Lionel Trilling was a year-to-year lecturer at Columbia for a decade, not only because he had been a radical but because he was a Jew. The not-so-hidden secret of English departments in the first half of the twentieth century was their genteel anti-Semitism.” Yet tenure still did not protect the radicals from the pressures that the job itself placed on them, the conformity encouraged by academia’s own traditions of peer review, and the hoops to be jumped through while on the tenure track itself.13
The AAUP’s definition of academic freedom, so precious to the university professor, holds up professionalism—the judgment of one’s peers, in essence—as the standard to which academics should be held. An expansion on the Humboldtian concept, the AAUP’s statement on the subject, dating back to 1940, “maintains that a professor’s research and teaching should be free from outside interference as long as he or she abides by the academy’s professional standards of integrity, impartiality, and relevance,” though as scholar Ellen Schrecker noted, those protections were less regularly applied to what a professor did off campus—meaning they could still be fired for political activities or speech. But theoretically at least, a professor was supposed to be free to teach and research what she liked, as long as she upheld her duties to the university—which meant committees, peer review, and a variety of governance duties that professors complained about but nevertheless valued as signs that it was they who ran the university.14
The public university, accessible to broad swaths of the working classes, reached its heights in California and in New York, in the system where Katherine Wilson still teaches. The CUNY system was considered the “proletariat’s Harvard” in its heyday; children of immigrants with dreams of scholarship and middle-class life, those who didn’t make it into the Ivies, moved through its halls. It was also, from 1969 onward, fully unionized, with faculty, graduate students, and staff all members of the Professional Staff Congress (PSC). The year after the union was founded, CUNY gave in to pressure from Black and Puerto Rican student organizers and formally opened up to all New York City high school graduates who wanted to attend. “By combining an open admissions policy with free tuition, CUNY broke new ground in democratizing access to higher education in the United States,” wrote CUNY professors Ashley Dawson and Penny Lewis. “And in 1973, after voting to strike, CUNY faculty and staff won their first contract.” The University of California system, too, was free; its Master Plan (enshrined in 1960) committed to educate anyone who wanted to be educated, though the burgeoning New Right took aim at this ideal nearly as soon as it was written into law. One of Ronald Reagan’s campaign aides, as he ran for governor, laid out the stakes clearly: “We are in danger of producing an educated proletariat. That’s dynamite! We have to be selective on who we allow to go through higher education.”15
In 1975 the right was able to strike back against CUNY. New York City’s fiscal crisis—one of the turning points of the decade—marked a shift away from funding public goods that were accessible to the working class and toward the neoliberal politics we now know today. As the infamous newspaper cover had it, President Gerald Ford had told the city to “Drop Dead,” leaving New York to fill its budget holes however it could, meaning deep austerity for public services and a turn to “business-friendly” policies. Tuition at CUNY was one of the first things to be instituted—just a few brief years after the system had truly been opened up to the working class. Bondholders had to be paid off; students, meanwhile, would start taking out loans of their own, or, more likely for many of them, skip higher education altogether. The faculty union fought to keep its protections but could not stave off the institution of tuition, nor stop the firing of hundreds of young professors, only recently brought on to handle the expansion.16
In a way the Reagan aide was right: the rebellions of the 1960s and early 1970s, which had helped to create the open admissions period of CUNY’s history, and had shaken up many other college campuses as well, had in part emanated from a newly educated stratum of society no longer content to simply move into professional-status jobs. Their idea of changing the world was different from that of the Progressive Era reformers: they wanted revolution and they wanted it now. Angela Davis became one of the early targets of the counterrevolution when Reagan sought to have her fired from her position at the University of California, Los Angeles. Davis had a PhD and a stellar record, but was a Communist and associated with the Black Panthers, and Reagan was able to chase her out, academic freedom be damned. The university had been a target of the McCarthy-era witch hunts, but by the 1970s it had become easier to strip it of funds than to try to get individual professors fired one by one. Reaganism was tested out on the Cal system, as Aaron Bady and Mike Konczal wrote at Dissent: “The first ‘bums’ he threw off welfare were California university students.”17
Margaret Thatcher too took aim at British professors. In what one researcher called “one of the most dramatic systemic changes in the terms of academic appointments,” in 1988 the Thatcher government eliminated tenure for university faculty. Ostensibly, this was to reduce distinctions between the traditional—and traditionally prestigious—universities and newer institutions, and to introduce “accountability” for faculty, which, as it does for other teaching staff, tends to mean “making them easier to fire.” The argument was the same as it is everywhere that the elimination of job security is debated: that “deadwood” tenured faculty who weren’t up to internationally competitive standards should be cleared away to save money. An otherwise Thatcher-supporting professor from the London School of Economics argued to reporters at the time that eliminating tenure would “make British universities into something very second rate,” and that the reforms would direct money to profitable programs while hacking away at the liberal arts. It was not the last time that refrain would be heard.18
The right had taken the analysis of the professional-managerial class from leftists like the Ehrenreichs and twisted it into a form useful for attacking the university. The right in the United States, the Ehrenreichs wrote, railed against “a caricature of this notion of a ‘new class,’ proposing that college-educated professionals—especially lawyers, professors, journalists, and artists—make up a power-hungry ‘liberal elite’ bent on imposing its version of socialism on everyone else.” That the people doing the excoriating were, in fact, members of this class themselves was perhaps lost on them, but it is a reminder that just because one wants to call a group of vaguely similar people one opposes a “class” doesn’t make it so. Classes, we recall, are composed, and as neoliberalism hit, the PMC was beginning, in fact, to be decomposed.19
While the managerial side of the PMC was doing better than ever—executive pay headed back upward in the late 1970s and kept going up—the professions were undergoing a very different process, one in which job security and pay rates were falling, and their treasured autonomy disappearing. Academia was at the very heart of this transformation. After all, education was the very thing that made one into a member of the PMC in the first place, as Barbara Ehrenreich noted, which made the university a central location of these changes as it trained the doctors, lawyers, social workers, and professors of the future. The academic profession itself, like many others, was becoming polarized into a handful of stars at the top and a vast academic proletariat at the bottom, made up of people like Katherine Wilson, cobbling together a living if they could, and feeling a sense of shame at not having achieved the career they’d aimed for. The middle class—a better term than “PMC”—as Ehrenreich wrote in Fear of Falling, was still “located well below the ultimate elite of wealth and power.” Further, she wrote, “Its only ‘capital’ is knowledge and skill, or at least the credentials imputing skill and knowledge. And unlike real capital, these cannot be hoarded against hard times.” A PhD might have been a symbol of so-called human capital, but its value could not be guaranteed.20
Just as the vaunted “knowledge economy” was making headlines, in other words, the labor of knowledge workers was being devalued and deskilled. Doctors became more likely to work for large institutions, lawyers to work in massive firms or in-house at corporations. We started to hear more about “stress” and mental health on the job than about physical injury. Until the aftermath of World War II, the term “stress” was rarely used to describe something that happened to humans; researchers, though, began to apply the term to the wear and tear on the human body caused by, among other things, psychological strain on the job. By the 2000s, stress had overtaken physical ailments as a cause of absence from work. Like “burnout,” we can understand this concept as a side effect of the cracks in the labor-of-love myth. Fewer of us may be getting physically injured on the job, but more of us are struggling with the emotional toll of work.21
Professional workers were becoming subject to the controls of capital, and yet, as more and more people made it through higher education, the demand for credentials only grew. More and more universities were opened across the world, and the percentages of school-age cohorts attending them exploded, from under 10 percent in 1960 to around 50 percent in many countries by the twenty-first century. Something like 3.5 million professors taught over 80 million students worldwide by 2000. Yet their working conditions were, in many ways, getting worse.22
For one thing, even as access appeared to be expanding, a degree was also becoming more expensive. The cost of a degree in the United States spiked between 1987 and 2007, from less than $3,000 a year at public universities and less than $7,000 at private ones to nearly $13,000 a year for public and nearly $35,000 for private. Since then, and in the wake of the global financial crisis and austerity, those numbers have ballooned again—by nearly 25 percent. In the United Kingdom, university fees were reintroduced in 1998, and have risen since. Yet that money was not going to pay more qualified professors better salaries; instead, teaching faculty were facing cuts. Complaints of lower quality at the universities were used as justification for public budget cuts, firing professors, and raising tuition. Universities competed for a few prestigious faculty members, offering not just excellent pay but lowered teaching loads, an ability to focus on research, and the opportunity to mentor graduate students who might enhance their own reputations. Meanwhile, the teaching load removed from the fancier professors fell on the shoulders of those same graduate students, adjuncts, or junior professors scrambling for the tenure track. The resulting competition meant that research requirements were going up even as fewer people were given the kinds of job supports that would allow them to do that research.23
The Humboldtian ideal of the university professor has always been a combination of two related but distinct forms of work: part of it in front of a classroom, part of it hidden away in the lab or the office with a stack of books. To Aronowitz, who enjoyed both of these parts of the job, the two parts were complementary. “I am one of a shrinking minority of the professoriat who have what may be the last good job in America,” he wrote. “Except for the requirement that I teach or preside at one or two classes and seminars a week and direct at least five dissertations at a time, I pretty much control my paid work time.… I work hard but it’s mostly self-directed. I don’t experience ‘leisure’ as time out of work because the lines are blurred.” He described “writing days” when he composed articles, read, and worked on longer book projects, as well as fundraising time, student-advising time, and exams for grad students. Academic labor, he noted, bled, for professors like him, into everything; anything he read might make it into the classroom or into a piece of writing. For Aronowitz, teaching was a genuine pleasure; for many others, it’s simply a distraction from the research they’d prefer to be doing. One might be a good teacher and a brilliant lab scientist; it’s certainly possible, but nothing about the one suggests, necessarily, the other.24
The splintering of the academic workforce into tiers suggests that these two parts of the job have in fact come apart. The adjuncts and the full-time faculty, noted Amy Higer, a part-time lecturer and union activist at Rutgers University, have a symbiotic relationship: full-time professors often don’t want to be in the classroom. “Some of them like teaching, but I would say most of them don’t. And it’s a research institution; that’s fine. I love to teach. This is what I wanted to do with my PhD.” The problem, to her, was not the split workload so much as the devaluation of the part of the work that she did—the feminine-gendered work of teaching. Adjuncts are paid per class for their teaching and given no support at all for their research. For Katherine Wilson, research was something she’d hoped to do, and she found herself stymied by the demands of adjunct work. She agreed with Higer that the research part was often seen as the higher-level part of the job, teaching the lesser.25
The “last good job in America” (or in England, or France) is now reserved for a few: all over the world, academics face the spread of part-time positions and the loss of autonomy and power. Increasing enrollment has not been matched by increased full-time staffing, and salaries have stagnated as class sizes have grown. While European universities still offer more security than many US institutions, the situation of part-time faculty in the Americas (Latin America, too, has a long history of so-called taxicab professors, part-timers with little attachment to their institutions) is a bellwether for the rest of the world. By 1999, an estimated one-fifth to one-half of European countries’ academic staff were “nonpermanent.” In the United States between 1975 and 2003, according to the AAUP, “full-time tenured and tenure-track faculty members fell from 57 percent of the nation’s teaching staffs to 35 percent, with an actual loss of some two thousand tenured positions.” Professors don’t always have to be laid off; attrition does a lot of the work as tenured professors retire, and their jobs are filled by temporary staff. Meanwhile, much of the expansion of college access has been at community colleges, where even if tenure exists, the job is nothing like that of a professor at a top-tier research university.26