Work Won't Love You Back

by Sarah Jaffe


  Many writers trace the history of the intern back to the apprentice of old, and specifically to the apprentices within the guild system. Apprenticeship, a tradition by which practitioners of a craft were to pass on their trade, dates back centuries, perhaps all the way back to the Code of Hammurabi. Apprentices were to learn by doing, getting hands-on practice at a craft they could expect to spend the rest of their lives performing. But before the idea of “art” was separated from skilled crafts, craftspeople handed down what they knew in a process quite different from today’s internships.7

  The apprenticeship of the precapitalist guild years was intensive; apprentices spent years learning their trade at the master’s side, often becoming, in a way, a part of the family. The master was obligated to provide room and board, clothing, and other such things in the place of parents. Wrote sociologist Alexandre Frenette, “Apprentices were expected to obey their master much as they would a parent, providing valuable labour as well as loyalty and child-like love.” Beyond simply a trade, the master was to pass on invaluable life advice and morality, and to maintain a lasting relationship grounded in a contract. Formalized in England in 1563 with the Statute of Artificers, the apprenticeship system had set rules for the obligations of master to apprentice as well as apprentice to master, and a set period of time (seven years). But that doesn’t mean the system wasn’t varied and rife with abuses.8

  Adam Smith was a notable critic of apprenticeships, considering them a restriction on the freedom of workers. He argued for a wage for the apprentice rather than payment in lodging and meals, writing that “the sweets of labour consist altogether in the recompense of labour.” And indeed, the system declined as the Industrial Revolution spread and wage labor became common. In the colonies, British law at first held sway and governed apprenticeships, but the fledgling country had a few factors that militated against the growth of a strong guild system. The myth of American independence was strong, but so was the promise of supposedly unsettled land, open for those who would rather try to make their own way than stay put and learn from a master. (As long as they didn’t mind displacing Native people from it.) Apprentices often skipped out on their indentures and lit out for the frontier. And so the rise of chattel slavery solved the problem of workers who could escape work; it helped level the playing field among white workers while condemning kidnapped African people to the undesirable labor.9

  In Canada, too, apprentices voted with their feet and left their positions. In Montreal, where centuries later interns would hit the streets on strike, the turning point was in the early 1800s, when the flight of the apprentices, combined with the growth of larger-scale manufacturing and the shift to cash wages, led to a precipitous decline in the system. The spread of public education and higher education, too, contributed to the decline of the individualized apprenticeship, and preparation for the workforce—if any was necessary—shifted form.10

  As reformers fought to ban child labor, and the family wage became common, reform movements instilled new ideas about young people and learning. Adolescence, they argued, was a special time of life, set aside from childhood or adulthood, and young people’s work should be a sideline, a summer or after-school job, something to be managed around their real work of getting an education. The apprenticeship system continued for some skilled trades, and continues to this day, but it was no longer the prevalent form of job training.11

  In 1862, the Morrill Land-Grant Act was passed in the United States to fund colleges for more practical education, in fields like agriculture and trades. The cooperative system (which persists in a few colleges today) was also created as a way for students to alternate classroom education with practical learning on the job, to formalize training in fields such as architecture. These programs, too, in some ways shaped our modern-day internships.12

  But the “internship” as such was actually born in the medical field. According to Ross Perlin, author of Intern Nation, young medical students were “interned (in the sense of confined)” within the walls of a hospital, “enduring a year or two of purgatory before entering the profession.” Before the 1900s, doctors, too, undertook apprenticeships, but as the profession formalized, young doctors began to go through a more standardized learning process, from medical school to internships, where they could get hands-on experience while remaining under the supervision of more practiced physicians. The American Medical Association’s Council on Medical Education recommended a yearlong internship after medical school in 1904; by 1914, the vast majority of medical students were interning. Critics, Perlin noted, “were soon accusing hospitals (as many still do today) of squeezing exhausting, cheap labor from young medical graduates.”13

  The medical internship expanded into what we now know as the “residency,” an extended period of years, and one that still denotes lengthy working hours, “scut work,” and little power on the job. American medical interns and residents, some of whom are members of the Service Employees International Union’s Committee of Interns and Residents, have fought to reduce their workweek to eighty hours and to trim back twenty-eight-hour shifts to a mere sixteen. Yet the interns, in particular, still face arguments that what they are doing isn’t really work but part of their education. They also hear the familiar argument that their demands for shorter hours or rest breaks shortchange patients, that they should put their needs on the back burner to care for those in their charge—despite studies that have repeatedly shown the deleterious effects of long hours on a physician’s quality of work. Although medical interns and residents are paid, their salaries are a fraction of what a full-fledged doctor makes—they’re closer in pay to the hospital’s cleaners than to the attending physicians. (The United States’ privatized health-care system is uniquely demanding of residents; in Europe, residents work closer to forty-eight hours a week.) There’s plenty of hope labor in this part of a doctor’s career, as they rationalize “paying their dues” while walking past the doctors’ luxury cars in the parking lot on their way to another sixteen-hour shift.14

  The internship began to trickle into other fields by the 1920s, with university professors advocating the practice and professional journals in a variety of fields, such as accounting and the burgeoning marketing industry, calling for students to take it up. White-collar professions seized on the internship as a badge of class status, to be differentiated from the apprenticeship, which was for manual workers. But it was in politics that the internship really took off. Programs were launched by city and state governments in the 1930s to bring ambitious young people in to learn about public service. For several years during the New Deal era, the National Institute of Public Affairs—a nonpartisan, nongovernmental organization—ran a yearlong, unpaid internship program (eventually taken over by the Civil Service Commission) designed to bring new talent into civil service.15

  The model of unpaid on-the-job learning made a certain kind of sense in politics, where the spirit of public service was supposed to draw people into the work. In practice, though, requiring unpaid work meant that only young people with a certain level of access and income could take advantage of the opportunity. After World War II, the US Congress changed shape significantly, with lawmakers hiring a growing number of staffers—and alongside them, the pools of interns who still today do much of the work on Capitol Hill. By the 1950s internship programs had spread across the country, but there was much variety among them: some were paid, some unpaid, lengths of time and coordination with educational institutions differed, and of course there was much disparity in the quality of the work carried out by the interns. In other words, the conditions under which today’s interns work, where it’s often a roll of the dice whether would-be learners find themselves scrubbing shelves and fetching coffee or collaborating closely with prestigious staffers, were taking shape.16

  At about this time, the US Supreme Court handed down a ruling that would shape the future of interns for decades to come. Walling v. Portland Terminal Co., a 1947 case, established guidelines under which trainees could be considered exempt from legal protections for workers (including, notably, a minimum wage), guidelines that held for many years. The original case focused on railyard workers who undertook a two-week training program provided by the Portland Terminal Company. “In such a context, creating an exemption [under the Fair Labor Standards Act] for trainees must have seemed like a reasonable proposition: a way of encouraging firms to provide vocational training for future employees without having to pay them like regular employees,” Perlin noted. The decision laid out the criteria under which an employee could be considered a trainee and therefore ineligible for labor protections: the work had to be a practical training program, where the trainees benefited from the experience and did not replace any regular employees; the trainees could not be guaranteed a job after their training, and should not expect wages; and perhaps most importantly, the training could not “expedite the company business,” and might in fact get in the way of it. It is this last factor that has remained contentious over the years as successive administrations have changed their interpretation of how this ruling applies to interns. How far outside the normal run of business must training be in order to count as training and not simply unpaid work?17

  Internships continued to spread during the 1960s and 1970s, as Lyndon Johnson’s War on Poverty pumped money into work-based learning programs and young, politically involved people sought opportunities to put their values to work. Congress reorganized yet again, expanding the range of subcommittees and staffers, attracting a new wave of nonprofits, lobbyists, and others seeking to influence policy—and stocking up on ambitious, cheap young interns. The rate of college attendance was rising, and as the idealistic 1960s faded into the recessionary 1970s, young grads were looking for toeholds anywhere they could find them. Internships were a new way to differentiate oneself from the masses. The number of university-backed internship programs rose from two hundred in 1970 to one thousand in 1983.18

  But it was in the 1990s that the modern internship really took off, and it was also in the 1990s that the pushback against the spread of unpaid internships began. Architecture students, organizing with the American Institute of Architecture Students, began to protest the prevalence of unpaid internships in their field—one already known for its grueling educational programs. The organization lobbied other groups to condemn the practice too and managed to change the culture in the field in favor of paying interns. Unpaid interns also sued a prominent public relations firm after the company had gone so far as to explicitly bill its clients for the hours worked by employees it wasn’t paying. The interns won $31,520 in back wages.19

  But none of that stopped the spread of the unpaid internship. As the modern work ethic shifted and a job went from being a mere necessity—the main pleasure of which, as Adam Smith wrote, was the money—to something billed as the source of all fulfillment in life, it began to make a strange kind of sense that one had to earn one’s job. In fields like journalism, as the internship became more common, the likelihood that it would be paid did not—one study found that in 1976, 57 percent of TV and 81 percent of radio interns got paid at least something; by 1991, those numbers were down to 21 percent and 32 percent, respectively. The hollowing out of the middle of the job market that came with the disappearance of unionized industrial labor (to outsourcing or automation) meant that higher education, and, increasingly, personal connections, were necessary to compete for a smaller pool of better-compensated work with better conditions. And the stick to the carrot of the “dream job” was the also-expanding low-wage service economy all around. Internships like those at Disney World, where low-paid college students work twelve-hour shifts in a variety of service positions, from serving popcorn and cotton candy to cleaning up vomit on roller coasters, show the overlap. The difference between a Disney internship and a regular job is a Disney line on a résumé for one, and job security and decent pay for the other (most of the full-time Disney workers are represented by a union). To Disney, having a two-tier workforce is worth something like $20 million a year in savings.20

  Even when not literally serving food, interns remain subservient. Anyone who’s ever prepared coffee, scooped ice cream, or waited at a blank desk in a cubicle for someone to notice their unpaid presence knows the emotional labor of appearing grateful while doing the worst jobs. The internship advanced alongside other forms of contingent work, and alongside the idea that trading in security for enjoyable work was a deal worth making. Hope labor, everywhere you look.

  Interns are emblematic of what economist and author Guy Standing called “the precariat,” a class of workers that he argued are identifiable by their lack of security. The precariat, he wrote, does not map “neatly onto high-status professional or middle-status craft occupations.” Rather, it is a term for a set of working conditions that are becoming more and more common as the number of workers who have long-term security at work declines. Similar to the concept of hope labor is what Standing named “work-for-labor,” or the work that it is necessary to do in order to get paying work. In addition to forward-looking hope labor, work-for-labor includes “networking outside office hours, commuting or reading company or organisational reports ‘at home,’ ‘in the evening’ or ‘over the weekend.’” The internship is only one kind of work-for-labor, but it prepares the worker for a thousand other ways to go above and beyond.21

  Somewhere between 50 and 75 percent of four-year college students do at least one internship, according to researchers, though the lack of good data continues to be a problem. It is also true that those interns are often balancing unpaid work, schoolwork, and a paid job—something like half of all undergraduate students have paid work that averages twenty-five hours a week. And despite the myths, lower-income students are actually more likely to have the unpaid variety of internship, while higher-income students tend to have the kinds of personal networks that get them access to the best internships, too. One’s major in school is also a factor—as Camille Marcoux explained, for engineering and computer science students, in male-dominated fields that supposedly require more technical skill, internships are more likely to be paid. Education, the social sciences, and the arts are much less likely to have paid internships, and more likely to be filled with women.22

  This brings us to the factor that would motivate Marcoux and her fellow interns to get organized. Internships are extremely gendered. As Miya Tokumitsu wrote, “internships, insofar as they demand meekness, complicity, ceaseless demonstrations of gratefulness, and work for free or for very little pay, put workers in a feminized position, which, historically, has been one of disadvantage.” The intern is, as Malcolm Harris pointed out in Kids These Days, the inverse of what people mostly imagine the working class to be—the stereotypical midcentury worker of a million nostalgic fantasies is a hard-hatted man, probably white. The unpaid intern is likely a smiling, retiring young woman, eternally grateful just for the opportunity to show up.23

  When one is expected to perform gratitude every day on the job, it makes summoning the mindset necessary to organize for change that much harder. And so of course interns are the opposite of factory workers in one other way: they are extremely unlikely to have unions. Unions, after all, built power by making trouble, by refusing to work unless their demands were met. But when you’re already expecting to give away your services for free, how much harder is it to get to the point where you’ll raise a little hell to get your way?

  Women have always been the largest part of the contingent labor force. Part-time work itself was a gendered concept, designed for women like the shop clerks and retail workers of the nineteenth and early twentieth centuries, who supposedly took jobs to earn “pin money” rather than because they needed a real job. As more women moved into the workforce, the conditions long expected to accompany “women’s work” spread to more and more workers, and the internship is a key hinge point where those conditions enter workplaces that, formerly, were associated with a sheen of masculine prestige and privilege.24
  And the internship these days is more likely to be, as it was for Marcoux and her colleagues, mandatory, or at least highly encouraged. Universities serve as clearinghouses and recruitment spaces for unpaid positions; career centers steer students toward plum positions, and more and more majors require at least one internship in order to graduate. Some colleges even offer financial aid for unpaid interns, explicitly subsidizing the companies that take on their students. Meanwhile, students who do unpaid internships for college credit are often paying the university for the credits, literally paying in order to work. Internships, explained one professor who has researched the subject, are “a very cheap way to provide credits… cynically, a budget balance” for the universities that require or encourage them. In this way, the internship is connected to the corporatization of the university, which we’ll discuss in more depth in the following chapter.25

  As Camille Marcoux explained, because, in many places, unpaid interns are not considered employees under the law, they often fall into a legal black hole when it comes to various workplace abuses. Discrimination, sexual harassment? If you aren’t an employee, say goodbye to what little legal protection you might have to sue. When Bridget O’Connor was doing an unpaid internship at Rockland Psychiatric Center in New York, one of the doctors referred to her as “Miss Sexual Harassment.” The doctor also made other sexual comments, and other women who worked at the facility made similar reports of his conduct. Yet when O’Connor sued, her case was thrown out of court because she wasn’t an employee: federal law didn’t cover interns unless they received, according to a spokesperson for the US Equal Employment Opportunity Commission, “significant remuneration.”26

 
