Had I Known


by Barbara Ehrenreich


  Like his own wife, John Howes was an atheist or, as they more likely put it at the time, a freethinker. He, too, had been raised as a Catholic—on a farm in Ontario—and he, too, had had a dramatic, though somehow less glorious, falling out with the local clergy. According to legend, he once abused his position as an altar boy by urinating, covertly of course, in the holy water. This so enhanced his enjoyment of the Easter communion service that he could not resist letting a few friends in on the secret. Soon the priest found out and young John was defrocked as an altar boy and condemned to eternal damnation.

  The full weight of this transgression hit a few years later, when he became engaged to a local woman. The priest refused to marry them and forbade the young woman to marry John anywhere, on pain of excommunication. There was nothing to do but head west for the Rockies, but not before settling his score with the church. According to legend, John’s last act in Ontario was to drag the priest down from his pulpit and slug him, with his brother, presumably, holding the scandalized congregation at bay.

  I have often wondered whether my great-grandfather was caught up in the radicalism of Butte in its heyday: whether he was an admirer of Joe Hill, Big Bill Haywood, or Mary “Mother” Jones, all of whom passed through Butte to agitate, and generally left with the Pinkertons on their tails. But the record is silent on this point. All I know is one last story about him, which was told often enough to have the ring of another “traditional value.”

  According to my father, John Howes worked on and off in the mines after his children were grown, eventually saving enough to buy a small plot of land and retire to farming. This was his dream, anyway, and a powerful one it must have been for a man who had spent so much of his life underground in the dark.

  Far be it from me to interpret this gesture for my great-grandfather, whom I knew only as a whiskery, sweat-smelling, but straight-backed old man in his eighties. Perhaps he was enacting his own uncompromising version of Christian virtue, even atoning a little for his youthful offenses to the faithful. But at another level I like to think that this was one more gesture of defiance of the mine owners who doled out their own dollars so grudgingly—a way of saying, perhaps, that whatever they had to offer, he didn’t really need all that much.

  So these were the values, sanctified by tradition and family loyalty, that I brought with me to adulthood. Through much of my growing-up, I thought of them as some mutant strain of Americanism, an idiosyncrasy that seemed to grow rarer as we clambered into the middle class. Only in the sixties did I begin to learn that my family’s militant skepticism and oddball rebelliousness were part of a much larger stream of American dissent. I discovered feminism, the antiwar movement, the civil rights movement. I learned that millions of Americans, before me and around me, were “smart” enough, in my father’s terms, to have asked “Why?”—and, beyond that, the far more radical question, “Why not?”

  These are also the values I brought into the Reagan-Bush era, when all the dangers I had been alerted to as a child were suddenly realized. The “phonies” came to power on the strength, aptly enough, of a professional actor’s finest performance. The “dumb” were being led and abetted by low-life preachers and intellectuals with expensively squandered educations. And the rich, as my father predicted, used the occasion to dip deep into the wallets of the desperate and the distracted.

  It’s been hard times for a traditionalist of my persuasion. Long-standing moral values—usually claimed as “Judeo-Christian” but actually of much broader lineage—were summarily tossed, along with most familiar forms of logic. We were told, at one time or another, by the president or his henchpersons, that trees cause pollution, that welfare causes poverty, and that a bomber designed for mass destruction may be aptly named the Peacemaker. “Terrorism” replaced missing children to become our national bugaboo and—simultaneously—one of our most potent instruments of foreign policy. At home, the poor and the middle class were shaken down, and their loose change funneled blithely upward to the already overfed.

  Greed, the ancient lubricant of commerce, was declared a wholesome stimulant. Nancy Reagan observed the deep recession of ’82 and ’83 by redecorating the White House, and continued with this Marie Antoinette theme while advising the underprivileged, the alienated, and the addicted to “say no.” Young people, mindful of their elders’ Wall Street capers, abandoned the study of useful things for investment banking and other occupations derived, ultimately, from three-card monte. While the poor donned plastic outerwear and cardboard coverings, the affluent ran nearly naked through the streets, working off power meals of goat cheese, walnut oil, and crème fraîche.

  Religion, which even I had hoped would provide a calming influence and reminder of mortal folly, decided to join the fun. In an upsurge of piety, millions of Americans threw their souls and their savings into evangelical empires designed on the principle of pyramid scams. Even the sleazy downfall of our telemessiahs—caught masturbating in the company of $10 prostitutes or fornicating in their Christian theme parks—did not discourage the faithful. The unhappily pregnant were mobbed as “baby-killers”; sexual nonconformists—gay and lesbian—were denounced as “child molesters”; atheists found themselves lumped with “satanists,” communists, and consumers of human flesh.

  Yet somehow, despite it all, a trickle of dissent continued. There were homeless people who refused to be shelved in mental hospitals for the crime of poverty, strikers who refused to join the celebration of unions in faraway countries and scabs at home, women who insisted that their lives be valued above those of accidental embryos, parents who packed up their babies and marched for peace, students who protested the ongoing inversion of normal, nursery-school-level values in the name of a more habitable world.

  I am proud to add my voice to all these. For dissent is also a “traditional value,” and in a republic founded by revolution, a more deeply native one than smug-faced conservatism can ever be. Feminism was practically invented here, and ought to be regarded as one of our proudest exports to the world. Likewise, it tickles my sense of patriotism that insurgents in developing nations have often borrowed the ideas of our own civil rights movement. And in what ought to be a source of shame to some and pride to others, our history of labor struggle is one of the hardest-fought and bloodiest in the world.

  No matter that patriotism is too often the refuge of scoundrels. Dissent, rebellion, and all-around hell-raising remain the true duty of patriots.

  The Cult of Busyness

  New York Times, 1985

  Not too long ago a former friend and soon-to-be acquaintance called me up to tell me how busy she was. A major report, upon which her professional future depended, was due in three days; her secretary was on strike; her housekeeper had fallen into the hands of the Immigration Department; she had two hours to prepare a dinner party for eight; and she was late for her time-management class. Stress was taking its toll, she told me: Her children resented the fact that she sometimes got their names mixed up, and she had taken to abusing white wine.

  All this put me at a distinct disadvantage, since the only thing I was doing at the time was holding the phone with one hand and attempting to touch the opposite toe with the other hand, a pastime that I had perfected during previous telephone monologues. Not that I’m not busy, too: as I listened to her, I was on the alert for the moment the dryer would shut itself off and I would have to rush to fold the clothes before they settled into a mass of incorrigible wrinkles. But if I mentioned this little deadline of mine, she might think I wasn’t busy enough to need a housekeeper, so I just kept on patiently saying “Hmm” until she got to her parting line: “Look, this isn’t a good time for me to talk, I’ve got to go now.”

  I don’t know when the cult of conspicuous busyness began, but it has swept up almost all the upwardly mobile, professional women I know. Already, it is getting hard to recall the days when, for example, “Let’s have lunch” meant something other than “I’ve got more important things to do than talk to you right now.” There was even a time when people used to get together without the excuse of needing something to eat—when, in fact, it was considered rude to talk with your mouth full. In the old days, hardly anybody had an appointment book, and when people wanted to know what the day held in store for them, they consulted a horoscope.

  It’s not only women, of course; for both sexes, busyness has become an important insignia of upper-middle-class status. Nobody, these days, admits to having a hobby, although two or more careers—say, neurosurgery and an art dealership—are not uncommon, and I am sure we will soon be hearing more about the tribulations of the four-paycheck couple. Even those who can manage only one occupation at a time would be embarrassed to be caught doing only one thing at a time. Those young men who jog with their headsets on are not, as you might innocently guess, rocking out, but are absorbing the principles of international finance law or a lecture on one-minute management. Even eating, I read recently, is giving way to “grazing”—the conscious ingestion of unidentified foods while drafting a legal brief, cajoling a client on the phone, and, in ambitious cases, doing calf-toning exercises under the desk.

  But for women, there’s more at stake than conforming to another upscale standard. If you want to attract men, for example, it no longer helps to be a bimbo with time on your hands. Upscale young men seem to go for the kind of woman who plays with a full deck of credit cards, who won’t cry when she’s knocked to the ground while trying to board the six o’clock Delta shuttle, and whose schedule doesn’t allow for a sexual encounter lasting more than twelve minutes. Then there is the economic reality: Any woman who doesn’t want to wind up a case study in the feminization of poverty has to be successful at something more demanding than fingernail maintenance or come-hither looks. Hence all the bustle, my busy friends would explain—they want to succeed.

  But if success is the goal, it seems clear to me that the fast track is headed the wrong way. Think of the people who are genuinely successful—pathbreaking scientists, best-selling novelists, and designers of major new software. They are not, on the whole, the kind of people who keep glancing shiftily at their watches or making small lists titled “To Do.” On the contrary, many of these people appear to be in a daze, like the distinguished professor I once had who, in the middle of a lecture on electron spin, became so fascinated by the dispersion properties of chalk dust that he could not go on. These truly successful people are childlike, easily distractible, fey sorts, whose usual demeanor resembles that of a recently fed hobo on a warm summer evening.

  The secret of the truly successful, I believe, is that they learned very early in life how not to be busy. They saw through that adage, repeated to me so often in childhood, that anything worth doing is worth doing well. The truth is, many things are worth doing only in the most slovenly, half-hearted fashion possible, and many other things are not worth doing at all. Balancing a checkbook, for example. For some reason, in our culture, this dreary exercise is regarded as the supreme test of personal maturity, business acumen, and the ability to cope with math anxiety. Yet it is a form of busyness which is exceeded in futility only by going to the additional trouble of computerizing one’s checking account—and that, in turn, is only slightly less silly than taking the time to discuss, with anyone, what brand of personal computer one owns, or is thinking of buying, or has heard of others using.

  If the truly successful manage never to be busy, it is also true that many of the busiest people will never be successful. I know this firsthand from my experience, many years ago, as a waitress. Any executive who thinks the ultimate in busyness consists of having two important phone calls on hold and a major deadline in twenty minutes should try facing six tablefuls of clients simultaneously demanding that you give them their checks, fresh coffee, a baby seat, and a warm, spontaneous smile. Even when she’s not busy, a waitress has to look busy—refilling the salt shakers and polishing all the chrome in sight—but the only reward is the minimum wage and any change that gets left on the tables. Much the same is true of other high-stress jobs, like working as a telephone operator, or doing data entry on one of the new machines that monitors your speed as you work: “Success” means surviving the shift.

  Although busyness does not lead to success, I am willing to believe that success—especially when visited on the unprepared—can cause busyness. Anyone who has invented a better mousetrap, or the contemporary equivalent, can expect to be harassed by strangers demanding that you read their unpublished manuscripts or undergo the humiliation of public speaking, usually on remote Midwestern campuses. But if it is true that success leads to more busyness and less time for worthwhile activities—like talking (and listening) to friends, reading novels, or putting in some volunteer time for a good cause—then who needs it? It would be sad to have come so far—or at least to have run so hard—only to lose each other.

  Death of a Yuppie Dream

  Journal der Rosa Luxemburg Stiftung, 2013

  With John Ehrenreich

  Every would-be populist in American politics purports to defend the “middle class,” although there is no agreement on what it is. Just in the last couple of years, the “middle class” has variously been defined as everybody; as everybody minus the 15 percent living below the federal poverty level; or as everybody minus the very richest Americans. Mitt Romney famously excluded “those in the low end” but included himself (2010 income $21.6 million) along with “80 to 90 percent” of Americans. The Department of Commerce has given up on income-based definitions, announcing in a 2010 report that “middle class families” are defined “by their aspirations more than their income.… Middle class families aspire to home ownership, a car, college education for their children, health and retirement security and occasional family vacations”—which excludes almost no one.

  Class itself is a muddled concept, perhaps especially in America, where any allusion to the different interests of different occupational and income groups is likely to attract the charge of “class warfare.” If class requires some sort of “consciousness,” or capacity for concerted action, then a “middle class” conceived of as a sort of default class—what you are left with after you subtract the rich and the poor—is not very interesting.

  But there is another, potentially more productive, interpretation of what has been going on in the mid-income range. In 1977, we first proposed the existence of a “professional-managerial class,” distinct from the “working class,” from the “old” middle class of small business owners, and from the wealthy class of owners.

  The Origins of the Professional-Managerial Class

  The notion of the “PMC” was an effort to explain the largely “middle-class” roots of the New Left in the sixties and the tensions that were emerging between that group and the old working class in the seventies, culminating in the political backlash that led to the election of Reagan. The right embraced a caricature of this notion of a “new class,” proposing that college-educated professionals—especially lawyers, professors, journalists, and artists—make up a power-hungry “liberal elite” bent on imposing its version of socialism on everyone else.

  The PMC grew rapidly. From 1870 to 1910 alone, while the whole population of the United States increased two-and-one-third times and the old middle class of business entrepreneurs and independent professionals doubled, the number of people in what could be seen as PMC jobs grew almost eightfold. And in the years that followed, that growth only accelerated. Although a variety of practical and theoretical obstacles keep us from making any precise analysis, we estimate that as late as 1930, people in PMC occupations still made up less than 1 percent of total employment. By 1972, about 24 percent of American jobs were in PMC occupations. By 1983 the number had risen to 28 percent and by 2006, just before the Great Recession, to 35 percent.

  The relationship between the emerging PMC and the traditional working class was, from the start, riven with tensions. It was the occupational role of managers and engineers, along with many other professionals, to manage, regulate, and control the life of the working class. They designed the division of labor and the machines that controlled workers’ minute-by-minute existence on the factory floor, manipulated their desire for commodities and their opinions, socialized their children, and even mediated their relationship with their own bodies.

  At the same time, though, the role of the PMC as “rationalizers” of society often placed them in direct conflict with the capitalist class. Like the workers, the PMC were themselves employees and subordinate to the owners, but since what was truly “rational” in the productive process was not always identical to what was most immediately profitable, the PMC often sought autonomy and freedom from their own bosses.

  By the mid-twentieth century, jobs for the PMC were proliferating. Public education was expanding, the modern university came into being, local governments expanded in size and role, charitable agencies merged, newspaper circulation soared, traditional forms of recreation gave way to the popular culture, entertainment, and sports industries, etc.—and all of these developments created jobs for highly educated professionals, including journalists, social workers, professors, doctors, lawyers, and “entertainers” (artists and writers, among others).

  Some of these occupations managed to retain a measure of autonomy and, with it, the possibility of opposition to business domination. The so-called “liberal professions,” particularly medicine and law, remained largely outside the corporate framework until well past the middle of the twentieth century. Most doctors, many nurses, and the majority of lawyers worked in independent (private) practices.

 
