It was for me a seminal moment: visual, tangible evidence of the ultimate convergence of left and right, an object lesson in the virtues of ideological moderation, and my good-bye, if there ever was a hello, to political romanticism. It set my own political course toward philosophical skepticism and political tolerance. That didn’t mean splitting differences or straddling some ideological midpoint. It meant viewing certainty with suspicion and acknowledging, with both regret and resolve, the imperfectibility of man, the fallibility of institutions and the tragic—rather than redemptive—nature of history.
When my senior year was up, I went on to pour myself into political philosophy at Oxford. It is there that I discovered my muse for this prudential view of the possibilities of politics: John Stuart Mill, whose On Liberty is the foundational document of classical liberalism. To this day it is the first thing I recommend to any young person seeking political guidance, to be followed by Isaiah Berlin’s Four Essays on Liberty, perhaps the finest, and surely the most accessible, 20th-century elaboration of that tradition.
I say Mill’s liberalism, invoking the 19th-century understanding of the word. Today that political orientation is the essence of conservatism. In the 20th century, liberalism became more ambitious. It outgrew the classical individualism of the Millian tradition and fell in thrall to the romantic progressivism of the age. Mill held that truth emerges from an unfettered competition of ideas and that individual character is most improved when allowed to find its own way uncoerced. That vision was insufficient for 20th-century American liberalism. Mill lacked sweep and glory. Modern liberalism’s perfectionist ambitions—reflected in its progenitor (and current euphemism), progressivism—seek to harness the power of government, the mystique of science and the rule of experts to shape both society and citizen and bring them both, willing or not, to a higher state of being.
It was natural, therefore, that I would in time turn to conservatism. But that proved to be years away, largely because I then abruptly left the field and abandoned my studies. Mill had his crisis of conscience at 20. I had mine at 21. I had become increasingly uncomfortable with a life of theory. I began to regard it as something of an intellectual self-indulgence, increasingly divorced from reality and from duty. One morning that August, I called the registrar at Harvard Medical School, where I’d previously been accepted, to ask for immediate admission. Quite by chance, a student had just dropped out—a week before the beginning of the term. If I was there by Monday, the spot was mine.
I didn’t pack. I left immediately to take up what I was sure would be my life’s work: the practical, the real, the indisputably worthy work of medicine. There I spent the next seven years, a nice biblical sum, four as a medical student, three as a psychiatric resident at Massachusetts General Hospital.
Medicine had not been my first choice. I had long preferred the graceful lines of physics to the ragged edges of biology. But at 16, I’d come to the realization that I didn’t have what it took to do important work in theoretical physics, namely genius. A friend in my Special Relativity class, who had come to the same awful conclusion about his own intellectual capacities, told me that he was changing his major because he didn’t want to end up testing steel for General Motors. (The great physicist Max Planck did once tell John Maynard Keynes that he had thought of studying economics but decided it was too difficult. That I take as an exercise in either humor or good manners.)
Not thirsting to test steel, I chose medicine. I have no regrets. It was challenging and enlarging. I absorbed more knowledge in those seven years than at any other time in my life. It did turn out, though, that I wasn’t quite the perfect fit for psychiatry either. In my first year at Mass General, I discovered that all freshman residents were required to attend a weekly group-therapy session. Seeing this as pointless, I refused. This did not go over well with management. Around week seven, I was called into the department chairman’s office. He asked me why I had not been attending.
“Because I came here to give therapy, not get it,” I said.
“You’re just in denial,” he protested loudly.
“Of course I am, sir. It’s the best defense mechanism ever invented. Why, I’m a master of denial. I should be a professor—I could give a course in denial.”
I was enjoying the riff, but the chief was not. He cut me short: Attend the sessions or leave the program. Having few marketable skills and fewer prospects, I attended. But for the remaining 20 or so weeks, I said nary a word in the group. I was occasionally asked why. “I’m in denial,” I explained.
Don’t get me wrong. This was something of an aberration. I had gone to Mass General because it was the most biologically (and least psychoanalytically) oriented of all the psychiatric programs in the Boston area. Remarkably and brilliantly so. Their pharmacological treatments were superb. They pioneered a new method of (unilateral) electric shock therapy that was astonishingly successful in curing the most profound and recalcitrant depressions. And they even used hypnosis, which I learned to administer to in-patient burn victims in order to block the excruciating pain that accompanies bandage changes—thus sparing them the need for potentially addictive narcotics.
It was a noble life. I remain forever respectful of the work carried on by my colleagues and grateful for my own seven years there. Medicine—and particularly hospital medicine, which lives in a sea of human suffering—has a way of beating callowness out of even the most self-possessed youth. Its other invaluable residue is best described by Arthur Conan Doyle, himself a physician: “The moral training to keep a confidence inviolate, to act promptly on a sudden call, to keep your head in critical moments, to be kind yet strong—where can you, outside medicine, get such a training as that?”
Nonetheless, my life as a doctor felt constricted. It was conducted entirely within the four walls of the hospital. Outside, there was a history unfolding to which I felt impelled to contribute—but saw no way to. How to get from here to there? Having moved from New York to Montreal when I was five (prudently bringing my parents with me) and having not resided again in the United States until medical school, I had few contacts outside Boston, and none in politics or journalism.
And then, pure blind luck. With only a few months left in my residency, a professor with whom I’d written a couple of papers on manic-depressive disease was appointed out of the blue by President Carter to head a newly created federal superagency overseeing the National Institute of Mental Health (NIMH). At my suggestion, he took me to Washington to help shape national research on a new diagnostic system designed to move psychiatry from its anecdotal Freudian origins to a more empirical and scientific foundation.
It was my “Go West, young man” moment. Although I knew no one in Washington, I figured it was there that I would find my way back to my long-dormant interest in public affairs.
This is no place for an extended biography, so I’ll be brief about the unfolding. While working at the NIMH, I began writing articles for The New Republic and was then offered a speechwriting job for Vice President Walter Mondale. It was spring 1980. Six months of that and our side lost, badly. Such was my political career. At which point, The New Republic offered me a job as a writer and editor. I began my new life in journalism on the day Ronald Reagan was sworn in, January 20, 1981.
Some of the articles in this book are from those early years at The New Republic. Others are from the monthly back-page essay I began writing for Time magazine in 1983. Most of the pieces in this book, however, are from the weekly column I have been writing for the Washington Post since 1984.
These quite fantastic twists and turns have given me a profound respect for serendipity and a rather skeptical view about how much one really is master of one’s fate. A long-forgotten, utterly trivial student council fight brought me to journalism. A moment of adolescent angst led to the impulsive decision to quit political studies and enroll in medical school. A decade later, a random presidential appointment having nothing to do with me brought me to a place where my writing and public career could begin. When a young journalist asks me today, “How do I get to be a nationally syndicated columnist?” I have my answer: “First, go to medical school.”
These turns in my career path were a matter of fate. But there was one other transformation that was not, and it needs a word of explanation. It was deliberate, gradual and self-driven: my ideological evolution.
I’m often asked: “How do you go from Walter Mondale to Fox News?” To which the short answer is: “I was young once.” The long answer begins by noting that this is hardly a novel passage. The path is well trodden, most famously by Ronald Reagan, himself once a New Deal Democrat, and more recently by a generation of neoconservatives, led by Irving Kristol and Norman Podhoretz. Every story has its idiosyncrasies. These are mine.
I’d been a lifelong Democrat, and in my youth a Great Society liberal. But I had always identified with the party’s Cold War liberals, uncompromising Truman-Kennedy anti-communists led by the likes of Henry Jackson, Hubert Humphrey and Pat Moynihan. Given my social-democratic political orientation, it was natural for me to work for Democrats, handing out leaflets for Henry Jackson in the 1976 Massachusetts primary (Jackson won; I handed out a lot of leaflets) and working for Mondale four years later.
After Reagan took office in 1981, however, Democratic foreign policy changed dramatically. Some, of course, had begun their slide toward isolationism years earlier with George McGovern’s “Come Home, America” campaign. But the responsibility of governance imposes discipline. When the Soviets provocatively moved intermediate-range nuclear forces (INF) into Eastern Europe, President Carter and German chancellor Helmut Schmidt got NATO to approve the counterdeployment of American INFs in Western Europe.
However, as soon as they lost power in 1981, the Democrats did an about-face. They fell in thrall to the “nuclear freeze,” an idea of unmatched strategic vacuity, which would have canceled the American INF deployment while freezing the Soviet force in place. The result would have been a major strategic setback, undermining the nuclear guarantee that underwrote the NATO alliance.
Years later, leading European social democrats repented their youthful part in the anti-nuclear movement of the early ’80s. But the Democratic Party never did. It went even further left. It reflexively opposed every element of the Reagan foreign policy that ultimately brought total victory in the Cold War: the defense buildup, the resistance to Soviet gains in Central America and the blunt “evil empire” rhetoric that gave hope and succor to dissidents in the gulag. Democrats denounced such talk as provocative and naïve—the pronouncements of “an amiable dunce,” to quote Clark Clifford’s famous phrase disdaining Reagan.
And most relevant now, Democrats became implacable foes of missile defense, in large part because the idea originated with Reagan. The resistance was militant and nearly theological. It lasted 30 years—until, well, today, when a Democratic administration, facing North Korean nuclear threats, frantically puts in place (on Guam, in Alaska, in California and off the Korean coast) the few missile-defense systems that had survived decades of Democratic opposition and defunding.
I wrote most of the New Republic editorials opposing the Democratic Party’s foreign policy of retreat, drawing fierce resistance from and occasioning public debate with my more traditionally liberal TNR colleagues. My attack on the nuclear freeze, announced the publisher rather ruefully at the next editorial meeting, produced more canceled subscriptions than any other article in the magazine’s history. At that time, I still saw myself as trying to save the soul of the Democratic Party, which to me meant keeping alive the activist anti-communist tradition of Truman and Kennedy. But few other Democrats followed. By the mid-1980s, Humphrey and Jackson were dead and Moynihan had declined to pick up their mantle. The Cold War contingent of the Democratic Party essentially disappeared. As someone who had never had any illusions about either communism or Soviet power, I gave up on the Democrats.
On foreign policy, as the cliché goes, I didn’t leave the Democratic Party. It left me.
Not so on domestic policy. The Democratic Party remained true to itself. I changed. The origin of that evolution is simple: I’m open to empirical evidence. The results of the Great Society experiments started coming in and began showing that, for all its good intentions, the War on Poverty was causing irreparable damage to the very communities it was designed to help. Charles Murray’s Losing Ground was one turning point. Another, more theoretical but equally powerful, was Mancur Olson’s The Rise and Decline of Nations, which opened my eyes to the inexorable “institutional sclerosis” that corrodes and corrupts the ever-enlarging welfare state. The ’80s and ’90s saw the further accumulation of a vast body of social science evidence—produced by two generations of critics, from James Q. Wilson to Heather Mac Donald, writing in The Public Interest, City Journal and elsewhere—on the limits and failures of the ever-expanding Leviathan state.
As I became convinced of the practical and theoretical defects of the social-democratic tendencies of my youth, it was but a short distance to a philosophy of restrained, free-market governance that gave more space and place to the individual and to the civil society that stands between citizen and state. In a kind of full-circle return, I found my eventual political home in a vision of limited government that, while providing for the helpless, is committed above all to guaranteeing individual liberty and the pursuit of one’s own Millian “ends of life.”
Such has been my trajectory. Given my checkered past, I’ve offered this brief personal history for those interested in what forces, internal and external, led me to change direction both vocationally and ideologically. I’ve elaborated it here because I believe that while everyone has the right to change views, one does at least owe others an explanation. The above is mine. This book represents the product of that journey.
III. A WORD ON ORGANIZATION AND METHOD
The body of this book is made up of newspaper columns and shorter magazine pieces grouped by theme in 16 chapters. I have included, however, five longer essays on subjects of enough complexity to have required more extensive treatment. The first of these is on the ethics of embryonic research, originally published by the President’s Council on Bioethics, of which I was then a member. It is included as the last entry in chapter 9, “Body and Soul.” The second essay is a meditation on Jewish destiny, first published in the late 1990s and included here at the end of chapter 12, “The Jewish Question, Again.” And finally, the book’s last chapter, “Three Essays on America and the World,” is based on three speeches on U.S. foreign policy delivered over two decades about the structure and demands of the post–Cold War international system.
Every article in this book, long or short, is reproduced in its original form, with three exceptions. First, I have rewritten some of the headlines. Generally speaking, columnists and essayists don’t write their own headlines. Many times have I been dismayed by the editor’s ultimate choice, which was often dictated by the space requirements of the laid-out page.
That constraint doesn’t exist in a book. I am finally released from the tyranny of the one- or two-column head. It’s my chance to get it right. I do, however, include the location and exact date of publication of each article, to enable those readers enjoying far too much leisure time to consult the original headline if they wish.
Second, I altered some punctuation and usage, largely for the sake of uniformity. The different publications in which I appear follow different style books. (For example, regarding serial commas: whether you write a, b, and c rather than a, b and c.) My syndicated columns adhere to the Associated Press style book. The version appearing in the Washington Post follows the Post’s own style. Same with Time. As for the others, God knows what they use. To keep the chaos to a minimum, I have tried to impose a consistency in style and usage. This has the welcome subsidiary benefit of allowing me to exercise my petty punctuational prejudices, most importantly, my war on commas. They are a pestilence. They must be stopped. This book is a continuation of that campaign.
Finally, I have corrected typographical errors and on rare occasions edited a line or two of text for reasons of obscurity, redundancy or obsolescence, the last encompassing references so historically obscure today as to have otherwise required cluttering up with explanatory footnotes. Altogether, these are perhaps a dozen or two. They change no meaning.
The rest, alas, remains untouched. It stands as it was on the day it was first published: imperfect, unimproved, unapologetically mine.
Washington, D.C., August 12, 2013
CHAPTER 1
THE GOOD AND THE GREAT
MARCEL, MY BROTHER
PLACE: Los Angeles area emergency room.
TIME: Various times over the last 18 years.
SCENE: White male, around 50, brought in by ambulance, pale, short of breath, in distress.
Intern: You’re going to be all right, sir. I’m replacing your fluids, and your blood studies and electrolytes should be back from the lab in just a few minutes.
Patient: Son, you wait for my electrolytes to come back and I’ll be dead in 10 minutes. I ran the ICU here for 10 years. I’m pan-hypopit and in (circulatory) shock. I need 300 mg of hydrocortisone right now. In a bolus. RIGHT NOW. After that, I’ll tell you what to run into my IV, and what lab tests to run. Got it?
Intern: Yes sir.
This scene played itself out at least half a dozen times. The patient was my brother Marcel. He’d later call to regale me with the whole play-by-play, punctuated with innumerable, incredulous can-you-believe-its. We laughed. I loved hearing that mixture of pride and defiance in his voice as he told me how he had yet again thought and talked his way past death.
Amazingly, he always got it right. True, he was a brilliant doctor, a UCLA professor of medicine and a pulmonologist of unusual skill. But these diagnostic feats were performed lying flat on his back, near delirious and on the edge of circulatory collapse. Marcel instantly knew why. It was his cancer returning—the rare tumor he’d been carrying since 1988—suddenly popping up in some new life-threatening anatomical location. By the time he got to the ER and was looking up at the raw young intern, he’d figured out where it was and what to do.