Taking the Stand


by Alan Dershowitz


  Whatever the reasons, the upshot is that

  there has always existed a widespread series of practices, involving significant restraints on human liberty, without an articulated jurisprudence circumscribing and limiting its application. People are confined to prevent predicted harms without any systematic effort to decide what kinds of harms warrant preventive confinement; or what degree of likelihood should be required; or what duration of preventive confinement should be permitted; or what relationship should exist between the harm, the likelihood, or the duration.26

  Finally, I contrasted the absence of a jurisprudence regulating the prevention of future misconduct to the existing jurisprudence regulating the punishment of past misconduct.

  This is not to say that there currently exists a completely satisfactory jurisprudence or theory justifying the imposition of punishment for past acts. But at least many of the right questions have been asked and some interesting answers have been attempted. Even Blackstone’s primitive statement “that it is better that ten guilty persons escape, than that one innocent suffer” tells us something important about how to devise rules of evidence and procedure. There is no comparable aphorism for preventive confinement: is it better for X number of “false positives” to be erroneously confined (and for how long?) than for Y number of preventable harms (and of what kind?) to occur? What relationship between X and Y does justice require? We have not even begun to ask these kinds of questions, or to develop modes of analysis for answering them.27

  Since the time I wrote these words, “the preventive state” has expanded in scope, especially with regard to the prevention of terrorism. Many constitutionally and morally dubious measures—from Guantanamo, to waterboarding, to massive electronic snooping—have been justified in the name of preventing terrorist attacks. Yet we have still not come up with a satisfactory jurisprudence that appropriately balances the legitimate concerns of government in preventing terrorism against the compelling need to preserve the rule of law.

  Between 1968 and 1976, I wrote more than two dozen scholarly articles on these issues, becoming the first academic to focus in a systematic way on the preventive aspect of law. My work earned me a distinguished fellowship at the Center for Advanced Study in the Behavioral Sciences at Stanford University, a Guggenheim Fellowship, Ford Foundation and Fulbright travel grants, and several other scholarly recognitions. More important, it helped to change the laws governing the civil commitment of the mentally ill and the pretrial detention of defendants believed to be dangerous.

  My law review articles tended to be quite technical and written in legalese. But about a decade into my teaching career, I began to write for a more general audience. The New York Times asked me to contribute columns to the Week in Review section. In these columns, I analyzed Supreme Court decisions and other legal developments. I soon discovered that I had a knack for legal writing that was accessible to the general public. I recalled Professor Calabresi’s criticism of my earliest and unsuccessful efforts at writing: “You write like you’re having a conversation with your friends in Brooklyn.” I decided to turn that disadvantage as a law student into an advantage as a law professor: I would write for the general public the way I talked to my Brooklyn friends, who were smart as hell but not legally educated. It worked. I quickly became the “go-to” professor when it came to explaining technical legal issues in ways that were understandable to the general public.

  Soon I was writing book reviews and articles for popular magazines, as well as law review articles. I loved writing, and wrote every single day. In that respect, I was emulating one of my mentors, Professor Alexander Bickel, who had told me that he never left his home until he had filled his daily quota of “three thousand good words.” I set myself a similar goal. I wrote everything by hand on legal pads, since I never learned to type. (“Typing is for girls,” my mother told me, as she typed all my term papers.)

  Within a few years of beginning to teach at Harvard, I had become one of the best known law professors in the country. My classes were oversubscribed, my scholarly articles were highly regarded, and my role as a public intellectual, though controversial among some faculty members, was gratifying. But there was something missing from my professional life. I was not yet a “real lawyer.”

  When I was offered the job at Harvard at age twenty-four, I knew that I was qualified to teach theoretical subjects, but I worried about my lack of real-world experience, since I had never practiced law. (One summer at a law firm between the second and third year at Yale does not a practitioner make.) My Brooklyn upbringing gave me a practical bent of mind—“street smarts”—but I craved some actual experience. I looked for opportunities to become involved in cases that would provide a smooth transition from theory to practice. I wanted to become a lawyer in addition to being a law professor.

  Deciding to become a real lawyer was not without controversy within the Harvard Law School faculty. One of my colleagues approached me, upon hearing that I had taken on a case, and said, “At Harvard Law School, we don’t have clients.” His voice dripped with contempt as he uttered the word “clients.” Most elite law schools prided themselves on teaching theory. To be sure, some of my colleagues came from a background of practice, but they had not begun to practice while teaching at the law school, although the school rules explicitly permitted it within certain limitations. Since I came to teaching directly from clerkships, I desperately felt the need for courtroom experience. But it would be untrue for me to suggest that this was my only reason for taking cases. I was dissatisfied with being only a teacher and a writer of articles. I wanted to make legal history, rather than just write about others who did. Like Justice Goldberg, I too was a man of action who could never be a complete person sitting behind a desk or standing in front of a classroom. I needed to be in the courtroom. I needed to be a real lawyer. I needed to know what it felt like to have flesh and blood clients whose lives would be affected by the law, rather than merely reading about abstract cases with faceless names. I needed to know what it felt like to win a case—and to lose one.

  I didn’t quite feel as Oliver Wendell Holmes did after a brief career as a Harvard law professor, that “academic life is but half life … a withdrawal from the fight in order to utter smart things that cost you nothing except for thinking of them from a cloister.” But I did feel that my life would be more complete and fulfilling if it included some cutting-edge litigation.

  In the beginning nearly all my cases were pro bono—without fee—and many were in association with the American Civil Liberties Union. They involved First Amendment challenges to censorship and Eighth Amendment challenges to the death penalty. Because these issues were within my area of academic expertise, the cases provided a smooth transition from theory to practice. In the next chapters I recount some of these cases and others that followed.

  THE CHANGING SOUND OF FREEDOM OF SPEECH

  From the Pentagon Papers to WikiLeaks

  5

  THE EVOLUTION OF THE FIRST AMENDMENT

  New Meanings for Cherished Words

  I always wanted to be a First Amendment lawyer. Everything in my upbringing led me to the defense of freedom of speech. I was always a dissident—though the authorities when I was growing up used the less polite terms “troublemaker” and “bondit.” I argued with everyone. I defended other troublemakers. I questioned everything and everybody. I rarely exercised my Fifth Amendment right to “remain silent.” For me, the freedom to speak, to write, to dissent, to seek a redress of grievances, to assemble, to doubt, to challenge, has always been central not only to democratic governance but to life itself. It is both a means and an end.

  The First Amendment has always been my favorite part of the Constitution, not because it is first—in its original, proposed form, it was the Third Amendment1—but because without its protection, all other rights are in danger.

  Not everyone agrees. Actor Charlton Heston once claimed that

  the Second Amendment is, in order of importance, the first amendment. It is America’s First Freedom, the one right that protects all the others. Among freedom of speech, of the press, of religion, of assembly, of redress of grievances, it is the first among equals. It alone offers the absolute capacity to live without fear. The right to keep and bear arms is the one right that allows “rights” to exist at all.2

  Both history and geography have proved Heston dead wrong: Nearly every other freedom-loving country has restrictions on gun ownership, while none has severe restrictions on expression. Experience has shown that liberty can thrive without the right to bear arms, but not without freedom of speech. The movement from weapons to words has marked the progress of civilization. As Sigmund Freud once put it: “The first human who hurled an insult instead of a stone was the founder of civilization.” That’s why hurling insults, not stones, is protected by our First Amendment.

  The stirring words of the First Amendment—“Congress shall make no law … abridging the freedom of speech or of the press”—haven’t been altered between my first case defending freedom of expression in the 1960s and my most recent ones, but the meaning of these iconic words has undergone dramatic transformation over the past half century. The major reason has been the rapid change in the manner by which speech is transmitted. Technology has altered the sound and look of freedom of expression.

  When I was a law clerk, carbon paper was the means by which a written message could be sent to a handful of readers. We were required to circulate memoranda to all the justices, and we did so by typing them and using carbon paper to produce nine “flimsies,” as we referred to the carbon copies. Even in the 1970s, when I first traveled to the Soviet Union, the dissidents and refuseniks asked me to bring carbon paper so they could make multiple copies of their banned samizdat literature. The Pentagon Papers were reproduced by hand on primitive copying machines. Today, with faxes and the Internet, the click of a button sends WikiLeaks disclosures around the globe in an instant.

  Over the past fifty years I have defended every means, manner, and mode of expression—films, plays, books, magazines, newspapers, graffiti, photographs, leaflets, pamphlets, megaphones, wall postings, websites, Internet postings, speeches, heckling, cartoons, faxes, composites, noises, threats, incitements, videos, ads, prayers, classes, live and filmed nudity (frontal, sideal, backal), defamation, blasphemy, and digital communication (by which I mean a raised middle finger).

  I have defended neo-Nazi and racist speech, Stalinist rhetoric, anti-Israel hate speech, soft-core erotica, hard-core pornography, nude photographs of children, and disgusting videos of bestiality. I have defended the rights of major newspapers and book publishers, as well as of anonymous and not-so-anonymous bloggers, tweeters, website operators, and whistle-blowers, to disclose classified information, state secrets, and other material the government would prefer to keep under wraps.

  I have represented people I love, people I hate, and people I don’t give a damn about—good guys, bad guys, and everything in between. H. L. Mencken used to bemoan the reality that

  the trouble about fighting for human freedom is that you have to spend much of your life defending sons of bitches: for oppressive laws are always aimed at them originally, and oppression must be stopped in the beginning if it is to be stopped at all.3

  I have criticized nations I admire—such as the United States, Canada, Israel, Great Britain, Italy, and France—for censoring people I despise. And I have defended people I despise for attacking nations, institutions, and individuals I admire.

  In each instance, I’ve stood up for an important principle: the right of the individual, rather than the government, to decide what to say, what to show, what to hear, what to see, what to teach, what to learn. I have opposed the power of the state (and other state-like institutions) to censor, punish, chill, or impose costs on the exercise of the freedom of expression—even, perhaps especially, expression that I disagree with and despise, or believe may be hateful, hurtful, or even dangerous.

  I have myself been the victim of outrageously false defamations,4 and I have been falsely accused of defaming others. I have been informally charged with inciting war crimes5 and formally charged with criminally defaming a judge6—to which I plead not guilty! I have defended the right of my enemies to lie about me, to heckle me, to boycott me, and even to try to get me fired. While defending the right of my opponents to say nearly anything they want, I have insisted on my own right to criticize, condemn, and vilify them for the wrongness of what they have chosen to say. Freedom of expression includes the right to be wrong, but it does not include the right to be immune from verbal counterattack.

  I am not a free speech absolutist when it comes to the First Amendment—at least not in theory. But in practice I nearly always side with the freedom to speak, rather than the power to censor.

  It’s not that I always trust the citizenry; it’s that I never trust the government. It’s not that I believe the exercise of the freedom of speech will always bring about good results; it’s that I believe that the exercise of the power to censor will almost always bring about bad results. It’s not that I believe the free marketplace of ideas will always produce truth; it’s that I believe that the shutting down of that marketplace by government will prevent the possibility of truth.7

  My family and educational background—especially my constant arguments with rabbis, teachers, neighbors, and friends—made me into a skeptic about everything. I am certain that certainty is the enemy of truth, freedom, and progress. Hobbes has been proved wrong by the verdict of history for including among the “rights of sovereigns” the power to censor “all books before they are published” that are “averse” to “the truth” or not conducive to peace.8

  I realize that I will never know “the truth.” Neither will anyone else. All I can do is doubt, challenge, question, and keep open the channels of knowledge, the flow of information, and the right to change my mind and the minds of others. To me, truth is not a noun; it is an active verb, as in “truthing” (or knowing, learning, or experiencing).9

  My favorite characters in the Bible and in literature are those who challenge authority: Adam and Eve defying God and eating the forbidden fruit of knowledge; Abraham chastising God for threatening to sweep away the innocent along with the guilty; Moses imploring God to change his mind about destroying the “stiff-necked” Jewish people; Jesus arguing with the Pharisees and defying Roman authority; Don Quixote tilting at windmills; Ivan Karamazov challenging conventional wisdom; and the child who shouted “The Emperor has no clothes.”

  My favorite justices of the Supreme Court are the dissenters. My favorite historical figures are political and religious dissidents. My closest friends are iconoclasts. Some of my best teachers were fired.

  The First Amendment would have been nothing more than a parchment promise had it not been given life by brave political dissidents and bold judicial dissenters. Because of these provocateurs, the First Amendment has not become ossified with age. It has changed with the times, sometimes for the better, sometimes for the worse.

  Although the literal words have remained the same for more than two centuries, two of the most important ones have been changed beyond recognition. These words are “Congress” and “no.” (“Congress shall make no law.…”) The controversial role of these words can best be illustrated by a story, perhaps apocryphal but reflecting reality, about two great and contentious justices, Hugo Black, who claimed to be an absolutist and literalist when it came to the words of the First Amendment, and Felix Frankfurter, who advocated a more functional balancing approach (a “sound” view) despite the seemingly clear words of that amendment. In a case involving censorship by a state, Black pulled out his ragged copy of the Constitution and read the First Amendment out loud to the lawyer representing the state. “It says Congress shall make no law abridging the freedom of speech.” He banged the table as he shouted and repeated the word “no.” “What don’t you understand about the word ‘no’?” he asked rhetorically. Justice Frankfurter interrupted and said, “You’re reading the words wrong.” The lawyer looked startled as the justice explained. “It doesn’t say ‘Congress shall make no law.’ It says, ‘Congress shall make no law.’ ” And he banged the table as he shouted and repeated the word “Congress.” He then continued, “This law wasn’t passed by Congress, it was passed by a state legislature. What don’t you understand about the word ‘Congress’?” he asked, mocking his fellow justice.

  By emphasizing different words, the justices were giving different meanings to the same language of the First Amendment.

  The reality is that both these words—“Congress” and “no”—have been excised over time. The first—“Congress”—was central to the history of the Bill of Rights, which was seen by its framers largely as a bill of restrictions on the power of the national legislature—namely “Congress.” There was considerable concern that the Constitution, which replaced the Articles of Confederation, bestowed too much power on the national legislature, thus reducing the rights (really the powers) of the states to legislate for their citizens.10 The First Amendment was not intended by its framers to impose restrictions on the states. In fact, when the Bill of Rights was enacted, and for years thereafter, several states had laws severely abridging freedom of speech and the press.11 (Some states also had officially established churches and legally discriminated against Catholics, Jews, Turks, and “other” heathens.)12 If the framers had wanted to impose restrictions on the states, they could have written a more general declaration protecting the right of free speech from abridgment by any government. For example: “The freedom of speech shall not be abridged by Congress or by the states.” Indeed, many scholars13 and judges believe that this was accomplished three-quarters of a century later by the Fourteenth Amendment, which provides in relevant part:

 
