Things That Matter: Three Decades of Passions, Pastimes and Politics

by Charles Krauthammer


  We might start by asking Sarah Palin to leave the room. I’ve got nothing against her. She’s a remarkable political talent. But there are no “death panels” in the Democratic health-care bills, and to say that there are is to debase the debate.

  We also have to tell the defenders of the notorious Section 1233 of H.R. 3200 that it is not quite as benign as they pretend. To offer government reimbursement to any doctor who gives end-of-life counseling—whether or not the patient asked for it—is to create an incentive for such a chat.

  What do you think such a chat would be like? Do you think the doctor will go on and on about the fantastic new million-dollar high-tech gizmo that can prolong the patient’s otherwise hopeless condition for another six months? Or do you think he’s going to talk about—as the bill specifically spells out—hospice care and palliative care and other ways of letting go of life?

  No, say the defenders. It’s just that we want the doctors to talk to you about putting in place a living will and other such instruments. Really? Then consider the actual efficacy of a living will. When you are old, infirm and lying in the ICU with pseudomonas pneumonia and deciding whether to (a) go through the long antibiotic treatment or (b) allow what used to be called “the old man’s friend” to take you away, the doctor will ask you at that time what you want for yourself—no matter what piece of paper you signed five years earlier.

  You are told constantly how very important it is to write your living will years in advance. But the relevant question is what you desire at the end—when facing death—not what you felt sometime in the past when you were hale and hearty and sitting in your lawyer’s office barely able to contemplate a life of pain and diminishment.

  Well, as pain and diminishment enter your life as you age, your calculations change and your tolerance for suffering increases. In the ICU, you might have a new way of looking at things.

  My own living will, which I have always considered more a literary than a legal document, basically says: “I’ve had some good innings, thank you. If I have anything so much as a hangnail, pull the plug.” I’ve never taken it terribly seriously because unless I’m comatose or demented, they’re going to ask me at the time whether or not I want to be resuscitated if I go into cardiac arrest. The paper I signed years ago will mean nothing.

  And if I’m totally out of it, my family will decide, with little or no reference to my living will. Why? I’ll give you an example. When my father was dying, my mother and brother and I had to decide how much treatment to pursue. What was a better way to ascertain my father’s wishes: what he checked off on a form one fine summer’s day years before being stricken; or what we, who had known him intimately for decades, thought he would want? The answer is obvious.

  Except for the demented orphan, the living will is quite beside the point. The one time it really is essential is if you think your fractious family will be only too happy to hasten your demise to get your money. That’s what the law is good at—protecting you from murder and theft. But that is a far cry from assuring a peaceful and willed death, which is what most people imagine living wills are about.

  So why get Medicare to pay the doctor to do the counseling? Because we know that if this white-coated authority whose chosen vocation is curing and healing is the one opening your mind to hospice and palliative care, we’ve nudged you ever so slightly toward letting go.

  It’s not an outrage. It’s surely not a death panel. But it is subtle pressure applied by society through your doctor. And when you include it in a health-care reform whose major objective is to bend the cost curve downward, you have to be a fool or a knave to deny that it’s intended to gently point the patient in a certain direction, toward the corner of the sickroom where stands a ghostly figure, scythe in hand, offering release.

  The Washington Post, August 21, 2009

  MASS MURDER, MEDICALIZED

  What a surprise—that someone who shouts “Allahu Akbar” (the “God is great” jihadist battle cry) as he is shooting up a room of American soldiers might have Islamist motives. It certainly was a surprise to the mainstream media, which spent the weekend after the Fort Hood massacre playing down Nidal Hasan’s religious beliefs.

  “I cringe that he’s a Muslim.… I think he’s probably just a nut case,” said Newsweek’s Evan Thomas. Some were more adamant. Time’s Joe Klein decried “odious attempts by Jewish extremists … to argue that the massacre perpetrated by Nidal Hasan was somehow a direct consequence of his Islamic beliefs.” While none could match Klein’s peculiar cherchez-le-juif motif, the popular story line was of an army psychiatrist driven over the edge by terrible stories he had heard from soldiers returning from Iraq and Afghanistan.

  They suffered. He listened. He snapped.

  Really? What about the doctors and nurses, the counselors and physical therapists at Walter Reed Army Medical Center who every day hear and live with the pain and the suffering of returning soldiers? How many of them then picked up a gun and shot 51 innocents?

  And what about civilian psychiatrists—not the Upper West Side therapist treating Woody Allen neurotics, but the thousands of doctors working with hospitalized psychotics—who every day hear not just tales but cries of the most excruciating anguish, of the most unimaginable torment? How many of those doctors commit mass murder?

  It’s been decades since I practiced psychiatry. Perhaps I missed the epidemic.

  But, of course, if the shooter is named Nidal Hasan, who National Public Radio reported had been trying to proselytize doctors and patients, then something must be found. Presto! Secondary post-traumatic stress disorder, a handy invention to allow one to ignore the obvious.

  And the perfect moral finesse. Medicalizing mass murder not only exonerates. It turns the murderer into a victim, indeed a sympathetic one. After all, secondary PTSD, for those who believe in it (you won’t find it in DSM-IV-TR, psychiatry’s Diagnostic and Statistical Manual), is known as “compassion fatigue.” The poor man—pushed over the edge by an excess of sensitivity.

  Have we totally lost our moral bearings? Nidal Hasan cold-bloodedly killed 13 innocent people. His business card had his name, his profession, his medical degrees and his occupational identity. U.S. Army? No. “SoA”—Soldier of Allah. In such cases, political correctness is not just an abomination. It’s a danger, clear and present.

  Consider the army’s treatment of Hasan’s previous behavior. NPR’s Daniel Zwerdling interviewed a Hasan colleague at Walter Reed about a hair-raising grand rounds that Hasan had apparently given. Grand rounds are the most serious academic event at a teaching hospital—attending physicians, residents and students gather for a lecture on an instructive case history or therapeutic finding.

  I’ve been to dozens of these. In fact, I gave one myself on post-traumatic retrograde amnesia—as you can see, these lectures are fairly technical. Not Hasan’s. His was an hour-long disquisition on what he called the Koranic view of military service, jihad and war. It included an allegedly authoritative elaboration of the punishments visited upon nonbelievers—consignment to hell, decapitation, having hot oil poured down your throat. This “really freaked a lot of doctors out,” reported NPR.

  Nor was this the only incident. “The psychiatrist,” reported Zwerdling, “said that he was the kind of guy who the staff actually stood around in the hallway saying: Do you think he’s a terrorist, or is he just weird?”

  Was anything done about this potential danger? Of course not. Who wants to be accused of Islamophobia and prejudice against a colleague’s religion?

  One must not speak of such things. Not even now. Not even after we know that Hasan was in communication with a notorious Yemen-based jihad propagandist. As late as Tuesday, the New York Times was running a story on how returning soldiers at Fort Hood had a high level of violence.

  What does such violence have to do with Hasan? He was not a returning soldier. And the soldiers who returned home and shot their wives or fellow soldiers didn’t cry “Allahu Akbar” as they squeezed the trigger.

  The delicacy about the religion in question—condescending, politically correct and deadly—is nothing new. A week after the first (1993) World Trade Center attack, the same New York Times ran the following front-page headline about the arrest of one Mohammed Salameh: “Jersey City Man Is Charged in Bombing of Trade Center.”

  Ah yes, those Jersey men—so resentful of New York, so prone to violence.

  The Washington Post, November 13, 2009

  THE DOUBLE TRAGEDY OF A STOLEN DEATH

  “Dying is easy. Parking is hard.” Art Buchwald’s little witticism nicely captured his chosen path to a good death: mocking it to the very end. There is great courage and dignity in that, which is why Buchwald’s extended good-bye (he died on Jan. 17) earned him such appreciation and admiration. But dying well is also a matter of luck. By unexpectedly living almost a full year after refusing dialysis for kidney failure, Buchwald won himself time to taunt the scythe.

  Timing is everything. When former congressman—and distinguished priest and liberal luminary—Robert Drinan died earlier this year, the Washington Post published a special appreciation. It ran together with a tribute to another notable who died just one day later: Barbaro. The horse got top billing.

  And does anyone remember when Mother Teresa died? The greatest saint of our time died on the frenzied eve of the funeral of the greatest diva of our time, Princess Di. In the popular mind, celebrity trumps virtue every time.

  Consider Russian composer Sergei Prokofiev, tormented in life by Stalin, his patron and jailer. Prokofiev had the extraordinary bad luck of dying on the same day as the great man, “ensconcing him forever in the tyrant’s shadow,” wrote critic Sarah Kaufman of the Washington Post, “where he remains branded as a compromised artist.”

  We should all hope to die well. By that, I don’t mean in the classic Greek sense of dying heroically, as in battle. I’m suggesting a much lower standard: just not dying badly. At a minimum, not dying comically—death by banana peel or pratfall or (my favorite, I confess) onstage, like the actor Harold Norman, killed in 1947 during an especially energetic sword fight in the last scene of Macbeth.

  There is also the particularly unwelcome death that not just ends a life but also undoes it, indeed steals it. The way Kitty Genovese’s was stolen. On March 13, 1964, she was repeatedly stabbed for 35 minutes in the street and in the foyer of her apartment building in Queens, New York. Many neighbors heard her scream. Not one helped. When the police eventually arrived, it was much too late. Her death became a sensation, her name a metaphor for urban alienation, her last hour an indictment of the pitiless American city.

  I’ve always been struck by the double injustice of her murder. Not only did the killer cut short her life amid immense terror and suffering, but he defined it. He—a stranger, an intruder—gave her a perverse immortality of a kind she never sought, never expected, never consented to. She surely thought that in her 28 years she had been building a life of joys and loves, struggle and achievement, friendship and fellowship. That and everything else she built her life into were simply swallowed up by the notoriety of her death, a notoriety unchosen and unbidden.

  That kind of double death can also result from an act of God. Disease, for example, can not just end your life; if it is exotic and dramatic enough, it can steal your identity as well. Without being consulted, you become an eponym. At least baseball great Lou Gehrig had the time and talent to be remembered for things other than what was generally known as ALS (amyotrophic lateral sclerosis). Ryan White, a teenager when he died in the early years of the AIDS epidemic, did not. He was hastily conscripted as poster boy for the Ryan White Comprehensive AIDS Resource Emergency (CARE) Act—defined by his dying, much like poor Megan Kanka, the little girl murdered by a sex offender in New Jersey, who lives today as Megan’s Law.

  No one grasps more greedily—and cruelly—the need for agency in death than does the greatest moral monster of our time: the suicide bomber. By choosing not only the time and place but the blood-soaked story that will accompany his death, he seeks to transcend and redeem an otherwise meaningless life. One day you are the alienated and insignificant Mohamed Atta; the next day, September 11, 2001, you join the annals of infamy with all the glory that brings in the darker precincts of humanity. It is the ultimate perversion of the “good death,” done for the worst of motives—self-creation through the annihilation of others. People often denounce such suicide attacks as “senseless.” On the contrary, they make all too much malevolent sense. There is great power in owning your own death—and even greater power in forever dispossessing your infidel victims of theirs.

  Time, March 1, 2007

  ESSAY: ON THE ETHICS OF EMBRYONIC RESEARCH

  THE PROBLEM

  You were once a single cell. Every one of the 100 trillion cells in your body today is a direct descendant of that zygote, the primordial cell formed by the union of mother’s egg and father’s sperm. Each one is genetically identical (allowing for copying errors and environmental damage along the way) to that cell. Therefore, if we scraped a cell from, say, the inner lining of your cheek, its DNA would be the same DNA that, years ago in the original zygote, contained the entire plan for creating you and every part of you.

  Here is the mystery: Why can the zygote, as it multiplies, produce every different kind of cell in the body—kidney, liver, brain, skin—while the skin cell is destined, however many times it multiplies, to remain skin forever? As the embryo matures, cells become specialized and lose their flexibility and plasticity. Once an adult cell has specialized—differentiated, in scientific lingo—it is stuck forever in that specialty. Skin is skin; kidney is kidney.

  Understanding that mystery holds the keys to the kingdom. The Holy Grail of modern biology is regenerative medicine. If we can figure out how to make a specialized adult cell dedifferentiate—unspecialize, i.e., revert way back to the embryonic stage, perhaps even to the original zygotic stage—and then grow it like an embryo under controlled circumstances, we could reproduce for you every kind of tissue or organ you might need. We could create a storehouse of repair parts for your body. And, if we let that dedifferentiated cell develop completely in a woman’s uterus, we will have created a copy of you—your clone.

  That is the promise and the menace of cloning. It has already been done in sheep, mice, goats, pigs, cows, and now cats and rabbits (though cloning rabbits seems an exercise in biological redundancy). There is no reason in principle why it cannot be done in humans. The question is: Should it be done?

  Notice that the cloning question is really two questions: (1) May we grow that dedifferentiated cell all the way into a cloned baby, a copy of you? That is called reproductive cloning. And (2) may we grow that dedifferentiated cell just into the embryonic stage and then mine it for parts, such as stem cells? That is called research cloning.

  Reproductive cloning is universally abhorred. In July 2001 the House of Representatives, a fairly good representative of the American people, took up the issue and not a single member defended reproductive cloning. Research cloning, however, is the hard one. Some members were prepared to permit the cloning of the human embryo in order to study and use its component parts, with the proviso that the embryo be destroyed before it grows into a fetus or child. They were a minority, however. Their amendment banning baby-making but permitting research cloning was defeated by 76 votes. On July 31, 2001, a bill outlawing all cloning passed the House decisively.

  Within weeks, perhaps days, the Senate will vote on essentially the same alternatives. On this vote will hinge the course of the genetic revolution at whose threshold we now stand.

  THE PROMISE

  This is how research cloning works. You take a donor egg from a woman, remove its nucleus, and inject the nucleus of, say, a skin cell from another person. It has been shown in animals that by the right manipulation you can trick the egg and the injected nucleus into dedifferentiating—that means giving up all the specialization of the skin cell and returning to its original state as a primordial cell that could become anything in the body.

  In other words, this cell becomes totipotent. It becomes the equivalent of the fertilized egg in normal procreation, except that instead of having chromosomes from two people, it has chromosomes from one. This cell then behaves precisely like an embryo. It divides. It develops. At four to seven days, it forms a “blastocyst” consisting of about 100 to 200 cells.

  The main objective of research cloning would be to disassemble this blastocyst: pull the stem cells out, grow them in the laboratory, and then try to tease them into becoming specific kinds of cells, say, kidney or heart or brain and so on.

  There would be two purposes for doing this: study or cure. You could take a cell from a person with a baffling disease, like Lou Gehrig’s, clone it into a blastocyst, pull the stem cells out, and then study them in order to try to understand the biology of the illness. Or you could begin with a cell from a person with Parkinson’s or a spinal cord injury, clone it, and tease out the stem cells to develop tissue that you would reinject into the original donor to, in theory, cure the Parkinson’s or spinal cord injury. The advantage of using a cloned cell rather than an ordinary stem cell is that, presumably, there would be no tissue rejection. It’s your own DNA. The body would recognize it. You’d have a perfect match.

  (Research cloning is sometimes called therapeutic cloning, but that is a misleading term. First, because therapy by reinjection is only one of the many uses to which this cloning can be put. Moreover, it is not therapeutic for the clone—indeed, the clone is invariably destroyed in the process—though it may be therapeutic for others. If you donate a kidney to your brother, it would be odd to call your operation a therapeutic nephrectomy. It is not. It’s a sacrificial nephrectomy.)

  The conquest of rejection is one of the principal rationales for research cloning. But there is reason to doubt this claim on scientific grounds. There is some empirical evidence in mice that cloned tissue may be rejected anyway (possibly because a clone contains a small amount of foreign—mitochondrial—DNA derived from the egg into which it was originally injected). Moreover, enormous advances are being made elsewhere in combating tissue rejection. The science of immune rejection is much more mature than the science of cloning. By the time we figure out how to do safe and reliable research cloning, the rejection problem may well be solved. And finally, there are less problematic alternatives—such as adult stem cells—that offer a promising alternative to cloning because they present no problem of tissue rejection and raise none of cloning’s moral conundrums.
