Unfair
The Supreme Court has given us an answer. Prosecutors are mostly upstanding people, but there are a few bad ones mixed in, as in any profession. Our existing legal regime adequately trains lawyers on their ethical obligations, but, at the end of the day, lawyers make their own choices. Prosecutors know their legal responsibilities; when a troubling incident like this one occurs, it’s because someone decided to disregard his duty and act in a dishonest way. And in the Court’s view, not much can be done to prevent it, because the problem lies with individual moral compasses, set years earlier, rather than with institutional and other situational pressures.
After being released from prison, Thompson successfully sued District Attorney Connick on the grounds that he had acted with deliberate indifference to the need to train prosecutors in their responsibilities to disclose exculpatory evidence, like the blood evidence that was kept from Thompson’s attorneys. But when the case made it to the Supreme Court, Justice Clarence Thomas, writing for the majority, overturned the $14 million in damages Thompson was set to receive—$1 million for each year he was on death row—relying on the above reasoning. In the majority’s opinion, ethical legal training is effective, the prosecutor’s role is clear, and there just wasn’t proof that Gerry Deegan’s actions were anything but an isolated incident.
Similarly, Justice Scalia, in a concurrence joined by Justice Alito, characterized the withholding of evidence as the result of the actions of a lone “miscreant prosecutor” who had committed “a bad-faith, knowing violation.” Deegan, in this framing, was a rogue agent—a loose cannon “willfully suppress[ing] evidence he believed to be exculpatory, in an effort to railroad Thompson.” To suggest that broader forces within the D.A.’s office might have contributed to the deprivation of Thompson’s rights was ludicrous. In any large barrel of apples, a few rotten ones are unavoidable: “the inevitability of mistakes over enough iterations of criminal trials” was something that society just had to accept.
This aligns with our common sense, although many of us would go even further. Yes, we should expect there to be good and bad prosecutors, but we also think there is something unsavory about lawyers in general. They are sharks or hired guns, ready and willing to do whatever it takes to win.
Despite the pervasiveness of these stereotypes, explanations of dishonest behavior that focus on the bad character of certain lawyers or all lawyers are largely inaccurate. Lawyers do cheat and lie. They do break the rules and cause immense suffering. When Deegan failed to turn over the blood test to Thompson’s defense and hid the actual sample, he was violating the law and, more fundamentally, breaching basic moral tenets: a man’s life was at stake, and Deegan kept him from learning of evidence that could have saved him. That is all true. The big question is, why? Until we have an accurate picture of what is driving things, we won’t make serious progress in eliminating attorney misconduct.
—
If you spend time studying dishonesty, you will quickly notice a strange paradox: although most of us care about being moral and ethical, we step over the line all the time. Our behavior can be baffling.
On the one hand, we are rule-following creatures. We stop at red lights and pick up after our dogs even when no one is around; we do not shout curse words when visiting elementary schools or reach our hand into the tip jar at Starbucks or grope passing strangers or hunt other humans for sport. And we celebrate our rules of moral living in our religions, in our professions, and in our schools. We have monuments to the Ten Commandments and mandatory ethics seminars and criminal codes. To be labeled as unethical is to be tagged with a badge of ignominy.
On the other hand, take a look around: we are swimming in a sea of dishonesty. At this very moment there are people right in front of us—strangers and those we know; our idols, our enemies, and our friends—behaving badly: employees lying on their time sheets or padding their expense accounts; athletes feigning fouls to win penalties or taking performance-enhancing drugs to gain an edge; cheating spouses and partners; men and women engaging in insurance fraud and tax fraud and defrauding their elderly relatives; millions of Americans downloading billions of songs and videos that they did not pay for.
There is evidence of pervasive cheating at nearly every stage of life. In light of recent data, it may be time to retire our romantic notions of youthful innocence. Cheating by students is rampant: over half of high school students admit to cheating, and the numbers appear to be just as bad or worse at the college level. Graduate students cheat, too, with business students leading the way (56 percent admit to doing so) and aspiring lawyers actually below average (45 percent). What’s more, top students appear just as likely as the rest of the bell curve to violate academic rules. In recent years, significant cheating has been revealed at the elite Stuyvesant High School in New York, the Air Force Academy, and Harvard.
In truth, people dodge and skirt, hustle and scam on Beggars Row and Wall Street and every avenue between. Watch your children, your spouse, your students, your co-workers, your employees, and you will see the game being played.
So what’s driving it all? Why do people cheat? And what is it about our legal system that makes prosecutors particularly vulnerable?
—
If dishonesty doesn’t come down to just a few people with bad characters, might the underlying commonsense model of why people cheat still be correct—that is, that people choose to cheat whenever the benefits outweigh the costs of potentially getting caught and punished?
Researchers decided to test that theory by paying people to solve number puzzles and varying the factors that we assume influence dishonesty. What they found was baffling. Making it less likely that participants would be caught lying about the number of matrices they had solved did not significantly increase the level of cheating—nor did increasing the amount of money that participants were paid for correct answers. Indeed, when the researchers raised the payout to $10 per matrix, cheating actually decreased.
Our dishonesty, then, is not a simple matter of cost-benefit analysis. Lots and lots of people engage in dishonest behavior, but even when provided with the chance and incentives, they don’t generally cheat “big.”
In one study documenting this phenomenon, behavioral scientists gave people the opportunity to cheat without getting caught on a multiple-choice test that awarded money for each general-knowledge question they got right. And sure enough, a very large number of study participants did. But each person cheated by a relatively small amount—only 20 percent of the amount he could have gotten away with.
It’s as if there is something inside of people limiting how fast the dishonesty engine will turn. According to researchers, that mechanism may be our own egos. We are each strongly motivated to maintain our image as a virtuous person—and that motivation can act as a powerful constraint on our self-interested actions. We want to believe that we are honest and ethical, and when we cheat, we endanger that rosy self-view. The more an instance of cheating threatens to darken our picture of ourselves, the less likely we are to act.
In an interesting demonstration of this dynamic, a group of experimenters presented participants with an opportunity to earn money from cheating by lying about how many times a coin they had flipped had come up heads. Although all participants were aware that they could cheat without getting caught, some participants were told, “Please don’t cheat,” while others were told, “Please don’t be a cheater.” It’s hard to imagine that such a subtle linguistic cue would have any effect at all, but the researchers found that it did. When the verb “cheat” was used, some people still cheated, but when the self-relevant noun “cheater” was used, participants’ identities were suddenly implicated, and there was no cheating.
To feel good about ourselves, the obvious answer is to swear off dishonesty or keep it small. Cheating on a few questions in a psychology experiment doesn’t make us feel like a bad person, nor does failing to report a small gambling win on our taxes. But there’s another way to address the inner conflict that arises when our actions are at odds with our positive self-image: trick ourselves into thinking we’re not behaving so badly after all.
So, although 51 percent of high school students in one study admitted to having recently cheated on a test, and 61 percent admitted to having lied to a teacher, and 20 percent admitted to having stolen something from a store, 93 percent of those surveyed reported that they were “satisfied with [their] own ethics and character.” Likewise, while taxpayers end up defrauding the government of some $385 billion each year—often by failing to report income—well over 90 percent of the public agrees that “it is every American’s civic duty to pay their fair share of taxes” and that “everyone who cheats on their taxes should be held accountable.” Though we are reluctant to acknowledge it, we all have this capacity for self-deception.
If we want to understand why people act dishonestly, we need to look at the factors that determine how easy or difficult it is to rationalize dishonest behavior. When justifying our actions is a struggle, we find it more difficult to break the rules. And therein lies the key to prosecutorial misconduct: most lawyers aren’t consciously trying to cheat defendants; they’re just extremely good at deceiving themselves.
—
One of the most promising strategies that people use to rationalize an ethical breach is to downplay the causal link between the dishonest action and any associated harm. If I can convince myself that my behavior is unlikely to be damaging, it becomes much easier for me to justify it and maintain my positive self-view. And the greater the distance—and the more intervening elements—between what I’m thinking about doing and any clear detriment, the less motivated I’ll be to choose the honest option. For example, scientists have found that people will cheat about twice as much when they are cheating to earn tokens that can be exchanged for money as when they are cheating to earn money directly.
To understand why this is relevant to prosecutorial misconduct, consider two attorneys. One, like Deegan, is deciding whether to hand over a potentially exculpatory report on blood evidence found at the scene of the crime; the other is deciding whether to pay a $10,000 bribe to a wavering juror to gain a conviction.
Research would suggest that the second prosecutor is much less likely than the first to take the dishonest action, because it is harder for him to see his actions as innocuous. After all, his actions come right at the critical moment of decision, and there are no other apparent intervening actors who can be blamed. By paying off the decision-maker, the second prosecutor is determining the outcome.
The first prosecutor, by contrast, is operating at a safe distance from any harmful consequences. Even if he doesn’t hand over the report, the defendant’s lawyers are likely to uncover additional evidence to vindicate him—if he is innocent—on their own. As a result, they should be able to present a convincing case to the jury, which should acquit. So the blame for any wrongful conviction would fall squarely on the defense attorneys and jurors.
In the Thompson case, the fact that Deegan was working on the robbery trial, not the murder trial, would have made it even easier for him to distance his actions from Thompson’s death sentence: after he completed his own role in the prosecution, there was still an entire additional trial that had to take place.
The nature of the adversarial system, which places the burden on lawyers to zealously advocate for their positions but does not charge them with the task of being the ultimate decision-makers, may itself promote dishonesty by allowing attorneys to feel less responsible for the consequences of their actions. And the earlier in the trial process it is, the more attenuated the connection to any eventual harm is likely to seem. All other things being equal, we ought to expect more dishonesty in the weeks leading up to trial than in the time after a jury is impaneled and proceedings have begun. Once a conviction has been attained, the process ought to reverse itself: new dishonest actions that simply maintain the status quo (for example, denying the defendant a fair appeal or a parole hearing through a dishonest act) should become easier to rationalize.
Since Thompson was on death row when Deegan revealed to his friend Riehlmann that he had withheld potentially exonerating evidence in Thompson’s case, it would seem harder for Riehlmann to justify not passing on the information to Thompson’s lawyers. But unlike Deegan, Riehlmann had not worked on the case and hadn’t failed to hand over evidence. He had only heard about the violation. Just as important, Riehlmann had learned about it well after Thompson had already been convicted and put on death row. And given that Thompson’s blood type was not known, it would not have been clear to Riehlmann that passing on the information to Thompson’s attorneys would make any difference at all. Indeed, it was possible that even if the report had been handed over, the defendant might have turned out to have the same blood type as the perpetrator, which would have made him more likely to be convicted, not less. Although we can’t know for sure what led Riehlmann to keep Deegan’s confession secret for five years as Thompson’s life hung in the balance, it seems plausible that someone in his position might do so without feeling much responsibility at all.
Note, too, that omissions and commissions are viewed differently. It is easier to see the harm in a commission (the hypothetical prosecutor bribing a juror) than in an omission (Deegan failing to turn over the lab report). An omission does not seem to upset the “natural” course of things, whereas a commission seems to steer events onto a different path. And omissions tend to be more readily justified because there are almost always benign reasons for not doing something: “I didn’t understand it was my responsibility,” “No one told me,” or simply “I forgot.”
So, the fact that Brady violations involve a prosecutor not doing something that she is supposed to do may make them particularly likely to occur. Other common types of prosecutorial misconduct involve omissions as well: for example, failing to alert the court when you know your witness is lying on the stand, or turning a blind eye to a law enforcement officer concealing or destroying evidence.
It can also feel like we’re not taking action if there is someone (or something) else we can characterize as controlling our behavior. If our boss made us do it, we are not responsible for what happened. Attorneys in the criminal justice system are often acting at the behest of another: an assistant D.A. will be very aware that he is working for the D.A., just as a public defender will be aware that it is his client who ultimately calls the shots. As a result, the establishment of rigid hierarchies and chains of command may provide a sturdy ladder for misconduct.
—
Another way we manage to justify bending the rules is through social comparison. Of the dishonesty I’ve witnessed in my life, it’s uncanny how many of the incidents involved groups of people rather than lone individuals. One person crossed the line and then suddenly three others were right there with him. Being in a group seems to alter people’s moral compasses, aligning the dials and leading people who would otherwise act ethically toward trouble. Researchers have begun to look more deeply at this seemingly infectious aspect of dishonesty. Rather than viewing our behavior in absolute terms (“Is it moral or immoral to cheat on this test?”), we measure our actions against those of people around us (“Are my best friends sharing answers on the exam?”).
In a recent study, psychologists had a group of people complete a test in which the better each person performed, the more money he earned. One of the “test-takers,” who was actually working for the experimenters, was instructed to cheat very publicly by standing up, just sixty seconds into the test, and announcing that he had finished, had gotten everything right, and would therefore be getting the maximum payment. The question was whether this behavior would encourage other people to cheat. It did, but only when the cheater seemed to belong to the same group as everyone else in the room—that is, when he was wearing a T-shirt from the university where the experiment was being administered. When he was wearing a rival university’s T-shirt, cheating actually dropped significantly below the control condition.
In an adversarial legal system, lawyers have especially strong group identifications—it’s the prosecution versus the defense—and under such circumstances they should be particularly susceptible to moral cues from their compatriots. It is revealing in this regard that episodes of prosecutorial misconduct are often not isolated, as we would expect if it were a simple matter of rogue agents pursuing their own corrupt ends.
Responding to Justice Scalia’s characterization of the prosecutorial misconduct in Thompson’s case as “a single Brady violation by one of [Orleans Parish’s]…prosecutors,” Justice Ruth Bader Ginsburg, in a blistering dissent (which she read from the bench), pointed out that there were actually “no fewer than five prosecutors” who had acted to deprive Thompson of his rights and had “kept from him, year upon year, evidence vital to his defense.” As she described, this “was no momentary oversight, no single incident of a lone officer’s misconduct”: “Throughout the pretrial and trial proceedings against Thompson, the team of four engaged in prosecuting him for armed robbery and murder hid from the defense and the court exculpatory information Thompson requested and had a constitutional right to receive.”
Bruce Whittaker, the prosecutor who had initially approved the armed robbery indictment for Thompson, had received the crime lab report showing that the perpetrator’s blood was type B and had placed it on the desk of James Williams, an assistant D.A. who was trying the case. But neither man turned it over to Thompson’s counsel—despite an official request for all information “favorable to the defendant” and germane “to the issue of guilt or punishment,” including “any results or reports” of “scientific tests or experiments.” Then Deegan, who was working with Williams, transferred all of the physical evidence from the police property room to the courthouse property room, but left out the bloodstained swatch. Neither Williams nor Deegan mentioned the swatch or the crime lab report during the trial—and the swatch was never seen again.