A Manual for Creating Atheists


by Peter Boghossian


  PB: I’m curious, if you don’t mind me asking, what did that guy say to you? The conversation looked really heated.

  W1: He wanted us to come to his church.

  PB: Now? At 8:00 p.m.?

  W1: No, no.

  PB: What did you say?

  W1: We said we already have a church.

  PB: Oh. So why didn’t you want to go to his church? Then you’d have two churches.

  (Perplexed)

  W2: What?

  PB: I mean if having one church is good, maybe having two churches is better. I mean, that way you’d cover some of your bases. What if in your church they’re missing something key, but if in this other church they have what your church is missing?

  W1: Our church isn’t missing anything.

  PB: Oh. How do you know that?

  W2: Know what? What do you mean?

  PB: I mean how do you know that your church has everything you need, or that they’ve gotten all of the rules right and such, and that his church might have picked up on something that your church overlooked?

  W1: What are you talking about?

  PB: I’m talking about one reason I think you’re going to church. You want to be saved, right? Am I right?

  W1: Of course. We are saved.

  PB: That’s really great. Did they tell you that in your church?

  W2: Yeah, kind of.

  W1: Yes.

  PB: So then if you’re already saved, why did that man want you to go to his church?

  W1: What?

  PB: When you told that man that you went to your own church and that you’re already saved, which I assume you told him, why did he then want you to go to his church? Why would he want that? What would be the point of going to his church if you’re already saved?

  (Long pause)

  PB: Why didn’t you tell him that he’s the one who should be going to your church because you’re already saved?

  W1: We don’t care where he goes to church.

  PB: But he obviously cares where you go to church. He must think you’re not saved or he wouldn’t want you to go to his church. So if he thinks you’re not saved because you don’t go to his church, and you think you are saved because you do go to your church, how do you know you’re actually saved? Someone has to be wrong. How do you know it’s not you?

  W2: Because we know we’re saved. We know it.

  PB: But he knows you’re not saved. In fact, I think he’s more certain that you’re not saved than you are that you are saved.

  W2: We’re saved. We’re saved.

  PB: Don’t you think it’s strange that a fellow Christian would want you to leave your church?

  W1: What do you mean?

  PB: If you’re already saved why would it make any difference which church you go to?

  (Pause)

  PB: If you’re already saved then why would it make any difference which church you go to?

  W1: I guess it wouldn’t matter.

  PB: So if you’re already saved, why go to church at all?

  (Pause)

  W1: I don’t really know.

  (End of conversation)

  DIG DEEPER

  Article

  Daniel Dennett and Linda LaScola, “Preachers Who Are Not Believers” (Dennett & LaScola, 2010)

  Blog

  Matt McCormick, “The Defeasibility Test” (McCormick, 2011)

  Books

  Christopher Muran and Jacques Barber, The Therapeutic Alliance: An Evidence-Based Guide to Practice (Muran & Barber, 2010) (Focus on pp. 7–29, 97–210, and 285–320)

  Daniel Dennett, Breaking the Spell (Dennett, 2007)

  William Miller and Stephen Rollnick, Motivational Interviewing (Miller & Rollnick, 2002) (Focus on pp. 3–179)

  Dan Barker, Godless: How an Evangelical Preacher Became One of America’s Leading Atheists (Barker, 2008)

  For a frightening glimpse into the Christian world of “Relationship Evangelism,” see:

  Shawn Anderson, Living Dangerously: Seven Keys to Intentional Discipleship (Anderson, 2010)

  Arron Chambers, Eats with Sinners: Reaching Hungry People Like Jesus Did (Chambers, 2009)

  Dave Earley and David Wheeler, Evangelism Is …: How to Share Jesus with Passion and Confidence (Earley & Wheeler, 2010)

  NOTES

  Airplanes offer a fantastic opportunity to practice Street Epistemology—particularly if you fly Southwest Airlines, or any other airline that doesn’t have assigned seats. I usually get on the plane a little later and try to sit next to someone reading a religious text. Middle seats are good, as they increase your chance of sitting next to someone of faith.

  The creation of nonadversarial relationships is a necessary condition for a successful treatment. Trustfulness of reason and willingness to reconsider are two crucial posttreatment attitudes the faithful need in order to make a full recovery.

  I find this easy to do with rank and file believers, but difficult with faith leaders. I’m often left with the suspicion that many reflective and published apologists don’t genuinely and sincerely believe what they claim to believe. I’ve always found Dinesh D’Souza to be an example of someone who’s insincere; Ravi Zacharias, who appears to me as someone who suffers from pathognomonic delusions, strikes me as someone who’s sincere. Toward the end of many of my conversations with apologists, I’m left with the feeling that they’ve often said things or taken positions to justify their beliefs to themselves. I find it bizarre when, during the pauses in our conversation, they wait for my approbation and seem disappointed when it’s not forthcoming.

  Shermer has noted that the smarter someone is, the better they are at rationalizing. I think he’s correct. Smart apologists are good at generating reasons for why they believe their irrational beliefs are true—and they spend a good deal of their time doing just that.

  Rank and file believers do not fill their days with thinking about how to defend their faith. The combination of insincerity, intelligence, and intentionally leading or coaxing others into an unreliable epistemology makes it difficult to be open in these communicative engagements.

  Often, the same people have a lower threshold for what constitutes reliable evidence. For more, see Mele’s “Have I Unmasked Self-Deception or Am I Self-Deceived?” (Mele, 2009, especially p. 264). The original citation can be found here: Trope, Y., & Liberman, A. (1996). Social hypothesis-testing: Cognitive and motivational mechanisms. In E. T. Higgins & A. W. Kruglanski (Eds.), Social psychology: Handbook of basic principles (pp. 237–270). New York: Guilford Press.

  One way to conceptualize the relationship between belief and evidence is through Israeli-American psychologist and economist Daniel Kahneman’s System 1 and System 2 thinking (Kahneman, 2011). System 1 thinking (intuition) is instantaneous, automatic, subconscious, and often has some degree of emotional valence; it is the product of habit and is resistant to change. System 2 thinking (reasoning) is much slower, more subject to change, more conscious, and requires more effort. Many beliefs are formed on the basis of fast System 1 thinking. Doxastic closure can come about when people lack the capacity to bring evidence to bear on their System 1 thinking—that is, their System 1 thinking is invulnerable to System 2 thinking. They haven’t developed the ability to allow System 2 thinking to penetrate System 1 beliefs.

  Dawkins explicitly stated that he will not debate creationists (Dawkins, 2006b). Noting Stephen Gould’s advice, he writes, “‘Don’t do it.’ The point is not, he said, whether or not you would ‘win’ the debate. Winning is not what the creationists realistically aspire to. For them, it is sufficient that the debate happens at all. They need the publicity. We don’t. To the gullible public which is their natural constituency, it is enough that their man is seen sharing a platform with a real scientist. ‘There must be something in creationism, or Dr. So-and-So would not have agreed to debate it on equal terms.’” I’d go beyond this and state that for a reputable scientist to publicly debate a creationist borders on being unethical. Providing a platform for someone who suffers from a pathogenic belief may push the creationist even further into delusion.

  For more here, I highly recommend Schick and Vaughn’s “How to Think about Weird Things” (Schick & Vaughn, 2008). Specifically, pp. 179–189 cover the following criteria of adequacy: testability (180), fruitfulness (182), scope (185), simplicity (186), and conservatism (189). Arguing about what constitutes evidence and what the criteria for evidence are usually results in shifting the discussion into ever-receding tangents. Such shifts are common rhetorical tactics of apologists. If you choose to enter into a discussion about what constitutes reliable evidence, I suggest you carefully read the Schick and Vaughn text.

  This is particularly true among intelligent, articulate apologists. The more intelligent and articulate the apologist, the more conspicuous and epistemologically enfeebling the confirmation bias.

  It’s very difficult to start from a position of belief neutrality because everyone suffers from some form of confirmation bias—myself included. When I read the work of religious apologists, for example, I find myself incredulous and in a state of perpetual marvel that intelligent, thoughtful people could seriously entertain such hokum. I have to force myself to step back, to intellectually open myself up to looking at their evidence and, more importantly, the process of reasoning that they use to come to their conclusions. The process of genuinely opening oneself up to competing ideas is vital for one’s intellectual life, because it prevents doxastic closure. How one engenders this attitude in the first place, however, is complicated and subject to many personal, psychological, social, and emotional variables.

  Metaphysical discussions center on the furniture of the universe—what exists or does not exist. Bringing metaphysics into a discussion is usually fruitless and may even be counterproductive, in some cases pushing people further into their faith and metaphysical delusions. Conversations about what there is, as opposed to how one knows what there is, cannot gain cognitive traction because the entities in question (Gods, angels, demons) have no attributes that leave a footprint in the natural realm. Given this starting condition, there’s nowhere for the conversation to move. Consequently, these discussions almost invariably devolve into he said, she said. One reason many people assign belief in God a high number on the Dawkins’ Scale is because they started with metaphysics and worked their way back to epistemology. That is, people started with the belief God exists and then asked themselves how they know this. This is confirmation bias. No discussion of alternative formulations of what there is (maybe there’s a God but it’s somehow limited, maybe there was a God but in creating the universe it extinguished itself) will divorce this self-interested bond with metaphysics.

  An interesting question is whether it’s even possible to knowingly use an epistemology that will not guide one to the truth. For example, if one knows goat sacrifice will not lead one to the truth about how to build a better car battery, is it possible one can use the process of goat sacrifice and make oneself believe that it is a reliable way to build a better car battery? The opposing views of Clifford and James are helpful here. Clifford (Clifford, 2007) basically shares the conception of knowledge put forth by Plato in the Theaetetus: Knowledge = Justified True Belief. Knowledge is not a fuzzy thing that we can decide to have or not. For Clifford, one can’t decide to believe something. You lend your belief to a proposition because you’re forced to believe it by, among other things, the thoughtfulness that you have given to the problem. From Clifford’s perspective it’s not possible to force yourself to believe anything. And if somehow you do manage to force yourself to believe something, then you have a kind of epistemological sickness.

  William James takes a very different position (James, 1897). For James, we don’t always know things or believe things or regard the world in terms of the appreciation of evidence. Our attitude about how we go about our lives is everything. James went up to his room at his parents’ house and pretty much stayed there for years, thinking through questions of belief. James comes to the opposite conclusion from Clifford: one can decide to believe certain things; one can make a decision to believe something; it’s healthy for one to do so in certain cases and it would be unhealthy for one to do so in other cases. As a pragmatist, James is saying that his concern isn’t whether a belief is true according to some abstract standard of truth; rather, his concern is whether it is going to serve one’s purposes in living a fulfilling human life. Thus, James’s answer and Clifford’s answer are in direct opposition. The issue at stake in this debate is whether the human part of ourselves can supersede our scientific reasoning or whether some sort of appreciation of scientific evidence should supersede our human feelings.

  Choosing to believe a particular proposition is referred to in the philosophical literature as “doxastic voluntarism.” While James predates this literature, one of the examples James uses to demonstrate belief choice is in the health arena. One cannot, James argues, “by any effort of our will, or by any strength of wish that it were true, believe ourselves well and about when we are roaring with rheumatism in bed” (James, 1897, p. 5). However, evidence suggests that often people do in fact believe that they are well when they are quite ill (Livneh, 2009; Vos & de Haes, 2007). Whether this is a conscious choice remains unclear. Conversely, often people believe they are sick when they are well.

  Among the interesting manifestations of this phenomenon are what medical anthropologists term “culture-bound syndromes,” which have recently been included in the DSM-IV (Bernstein & Gaw, 1990). Culture-bound syndromes are diseases recognized only within a specific culture or society. Koro, for example, is the unsubstantiated belief that one’s penis is retracting into one’s body, and that it will ultimately disappear (Edwards, 1984). Koro is primarily found in China and Southeast Asia, though recently it has appeared in parts of the developing world, and even among the intellectually disabled (Faccini, 2009).

  Overlapping and particularly interesting arguments about belief choice, specifically in regard to God, can be found in the literature on Pascal’s Wager. Pascal’s Wager states that one should bet as if God exists, and consequently believe and live as if God exists, because if one does so then one has everything to gain and nothing to lose. One line of criticism is that one cannot force oneself to believe in God. Harris articulates this in a piece for the Washington Post titled “The Empty Wager” (Harris, 2007). Harris writes, “But the greatest problem with the wager … is its suggestion that a rational person can knowingly will himself to believe a proposition for which he has no evidence. A person can profess any creed he likes, of course, but to really believe something, he must also believe that the belief under consideration is true” (Harris, 2007). Many Christian apologists, and even some secular writers (Braithwaite, 1998, pp. 37–44), would disagree.

  A common thread among these “God exists” discussions is that one is attempting to force oneself to believe. To my knowledge there have been no empirical studies demonstrating whether it’s possible to force oneself to believe in God, or perhaps more effortlessly, to believe trivial propositions (like whether or not a McDonald’s hamburger bun has more sesame seeds than a Burger King bun). Furthermore, it is unclear how to test whether it is possible to force oneself to believe in various propositions.

  Here are a few of the more well-known examples: Ted Haggard (had sex with a male prostitute and used crystal meth), Peter Popoff (exposed by James Randi for using a concealed radio receiver to deceive his flock), Jimmy Swaggart (had sex with a prostitute), Father Murphy (sexually abused countless deaf children), W. V. Grant (a faith healer who was exposed for using magic tricks to fool his followers), Father Thomas Laughlin (molested underage boys), Monsignor William Lynn (covered up countless cases of priest molestations and rapes by moving priests to new parishes), Benny Hinn (televangelist and faith healer exposed on Dateline NBC), Bishop James Hunter (arrested for selling drugs), Terry Hornbuckle (drugged and raped women in his congregation), Anthony Martinez-Garduno (sold meth and “date rape” drugs at his church), Ryan Jay Muehlhauser (sexually assaulted two homosexual men while allegedly attempting to turn them straight during a counseling session about their sexual orientation), Oscar Perez (sexually assaulted boys in his congregation).

  Occasionally I’m told that no person of “true faith” would ever do anything immoral. This is a version of British philosopher Antony Flew’s “No True Scotsman” fallacy: Imagine a Scotsman reading the morning paper. He sees an article about how a person from Scotland commits an act of brutality against an innocent. He responds, “No true Scotsman would ever do that.” When reading the paper the next morning, he sees an article about a person from Scotland who does something even more horrific. He repeats, “No true Scotsman would ever do that.” This is one of the few instances when I don’t generate an example of a person of faith who is immoral. If I generate the example it could be met with, “Well, no person of true faith would ever do that.” It’s better to have subjects generate their own examples to avoid this fallacy.

  American philosopher Matt McCormick offers a variant of this that he calls “The Defeasibility Test” (McCormick, 2011). It’s worth carefully reading. (“What would it take for you to abandon your faith?” is the first question I ask people interested in debating me.) McCormick writes, “Are there any considerations, arguments, evidence, or reasons, even hypothetically, that could possibly lead me to change my mind about God? Is it even a remotely possible outcome that in carefully and thoughtfully reflecting on the broadest and most even body of evidence that I can grasp, that I would come to think that my current view about God is mistaken? That is to say, is my belief defeasible? If the answer is no, then we’re done. There is nothing informative, constructive, or interesting to be found in your contribution to dialogue. Anything you have to say amounts to sophistry. We can’t take your input any more seriously than the lawyer who is a master of casuistry and who can provide rhetorically masterful defenses of every side of an issue. She’s not interested in the truth, only in scoring debate points or the construction of elaborate rhetorical castles (that float on air).”
