Suppose a cult leader calls you crazy

January 22nd, 2014

One day this fall my social psychology seminar veered into an interesting discussion about the ethics of persuasion, and the class agreed that cult leaders are the paragons of this dark art. I was less sure. “What is it that makes their persuasion unethical?” I asked the class. “Can’t the same thing be said about vegetarians and atheists? What makes it ethical when you try to persuade people about your beliefs?” My professor replied, “The difference is, I appreciate reality for how it really is.” “But,” I responded, “isn’t that exactly what a cult leader would say?” The class rippled with nervous laughter. The professor was briefly silent. “Yeah, I guess that’s right, huh…?” Objectivity is what makes persuasion seem ethical; but in practice, “objective” normally means “what I want to be true.”

Naive realism

This property of belief is called naive realism, a philosophical term appropriated by psychologists Lee Ross and Andrew Ward (full text). According to Ross and Ward, we all feel that our perception of reality is essentially accurate, and that our beliefs are merely a consequence of seeing the world as it really is. It’s clear to us that cult leaders believe impossible things, but they would surely say the same about us. How do we figure out who’s right? Naive realism is an error of perspective taking, as my professor accidentally demonstrated in class. So to clear up the confusion, we’re going to look at things from the other side.

“Why,” the doomsday prophet laments, “do people still not see the Truth?”

“The end is nigh, and it is my job to make the world repent.”

Despite his earnestness and the verity of his words, he fails to win any converts. Why do they still not believe?

“It must be that other people don’t, or can’t, see the facts. Maybe they’re simply incapable of connecting the dots. If I show them the way, they can’t possibly ignore the Truth.”

After another day of miserable failure, The Prophet sits ruefully on his soap box, head in hands. The world is doomed. How does he make sense of this?

“I’m sure I’m right,” he says. “Those who don’t believe are hopelessly corrupted.”

The Prophet fails because he presumes that he alone understands the real Truth about the world. Ross and Ward note that we’re all prophets in this way, treating our own perspective on issues as sacrosanct, with results that convince us, too, that doomsday is imminent. No doubt you’ve often inspired the same thoughts in others. Why does disagreement breed hopelessness?

Different construals resemble different realities

Certainly there exists a “real” world, yielding to interrogation, which sits right in front of our eyes. But we exist behind them; we all construct our own versions of reality, decoded by our brains and therefore biased by our motivations. (Of all the countless demonstrations of this fact, my favorites can be found here.) The world is messy and ambiguous, and we’re virtuosic at selecting facts that will convince us we’re right. As a salient example, Ross and Ward note that Israelis and Palestinians both see prejudice in objective, identical news reports, because each group is motivated to believe that their arguments are treated unfairly—how else could anyone deny their obvious correctness?

Both sides understand the relevant issues, but the core issues for each are distinct. For example, death penalty opponents promote arguments referencing compassion and equality, while supporters view capital punishment in terms of personal responsibility and deterrence. Hence, the groups often talk past each other; when a death penalty opponent cites the value of compassion, a supporter is likely to accuse her of intentionally avoiding the “real” issue—i.e., the issue he cares about. We see bias in the fact that people’s reasoning conveniently matches their preferences. They are proceeding from the top down, we think, and letting their motivations dictate their understanding of the facts. Of course, our beliefs match our preferences, but that’s because our reasoning works from the bottom up, with the evidence determining our preferences. That they match testifies to our objectivity.

Both parties have framed the debate to support their favored conclusions, yet both have accused the other of bias. Because of this misperception, each group invents motives that help them to rationalize the other’s behavior (e.g., “It’s not possible to seriously believe the death penalty is fair. He’s just a racist.”).

Certainty and Ambiguity

Suppose you tried to tell the prophet that the world is ambiguous: that his interpretation is only one of many reasonable perspectives, and that its apparent superiority owes to the simple fact that it accords with his preferences. Do you think he would reconsider? Probably not. How about you?

Getting people to recognize the failings of their reasoning abilities is an incredible challenge. We all need to feel like our beliefs are based on durable bottom-up reasoning. To accept that other people can come to diametrically opposite conclusions, using identical reasoning processes, is profoundly dissonance-inducing. It suggests that the facts we consider to be so clearly on our side only seem that way because we want them to; that when we seek to defend our beliefs we’re rarely honest in evaluating the evidence; and that if we were truly objective we might be forced to reconsider our entire worldview.

How can we use this information?

Reasoning biases are impossible to eliminate, and our only hope for objectivity is by keeping them in the front of our minds. Realize that people can disagree with you for good reasons. Remember that you also have a subjective point of view. Consider how your motivations affect your disagreements, and try to see things from the other perspective. In other words, increase your empathy and decrease your pride—it might create the understanding you were hoping for.


§ 2 Responses to Suppose a cult leader calls you crazy

  • Dad says:

    This is really a well developed essay.

    Reading it made me think of the JLT a.k.a. NYLT exercise and leadership goal of “representing the group.” The SPL meets with and accepts the plan, being persuaded by the Scoutmaster, and then must convince the ASPL and PLs that he has the right idea to act on. This real scout exercise is very dynamic and not so encumbered with pre-conceived bias perhaps, but still a good parallel illustration of ethical persuasion.

    The most interesting and, in my opinion, the best part of this blog submittal is the final section on how to use this information, which is clear and simple, and should be required reading by everyone seeking a career in diplomacy.

  • henry says:

    hello


You are currently reading Suppose a cult leader calls you crazy at Exploring Ideology with Social Psychology.
