We’ve all encountered it more and more these days: that person on Facebook, a friend, a relative, an old high school or college buddy who believes things that have little to no basis in fact. Any reasonable attempt on our part to persuade them with reliable facts only seems to make them believe more strongly in their erroneous beliefs and the reasoning (or lack thereof) they use to justify them. Many of us have, I’m sure, asked ourselves during or after such encounters, “How do you debate with or persuade someone who doesn’t believe in facts? Who is totally closed off from even considering any point of view but their own?” This is a VITAL question for our age. Changing hearts and minds will be the difference between society moving forward or slipping backward into a dark age America has not yet known.
In seeking answers to this question of how to talk to such people, I recently came across more than one article on Facebook about the phenomenon of how the ignorance of ignorant people becomes self-regenerating and self-protecting. One article was about the Dunning-Kruger Effect and another was about the Backfire Effect.
The Dunning-Kruger Effect occurs when someone with little to no knowledge, talent, or expertise in a particular area nonetheless perceives themselves as being very good and reliable in that area. Tom Nichols writes in Foreign Affairs:
In 2014, following the Russian invasion of Crimea, The Washington Post published the results of a poll that asked Americans about whether the United States should intervene militarily in Ukraine. Only one in six could identify Ukraine on a map; the median response was off by about 1,800 miles. But this lack of knowledge did not stop people from expressing pointed views. In fact, the respondents favored intervention in direct proportion to their ignorance. Put another way, the people who thought Ukraine was located in Latin America or Australia were the most enthusiastic about using military force there.
The other article was about a related phenomenon called the Backfire Effect. The Backfire Effect, as I understand it, involves a deliberate rejection of, or simple ignorance of, critical thinking as a whole. A pre-existing perception, regardless of whether it is based on accurate information, is held as sacred. Any attempt to disprove that belief is taken as an attempt to call the holder of that belief stupid, and therefore as a show of disrespect: “You disagree with me, therefore you don’t respect me.” This creates a persecution complex that makes these people defensive and leads them to cling all the more tightly to the beliefs they perceive as being under attack. Any facts used to attack those beliefs must be inaccurate, or at least ill-intentioned, because to admit their validity is to admit you might be wrong, and therefore stupid enough to have believed the wrong thing in the first place.
Forget the scientific method of hypothesis, testing, and only then arriving at a theory. These two psychological effects totally paralyze one’s ability to think critically. One’s ability to soberly comprehend reality, evaluate trustworthiness, and otherwise make sound decisions is also sorely inhibited.
So how do we break through then? How can ignorance that proudly insists on being called expertise be persuaded that it isn’t? Neither of the articles I read seemed to offer any concrete advice or solutions, although the end of Tom Nichols’ article in Foreign Affairs came close. Then I looked up the Backfire Effect and found a RationalWiki article about it that directly addresses its causes and some ways to fight it. I highly recommend reading it, but here are my takeaways and interpretations as they relate to the problem of political discourse with people who are affected by the Backfire Effect and the Dunning-Kruger Effect.
Critical thinking is the key thing these people are missing. So often when well-informed, thoughtful people try to persuade or inform less informed, less thoughtful people, it ends with the thoughtful, informed people concluding that those who are less informed are simply stupid. This is not so. Less educated or less thoughtful, perhaps, but not necessarily stupid. Some people are indeed less intelligent than others, but most everyone is capable of some degree of critical thinking and analysis. Critical thinking is a skill that takes practice. Like a muscle that grows weak from lack of use, the habits of critical thinking atrophy and become less likely to be exercised.
Conversations with people who have clearly not used facts or critical thinking to arrive at their opinions need to be about introducing them to these invaluable tools in a way that doesn’t come off as telling them how to think. Right though you may be, telling someone they are wrong or stupid for thinking a certain way, or for arriving at a conclusion a certain way, simply makes them angry and less likely to even consider listening to you or anything you have to say. By insulting them, accidentally or on purpose, you’ve made them an enemy of you and the ideals you’re trying to advance. And even though writing them off as fools destined to go on happily marching to Hell while calling it Heaven might make you feel superior and good about yourself in the short term, in the long term you have FAILED MISERABLY at your goal of actually persuading someone.
So what’s the silver bullet for talking to and persuading such people? Not more facts, not rhetorical devices or mind games. The answer is simple, but difficult: compassion, listening, and patience. The less you arouse their anger and yours, the more likely both of you are to listen to one another. The less emotion that enters the conversation, the more room that is left for logic and sober thinking. As for patience, no one’s mind is changed overnight, especially if their beliefs are deeply ingrained. So the next time you get into a conversation with “one of those people,” take a deep breath, and remember to take many more deep breaths after that.