When you fundamentally disagree with someone, what is the best way to change their mind? Actually, that’s a trick question, even though I didn’t realize it when I wrote it.
It’s a trick question because it’s impossible to change someone’s mind. They have to change their own mind. You might be able to say or do something that makes that more likely, but in the end, they—and only they—decide whether to change their opinion or belief. As David McRaney, author of How Minds Change, says: “All persuasion is self-persuasion.”
So, what is the most effective way to improve the odds that they will change their mind about the topic you disagree about?
Those of us who like to see ourselves as logical, reasonable critical thinkers (and who doesn’t?) believe we know the answer: present evidence that the other person doesn’t have or that disproves what they think they know, and weave that evidence into a logical chain or coherent story which they can’t honestly contradict. That’s why logical persuading is by far the most commonly used persuasion technique worldwide, including in countries as diverse as China and America, Germany and India.[1]
But how well does logical persuading work in actual practice? Think for a minute about your won-loss record for arguments you’ve had with other people. How many times have you actually “won” an argument? How many times has your counterpart said: “I didn’t know that”, or “I hadn’t thought about it that way”, or “Thanks for setting me straight”? On the other hand, how many times have the same words emanated from your mouth? My guess is that the overwhelming percentage of your debates have ended in a draw, meaning that neither party budged much from their initial opinion.
I’m not saying that evidence and proper logic don’t have a place in persuading someone. In many situations, they will be decisive. If I’m comparing brands of washing machines, prices, features, and customer ratings will carry more weight than emotions. But when you’re trying to get someone to change their mind about a deeply held belief, evidence and logic are not enough, and may even backfire.
Why don’t facts and evidence work well in overcoming deeply held beliefs and opinions? The first reason is that people often didn’t acquire their beliefs consciously, or because of verifiable facts. Do you remember when you began to believe in God, or to disbelieve? Do you recall the reasoning that led you to be patriotic, or to register with your political party? For any of these types of beliefs, how often have you reexamined them in the light of contradictory evidence?
In reality, we form many of our beliefs and opinions first, and then find support for them only if needed, such as when arguing for our position.
Second, you most likely absorbed many of your beliefs from the people around you. As highly social animals, we fear ostracism from the group. So, if our objective view of the situation makes us lean one way, but everyone we like or respect thinks otherwise, we can usually find a way to fit our view into theirs. For this reason, our values, beliefs and opinions become part of our identity—who we are and how we see ourselves. And identity is something we fiercely protect.
The extension of this is that we automatically mistrust anyone who is not part of our circle, who is not one of “us”. Our default reaction to any statement they make is one of disbelief, and we look for counterarguments. Or, we may simply refuse to listen.
Third, we often rely on reasoning rather than logic to defend our beliefs. What’s the difference? Logic is a tool for seeking truth. Reasoning is a process for finding support for our point of view. That difference explains why highly intelligent people often are the most articulate in defending wrong beliefs; they are good at finding clever ways to defend falsehoods.
By the way, says McRaney, that’s not a bug in our thinking. It’s a feature. Our brains evolved to make us better at argument. Argument evolved as a way for the group to combine and integrate diverse perspectives and bits of information, leading over time to better group decisions and an increased likelihood of group survival. As McRaney says, “Rather than looking for flaws in our own arguments, it’s better to let the other person find them, then adjust our arguments if necessary.” Presumably, a tribe full of good arguers will adapt better than one dominated by a single individual. Thus, human reason evolved to convince others and to be skeptical of others’ attempts to convince us.
Last but certainly not least, we are cognitive misers, which is a nice way of saying we’re mentally lazy. It can be difficult or scary to reexamine a long-held belief, or to listen carefully and follow the logic that someone spent a long time working out for themselves. It’s much easier to pluck the first objection that comes to mind and use it to defend what we already have.
These reasons all combine to dilute the strength of even the most impeccable logic and evidence. But then, you might think, at least we have to try, don’t we? Even if we are unlikely to succeed, maybe some of what we say will get through. Maybe we will be able to plant a seed that will cause them to think, and maybe if we do enough of it, diluted or not, it will at least have some effect. Maybe over time you can win them over.
But argument is like pushing a porcupine. Not only will you fail to move it, but you will pay a price. Suppose someone says something that they haven’t thought through very well. You immediately spot the weakness and supply your counterargument. Human nature being what it is, their response to that is not usually to think carefully about what you said and use it to modify their own thinking. Instead, what usually happens is that they begin to generate counterarguments. What was a half-formed thought or merely an attitude is bolstered by reasons—their reasons. Now, they feel even more secure in their position.
So, if argument doesn’t work, what does? That’s the subject of my next post.
[1] Terry R. Bacon, Elements of Influence, p. 54.