The UK’s recent referendum on European Union membership seems to have exposed deep fault lines in British society, if the strength of feeling shown in arguments on social media is anything to go by. I’ve certainly seen the normal rules of civilised conversation set aside: people on either side calling each other terrible things, and huge amounts of anger and discourtesy.
What I haven’t seen is anyone saying “Yes, I appreciate your argument even though I previously disagreed with it – I will re-examine my opinion.”
This tends to baffle ‘Remain’ voters, who can’t understand why ‘Leave’ partisans aren’t changing their minds even though the claims made by the ‘Leave’ campaign about immigration and finances (on which they presumably based their decision) are rapidly being rowed back from or disavowed by the very politicians who were trumpeting them during the campaign. Why aren’t the ‘Leave’ enthusiasts changing their minds in the light of the new information that’s coming in daily?
Here’s why: facts, and reasoned argument, don’t change minds once those minds are made up. Although we can sometimes, with effort and awareness, overcome this tendency in ourselves, it applies to all of us. There are at least three ‘cognitive biases’ identified by psychologists that explain why:
1. Confirmation Bias
Once we believe something, we pay more attention to and amplify any evidence that confirms our belief, and play down or ignore any evidence that refutes it. We like to be right about things, so we seek out the evidence that supports our feeling of being right.
Why does this happen? We’re pattern-seeking organisms, always looking for connections that allow us to put together a coherent picture from the jumble of bits of information that our senses take in. And if there’s no pattern out there in the data, our minds will supply one.
As a former hypnotherapist, I’m also aware of the recognised hypnotic phenomenon (or ‘sign of trance’) known as hallucination. When in a trance, hypnotic subjects sometimes see things that aren’t there (‘positive hallucination’) or don’t notice things that are there (‘negative hallucination’). The same thing happens with evidence: we ‘see’ support for our beliefs that isn’t really there, and fail to notice disconfirming evidence that is.
You might say “Hang on – people aren’t usually in trances!” Or are they? If we define trance as a state in which you pay more attention to some things than others, when are we not in a trance? The great hypnotherapist Milton Erickson taught us that trance is a normal, everyday experience. Stephen Wolinsky’s book Trances People Live goes further, suggesting that we are in one kind of trance or another most of the time. If you doubt this, remember the difficulty you have getting someone’s attention when they are looking at their phone.
So once they’ve taken a position on a political question, people will filter out evidence that contradicts that position. If it looks like ‘Brexiteers’ and ‘Remainers’ (or ‘liberals’ and ‘conservatives’ in the USA) are seeing different worlds, it’s because – subjectively – they are.
Practical implication: most of the facts and evidence you give people to argue them out of their views will be screened out, or people will draw different conclusions from them than you do.
2. Commitment and Consistency
This is a principle identified by the leading researcher into the psychology of influence, Dr Robert Cialdini.
“Once we have made a choice or taken a stand, we will encounter personal and interpersonal pressures to behave consistently with that commitment” – from his classic book Influence: Science and Practice
When someone has identified themselves as a ‘Leave’ voter (for example), they will feel an internal pressure to make their future actions conform with that identity. We like to be consistent – inconsistency is seen as an undesirable trait, associated with weakness, lying, and dishonesty.
Commitments are particularly powerful if they are made in public (e.g. the person changing their Facebook profile picture to an “I’m voting Leave” graphic, or arguing in favour of their choice in a discussion).
In line with confirmation bias, once the person has self-identified as a ‘Leaver’ and made a commitment to it, they will tend to screen out or reframe any evidence that suggests they were mistaken.
Practical implications: to change someone’s mind, you will be more likely to succeed if you go about it indirectly, perhaps eliciting a commitment from the person to some other value or identity of theirs that could lead them eventually to a different conclusion, but that isn’t in direct and obvious opposition to their original commitment.
Also, to stop the commitments you’ve made from blinding you to relevant new information, reframe ‘errors’ as ‘surprises’. No-one likes to think that they’ve made a mistake, and your mind will jump through a lot of hoops to avoid that feeling.
However, if reality doesn’t turn out the way your beliefs led you to expect, you’ll feel surprised. This just means there’s new information coming in. If you reframe that as a surprise rather than an error, you will be more likely to learn from it and update your worldview. For best results, keep a ‘surprise journal’. More about this, with some research backup, here.
3. The Backfire Effect
If you hold an opinion strongly, and someone presents you with facts that contradict that opinion, your belief in the validity of your opinion will actually be strengthened.
Why? It’s a combination of confirmation bias – you will ignore or reframe or actively look for flaws in the disconfirming evidence – and consistency. Once you’ve declared for a position, you will go to any lengths to keep hold of it.
If news media in pursuit of ‘balance’ present two opposing points of view, you will ignore the one that you don’t like, even if it contains valid evidence. And if you hear just the opposing point of view, with facts supporting it, to preserve consistency you will probably construct a counter-argument in your mind that may lead you to a more extreme position than you held before.
Practical implications: you can never win an argument online! The more facts and links you post, the more you are strengthening your ‘opponent’ in their point of view.
Also, remember that the backfire effect is working on you as well! So the evidence presented by your ‘opponent’ is probably making your position more rigid. Don’t get sucked into that.
So – don’t expect someone to change their mind in the heat of the moment. Maybe some of the evidence you’ve presented will get through to them after they’ve had time to calm down and reflect on it.
Most importantly, don’t directly contradict people even if you ‘know’ they are wrong (and given that no-one has a monopoly on the truth, and that people do change their minds all the time when left to themselves, don’t assume that you have arrived at absolute truth yourself).
Instead, present ‘positive’ examples that don’t directly contradict your ‘opponent’ but nevertheless may lead them to different conclusions. These are more likely to work if they appeal to other strongly held values they may have. As they consider these examples, they may begin to modify their viewpoint.
Finally, go for that rarest of things – a conversation in which you are open to learning. Put yourself in the mind of the other person, look at things from their point of view, and as Stephen Covey says, “Seek first to understand, then to be understood.” Maybe you’ll learn something that makes you update your point of view. At the very least, you’ll be in a better position to understand what reasoning might lead them to soften their position and become more open.
One reader responded: “This is an interesting article and makes some valid points. But if that were true all the time, why do propaganda and false information work in manipulating people, while citing real facts does not?”
I believe this happens because people accept false information when it feels like it confirms their existing beliefs. If they agree with it, and they don’t know about confirmation bias, they aren’t strongly motivated to check whether it’s true.