I am still not comfortable at handling conflict. I’ve practised it many times now, and have most definitely improved over my adult years, but it still comes with a sense of foreboding, embarrassment, awkwardness and the shock-horror-potential of realising I might be wrong… yeurghh!
One thing I have loved, though, is being able to find a coherent and strong argument for my point of view and putting it forward, perhaps with some tempered smugness if I think I have truly nailed it. It feels good, doesn’t it? Feeling like you’ve got the stats to back your stance. I also like to think I am able to change my mind if someone comes back at me with an even more fabulously logical and robust argument that contradicts or challenges my view. I like seeing myself as someone who wants to learn and grow… even if it comes with some inevitable awkwardness from realising I was wrong. So it was with some minor discomfort that I learned about the Backfire Effect and Motivated Skepticism, and had to pause to think about whether I am truly open-minded in these instances. And whether my arguments are truly changing others’ minds too.
I only recently discovered podcasts (yes, yes, yes, I am VERY late to the podcast party, but hey – I got there, didn’t I? Let’s move on…) and this year I discovered one of my favourites so far – You Are Not So Smart (or YANSS to its fans). Anyone who knows me well wouldn’t be surprised to learn how much I love YANSS, particularly if I told you the theme of the podcast is to shine a light on our bizarre and fascinating brains, and look at the ways we delude ourselves. Three recent episodes in the series looked in depth at the Backfire Effect – which is essentially what you see when you try to change someone’s mind about something by firing facts and arguments at them, only to have them refuse to stand down and instead grow even stronger in their stance against you. As explained in the podcast – they ‘double down’ on their beliefs. This is extremely topical at present, with the rise of ‘fake news’, ‘alternative facts’, and extreme partisan beliefs and division in politics. There’s a lot of ranting and fist-waving going on, but directing it at those we disagree with is not always helping our cause.
In short, there are a number of things at play that cause this effect. And liberal left-wing types are just as prone to it as the conservative right. Nor is it necessarily to do with how much education you have – in fact, one of the findings is that having more education can make the effect worse.
When we hold strong political beliefs, we tend to integrate them into our sense of self. We see them as part of ourselves. When others argue against our beliefs and try to convince us we are wrong or misguided, we respond as if we are under threat. Our cognitive reactions to these verbal threats are similar to those we would have if we were being chased by a ravenous and terrifying large animal: Under threat. Must protect self. Defences UP! This is the first thing that happens, and it makes us less open to all sorts of stuff, but crucially – less open to really hearing and considering the argument that made us feel threatened in the first place. Reasoning with a bear that you think is trying to tear you apart is not a life-saving strategy…
The second thing that can happen is something called Motivated Reasoning and its (I like to think shady and sinister) counterpart, Motivated Skepticism. This is closely linked to Confirmation Bias – which is when we look for evidence that supports our already established views (and mentally pat ourselves on the back for finding it). Motivated Skepticism is when we view opposing facts and evidence with higher… well… skepticism. It isn’t in our interest to change our mind, so we tend to put arguments against ours through a lot more paces. We put up barriers – mentally insisting on more information, questioning the source, overly scrutinising the messenger. We don’t often realise we’re doing this, but it makes us less likely to seek out new information, and even less likely to challenge ourselves and test our own beliefs. Some open mind, hey?
Add in the layer of social media, where whole websites are using algorithms to put content we LOVE and AGREE WITH right up in our faces, and this whole process just gets more intense. We build up and reinforce our own echo chambers – both consciously and unconsciously. We hide posts from news sources we don’t trust. We stop following those in our social media circles who post outrageous views we disagree with. And so we come across counterarguments less, and for those we do encounter – we view them skeptically and dismiss them with other excuses.
Needless to say, I found this all a bit sobering. As someone learning to argue and debate more assertively, and who’s starting to enjoy the thrill of an argument more and more, it has made me question how truly open-minded I am. Made me question whether I am arguing with a view to really learn more, or instead just getting better at digging my heels in – but more eloquently.
How do we reduce the Backfire Effect? Well, there are some instances where arguing boldly is going to do more harm than good. Some arguments should just be left aside… but if you are keen to help someone see your point of view and the evidence you are presenting – then connecting with those you’re debating with and showing the similarities between you can assist, as does providing a safe environment within your conversation. A lot of this is in the nuances of how you talk, and in ensuring you come across as someone willing to change your mind too. Essentially, making yourself look more vulnerable and flexible first, so the other person doesn’t feel so easily threatened. It’s a start. And to achieve change with someone who’s really dug in, you’ll need to do this often and in small pieces at a time. Optimistically, there is some lab evidence showing that people will eventually let go of their beliefs when presented with enough counter-evidence (the quality and quantity of the evidence matter, and it must keep coming consistently), but before that happens it is likely they’ll get more dug in…
But then, aren’t you dug in too? Why else would you be trying to change their opinion? (insert awkward smug smile here).