The Conversation had an interesting Climate Explained article a few weeks ago by Peter Ellerton (University of Queensland), which answered the question:
Why do humans instinctively reject evidence contrary to their beliefs?
The question was, of course, asked in the context of climate change denial. Ellerton's response included this:
We understand the world and our role in it by creating narratives that have explanatory power, make sense of the complexity of our lives and give us a sense of purpose and place.
These narratives can be political, social, religious, scientific or cultural and help define our sense of identity and belonging. Ultimately, they connect our experiences together and help us find coherence and meaning.
Narratives are not trivial things to mess with. They help us form stable cognitive and emotional patterns that are resistant to change and potentially antagonistic to agents of change (such as people trying to make us change our mind about something we believe).
If new information threatens the coherence of our belief set, if we cannot assimilate it into our existing beliefs without creating cognitive or emotional turbulence, then we might look for reasons to minimise or dismiss it.
I like the framing of understanding the world through the narratives we tell ourselves. However, Ellerton could easily have gone a bit further in his explanation, linking the unwillingness to accept new information that threatens our narrative to the behavioural economics concepts of the endowment effect and loss aversion, as I did in this 2018 post:
People are loss averse. We value losses much more than equivalent gains (in other words, we want to avoid losses much more than we want to capture equivalent gains). Loss aversion makes people subject to the endowment effect - we are unwilling to give up something that we already have, because doing so would entail a loss (and we are loss averse). Or at least, there would have to be a big offsetting gain to convince us to give up something we already have. The endowment effect applies to objects (the original Kahneman, Knetsch, and Thaler experiment that demonstrated the endowment effect gave people coffee mugs), but it also applies to ideas.
I've thought for a long time that ideology is simply an extreme example of the endowment effect and loss aversion in practice. Haven't you ever wondered why it's so difficult to convince some people of the rightness of your way of thinking? It's because, in order to agree with you, the other person would have to give up their own way of thinking, and that would be a loss (and they are loss averse). It seems unlikely that, at least for some people, the benefits of agreeing with you are enough to offset the loss they would feel from giving up their prior beliefs. Once you consider loss aversion, it's easy to see how ideologies become entrenched. An ideology is simply lots of people suffering from loss aversion and the endowment effect.
Climate change denial is a good example of an ideological viewpoint. People are endowed with a particular view about climate change, and they are unwilling to give that view up, because doing so would involve a loss (the loss of one of their beliefs). So, being loss averse, people are reluctant to adjust their internal narratives about climate change, even in the face of overwhelming evidence.
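The asymmetry that loss aversion describes can be made concrete with a small numerical sketch. The Python snippet below uses the prospect-theory value function of Tversky and Kahneman, with their published median parameter estimates; the function name and the dollar amounts are my own illustrative choices, not anything from the post above.

```python
# Illustrative sketch of loss aversion, using the prospect-theory value
# function v(x) = x^alpha for gains and v(x) = -lambda * (-x)^beta for
# losses. Parameter values (alpha = beta = 0.88, lambda = 2.25) are the
# median estimates reported by Tversky and Kahneman (1992).

def subjective_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a monetary gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha           # gains are valued concavely
    return -lam * (-x) ** beta      # losses loom larger (lam > 1)

gain = subjective_value(100)        # subjective value of gaining $100
loss = subjective_value(-100)       # subjective value of losing $100

# With these parameters, a $100 loss is felt about 2.25 times as
# strongly as a $100 gain - which is why giving up something we
# already hold (an object, or a belief) feels so costly.
print(abs(loss) / gain)
```

With a ratio above 2, an offsetting gain would need to be more than twice as large as the loss before the trade feels worthwhile, which is the intuition behind beliefs becoming entrenched.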
Read more: