Most academics would like to think (or hope?) that people form opinions on the basis of some form of evidence. We might disagree on what constitutes appropriate evidence, but we'd like to think that if people are presented with sufficient evidence that runs counter to their established opinions, they might change those opinions. Of course, this flies in the face of confirmation bias - the idea that people selectively interpret information, readily accepting and remembering information that confirms their pre-existing beliefs and opinions, while dismissing and quickly forgetting information that challenges those beliefs and opinions.
Although it doesn't refer to confirmation bias (at all), this article in The Conversation by Will Grant (Australian National University) shows just how pervasive confirmation bias can be. The article refers to this article, published in the journal Environmental Communication last year (sorry, I don't see an ungated version online), by Matthew Nurse (also Australian National University) and Will Grant.
Nurse and Grant asked people to solve a maths problem based on contingency tables, in order to answer a question about whether the data showed that something got better or worse. There were two contexts: (1) a new skin cream, and its effect on a rash; and (2) the closure of coal-fired power stations, and their effect on carbon dioxide emissions.
Interpreting a contingency table correctly is not easy. It requires a certain level of mathematical literacy. Here's the table that people were presented with (there were actually four versions: one for each combination of the two contexts, skin cream and power stations, and the two possible correct answers, that things got better or that things got worse):
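```
Skin cream version:
                             Rash got worse   Rash got better
Used the skin cream               223                75
Didn't use the skin cream         107                21
```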
In this table, for 223 patients who used the skin cream, the rash got worse, while for 75 patients who used the skin cream, the rash got better. So, of those who used the skin cream, the rash got better for 25.2% (75/[75+223]). For 107 patients who didn't use the skin cream, the rash got worse, while for 21 patients who didn't use the skin cream, the rash got better. So, of those who didn't use the skin cream, the rash got better for 16.4% (21/[21+107]). Since 25.2% is greater than 16.4%, the table provides evidence that the skin cream works.
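To make the arithmetic concrete, here's a minimal Python sketch (my own illustration, not from the paper) that computes the two improvement rates from the table:

```python
# Counts from the skin cream version of the table
better_with_cream, worse_with_cream = 75, 223
better_without_cream, worse_without_cream = 21, 107

# Improvement rate among patients who used the cream
rate_with = better_with_cream / (better_with_cream + worse_with_cream)
# Improvement rate among patients who didn't use the cream
rate_without = better_without_cream / (better_without_cream + worse_without_cream)

print(f"Improved with cream:    {rate_with:.1%}")    # 25.2%
print(f"Improved without cream: {rate_without:.1%}") # 16.4%

# The correct answer is whichever group has the higher improvement rate
print("Cream works" if rate_with > rate_without else "Cream doesn't work")
```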
Now, if the numbers were reversed, that should provide evidence that the skin cream does not work. Similarly, depending on which way the numbers are presented, there should be evidence either that closing power stations reduces carbon dioxide emissions, or that it doesn't.
Given that the numbers are identical in all four cases, and bearing in mind that solving this is reasonably challenging, you would expect that a similar percentage of people would get the correct answer no matter which version they were presented with. Unfortunately, that wasn't the case.
Nurse and Grant tested this on 504 Australians, half of whom were supporters of the Australian Greens (ideologically far left), and half of whom were supporters of the One Nation Party (ideologically far right). It turns out that political views affected the proportion who got the answer correct, but only for the climate change context. Here's Figure 2 from the paper:
In each panel, the two bars on the left show the proportion who got the answer correct, and the two bars on the right show the proportion who got the answer wrong. The green bars are supporters of the Australian Greens, and the yellow bars are supporters of the One Nation Party. The top two panels are the skin cream context, and you can see that (especially for the right panel) the proportion getting the answer correct doesn't appear to depend on political affiliation. The bottom two panels are the climate change context, and they show that supporters of the Greens are much more likely to get the answer correct when the correct answer is that carbon dioxide emissions decrease when power stations are closed, while supporters of One Nation are much more likely to get the answer correct when the correct answer is that carbon dioxide emissions increase when power stations are closed.
That isn't the end of the story though. Nurse and Grant calculated the difference in the odds of getting the correct answer between the two ideologies, for different levels of numeracy. You might expect that more numerate people would be less likely to be swayed by their political ideology. However, the opposite appeared to be the case: the ideological gap was larger at higher levels of numeracy. For instance, they report that:
...a One Nation supporter with a numeracy score of three in the identity threatening “CO2 does decrease” condition was 26 per cent as likely to respond with the correct answer (odds ratio 0.26, P < .01) compared to a Greens supporter in the same numeracy category. However, in this condition, a One Nation supporter with a numeracy score of seven was only 5 per cent as likely to provide the correct answer as a Greens supporter in the same numeracy category (odds ratio 0.05, P < .01).
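To unpack those odds ratios, here's a small Python sketch (my own illustration; the 50% baseline probability for Greens supporters is hypothetical, since the paper reports odds ratios rather than the underlying probabilities):

```python
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Convert a baseline probability into the probability implied by an odds ratio."""
    odds = (p_baseline / (1 - p_baseline)) * odds_ratio
    return odds / (1 + odds)

p_greens = 0.5  # hypothetical probability that a Greens supporter answers correctly

# Odds ratios reported in the paper for the "CO2 does decrease" condition
for numeracy, odds_ratio in [(3, 0.26), (7, 0.05)]:
    p_one_nation = apply_odds_ratio(p_greens, odds_ratio)
    print(f"Numeracy {numeracy}: odds ratio {odds_ratio} -> "
          f"implied One Nation probability {p_one_nation:.1%}")
```

On that hypothetical 50% baseline, the implied probability of a correct answer from a One Nation supporter falls from about 21% at a numeracy score of three to under 5% at a score of seven. The gap widens as numeracy increases.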
Nurse and Grant argue that this represents 'motivated reasoning'. From The Conversation article:
These findings build on the theory that your desire to give an answer in line with your pre-existing beliefs on climate change can be stronger than your ability or desire to give the right answer.
In fact, more numerate people may be better at doing this because they have more skills to rationalise their own beliefs in the face of contradictory evidence.
This paper provides some discouraging news for those of us who hope that we can convince people with evidence. However, it isn't usually the average voter that we are trying to convince; rather it is policy makers, business people, etc. It would be interesting to see whether this study would replicate among those groups.