Wednesday 19 August 2015

Philosophers suffer the same cognitive biases as everyone else

Behavioural economics is essentially founded on the principle that decision-makers are affected by a range of cognitive biases. In his book "Thinking, Fast and Slow", Daniel Kahneman distinguishes between two systems of thought: 'System 1' is fast, instinctive, emotional and subject to many of the observed cognitive biases; 'System 2' is slower, more logical and deliberative, and able to avoid at least some of the biases that System 1 is subject to. The obvious implication is that if you could train people to slow down their thinking, reason more logically, and deliberate more carefully, you could help them avoid many of the common cognitive biases.

Which brings me to this recent paper (ungated version here) by Eric Schwitzgebel (University of California at Riverside) and Fiery Cushman (Harvard). In the paper, the authors test whether academic philosophers are subject to some common cognitive biases to the same extent as similarly-educated non-philosophers. They use two well-known experiments: (1) the trolley problem (a classic ethics problem); and (2) the 'Asian disease' problem described by Kahneman and Tversky (which is explained here and here, and which I use in my ECON110 class each year).
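For readers who haven't seen the 'Asian disease' problem before, the key point is that the two frames describe statistically identical options; only the wording (lives saved versus lives lost) differs. The snippet below is a minimal sketch of that equivalence, using the standard numbers from Tversky and Kahneman's version (600 people at risk); the function and variable names are my own illustration, not anything from the paper.

```python
# Minimal illustration of the 'Asian disease' framing problem
# (standard Tversky & Kahneman numbers: 600 people at risk).
# The 'gain' frame describes lives saved; the 'loss' frame describes deaths.

POPULATION = 600

def expected_survivors(certain=None, p=None, survivors_if_success=0):
    """Expected number of survivors for a certain or a risky programme."""
    if certain is not None:
        return certain
    return p * survivors_if_success + (1 - p) * 0

# Gain frame: Program A saves 200 for sure; Program B saves all 600 with p = 1/3.
program_a = expected_survivors(certain=200)
program_b = expected_survivors(p=1/3, survivors_if_success=POPULATION)

# Loss frame: Program C -> 400 die for sure; Program D -> nobody dies with p = 1/3.
program_c = POPULATION - 400                      # survivors implied by "400 die"
program_d = expected_survivors(p=1/3, survivors_if_success=POPULATION)

print(program_a, program_b)  # 200 200.0 -> A and B are equivalent in expectation
print(program_c, program_d)  # 200 200.0 -> C and D are the same options, reworded
```

Despite the equivalence, most respondents choose the certain option in the gain frame and the risky option in the loss frame, which is exactly the framing effect the authors test for.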

Essentially, the authors were looking for two cognitive biases: (1) order effects, where the order in which scenarios are presented affects how they are evaluated; and (2) framing effects, where the way that different options are framed (or presented) affects how they are evaluated. They find:
...substantial order effects on participants’ judgments about the Switch version of trolley problem, substantial order effects on their judgments about making risky choices in loss-aversion-type scenarios, and substantial framing effects on their judgments about making risky choices in loss-aversion-type scenarios.
Moreover, we could find no level of philosophical expertise that reduced the size of the order effects or the framing effects on judgments of specific cases. Across the board, professional philosophers (94% with PhD’s) showed about the same size order and framing effects as similarly educated non-philosophers. Nor were order effects and framing effects reduced by assignment to a condition enforcing a delay before responding and encouraging participants to reflect on “different variants of the scenario or different ways of describing the case”. Nor were order effects any smaller for the majority of philosopher participants reporting antecedent familiarity with the issues. Nor were order effects any smaller for the minority of philosopher participants reporting expertise on the very issues under investigation. Nor were order effects any smaller for the minority of philosopher participants reporting that before participating in our experiment they had stable views about the issues under investigation.
In other words, even academic philosophers are subject to the same cognitive biases as non-philosophers, even when they are familiar with the problems they are being asked to evaluate. Scary stuff, particularly as the authors conclude:
Our results cast doubt on some commonsense approaches to bias reduction in scenario evaluation: training in logical reasoning, encouraging deliberative thought, exposure to information both about the specific biases in question and about the specific scenarios in which those biases manifest.
All of which suggests that nudging may be one of the few solutions to cognitive bias.

[HT: Marginal Revolution]
