Public Policy

Twixt Duck and Rabbit: Psychological Biases and Bad Coronavirus Policy

March 17, 2020

Crises rarely see human decision-making operating at its best. Politicians and policymakers have to make important decisions in unfamiliar circumstances, with vast gaps in the available information, and all in the full glare of public scrutiny. The psychology of decision making doesn’t just tell us a lot about the potential pitfalls in our own thinking – it alerts us to ways in which some of the world’s governments may go astray.

The power of precedent

Our minds tackle the future by referring to the past. The question of what to think or do is mostly answered by asking: what do I (or other people like me) normally think and do? This tends to make all of us, politicians included, assume nothing too dramatic is happening in the early stages of an epidemic. It also encourages an initial tendency to carry on with business as usual – at least until the crisis becomes visible.

This article by Nick Chater originally appeared at The Conversation, a Social Science Space partner site, under the title “Coronavirus: are psychological biases causing politicians to make bad choices?”

We don’t like to be disturbed from our comfortable status quo, so we tend to ignore, downplay or simply fail to collect information that might conflict with this picture. Many governments initially denied the existence of COVID-19, attempted to silence those raising the alarm, or took few steps to search for cases. Many may still be downplaying the severity of the crisis.

As the crisis gets going, we search for analogies from past experience of other similar-looking crises. Perhaps COVID-19 is like seasonal flu, for which we take no drastic action. Perhaps COVID-19 is like the deadly 1918 flu pandemic, with a particularly deadly second peak. Or is it more like Sars (another coronavirus), which infected 8,000 people in 2003 before being stamped out by aggressive infection control?

The power of stories

We reason about the world by constructing narratives. And the choice of narrative will be crucial. Suppose we think we are replaying the 1918 flu pandemic. Then we may reason that resistance is futile – the only way the pandemic will burn out is through most of the population becoming infected, when we will attain so-called herd immunity. So the goal of policy is then to spread infections as evenly as possible across time.

The narrative is one of stoical fatalism – we must accept a large death toll, especially among the elderly and vulnerable, but manage it as best we can. The possible figures are sobering: if herd immunity requires 60% to 80% of the population to be infected, and assuming a very conservative death rate of 1 in 200, the death toll among the 66 million people in the UK, for example, would be about 200,000 people. If we scale up to the more than 7 billion people on the planet, the death toll would be about 20 million – and probably far higher.
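These figures follow from simple back-of-envelope arithmetic: population, times the share that must be infected to reach herd immunity, times the death rate. A minimal sketch of that calculation – the threshold, death rate and population figures are the article's illustrative assumptions, not forecasts:

```python
# Back-of-envelope death-toll estimate under the "herd immunity" narrative.
# All inputs are the article's illustrative assumptions, not forecasts.

def estimated_deaths(population, herd_immunity_threshold, infection_fatality_rate):
    """Deaths = people infected before herd immunity is reached x death rate."""
    return population * herd_immunity_threshold * infection_fatality_rate

uk_population = 66_000_000        # "the 66 million people in the UK"
world_population = 7_000_000_000  # "more than 7 billion people on the planet"
threshold = 0.60                  # lower end of the 60-80% range
fatality_rate = 1 / 200           # the article's "very conservative" death rate

print(f"UK:    {estimated_deaths(uk_population, threshold, fatality_rate):,.0f}")
print(f"World: {estimated_deaths(world_population, threshold, fatality_rate):,.0f}")
# UK:    198,000    -- roughly the article's "about 200,000"
# World: 21,000,000 -- roughly the article's "20 million", and higher at an 80% threshold
```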

If, instead, we think we are replaying the Sars outbreak, albeit with a far more infectious virus, then the narrative is very different: with suitably drastic actions (social distancing, isolation, hand-washing, intensive testing, contact tracing and more), then the infection can be beaten back. This is the narrative that has driven China and South Korea, in radically reducing their numbers of cases.

Of course, on the first narrative, this may represent only a temporary reprieve – perhaps the disease will surge again, and perhaps be even more deadly than before. Or perhaps herculean national and global efforts can nonetheless stamp it out, or more likely hold it at bay until a vaccine or cure is developed.

One-model thinking

The psychologist Philip Johnson-Laird once memorably remarked that the tendency to see only one possible model of a highly ambiguous and uncertain situation is perhaps the most pervasive and important error in human thinking. Looking at the famous duck-rabbit image, we see either a duck or a rabbit, but never both at once.

What do you see? A duck or a rabbit? (Image: Wikimedia Commons)

Similarly, it is hard to wrench ourselves from our current narrative (say, stoical delay) and switch to another (say, aggressive countermeasures). This is particularly hard for politicians and policymakers, who are often accused of inconsistency, even when reacting to changed circumstances or evidence.

Overconfidence all round

A rigid focus on our own model of the world leads all of us – citizens, scientists, governments – to be overconfident. We see the duck-rabbit as a rabbit and are astonished to hear it quack. Indeed, we may even deny that it was a quack and stick to the “rabbit” theory.

But making good decisions requires accepting that our narratives are incomplete and quite possibly plain wrong. Suppose, for example, that aggressive countermeasures can work, as in China and South Korea; if so, then many millions of lives might be saved across the world. If the measures turn out to be futile, the cost is significant and perhaps unnecessary economic disruption (though surely a far lesser evil). Whatever we think is the right story, we are almost certainly more sure than we should be, whether we are politicians, epidemiologists or concerned citizens.

The natural human tendency is, then, to ask first: what is the one true story? And second, assuming this is the right story, what is the best thing to do? For example, if we think resistance is futile, then we recommend against early, aggressive action. If we believe that people can’t spread COVID-19 while asymptomatic, we may recommend against cancelling mass events.

This is very dangerous. In extreme uncertainty, we need to take actions that are robust, that work pretty well, even if our narrative turns out to be wrong. Sometimes, just buying ourselves time may be vital, while we find out more. That would suggest clamping down on the virus as much as possible, as a precautionary first step. Perhaps, for some reason, this has its own dangers – but surely it is at least the right starting point for debate.

But when it comes to making decisions, the only real counter to our psychological biases is transparency – then we can fix the holes in each other’s thinking. Governments across the world need now, more than ever, to explain their assumptions, their plans, and what they expect may happen, however alarming. In short, governments must open their thinking for public scrutiny and critical debate – both to help make the right decisions and to get us, the people, to back them.

Nick Chater is a professor of behavioral science at the University of Warwick's business school, which he joined in 2010 after holding chairs in psychology at Warwick and UCL. He has over 200 publications, has won four national awards for psychological research, and has served as associate editor for the journals Cognitive Science, Psychological Review and Psychological Science. He was elected a Fellow of the Cognitive Science Society in 2010 and a Fellow of the British Academy in 2012. Chater is co-founder of the research consultancy Decision Technology and is on the advisory board of the Cabinet Office's Behavioural Insights Team, popularly known as the 'Nudge Unit.'
