
Is government biased against using evidence in its decision-making?

How behavioural science can help explain the biases and mental shortcuts that make evidence-based policy difficult


I’d like to start by comparing government decision-making to Mr. Spock and Cosmo Kramer. I’ve given many talks on behavioural science, and one analogy I use compares the two. The analogy describes Dual Process Theory, a key principle on which behavioural science is based. It states that our behaviour is guided by two systems – ‘creatively’ titled System 1 and System 2. Kramer represents our ‘System 1’ – the part of our thought processes that is reactive, emotional, and easy to influence. Mr. Spock represents our ‘System 2’ – the part that is logical, rational, and deliberate.

We believe we predominantly use System 2 in our decision-making, but behavioural scientists have demonstrated that we rely much more on System 1 – and that we aren’t even aware of this reliance. Humans rely on the automatic System 1 to minimize the cognitive effort associated with the trivial and routine things we do (which is a good thing). But these automatic responses also make us susceptible to seemingly irrational biases and errors in decision-making (which can be a bad thing).

In previous blogs, I pointed to structural issues within government and academic institutions that make evidence-based policy difficult. But because humans are the ones operating these systems, perhaps addressing structural issues will only be so useful – we all have a lot of Kramer in us that makes us susceptible to biased decision-making. This irrationality will remain, regardless of what structure we work within.

Even if we redesigned these systems, can we expect government officials to do the rational thing and regularly use evidence in their policy decisions when they (like all of us) are human? We know that humans don’t always do the rational thing, regardless of their stature, context, or intelligence.

While I do believe there are structural things that could be changed, I am beginning to think a better approach might be to target the inherent biases and shortcuts that lead government officials away from using evidence in their policy decisions. Here are some examples of how common behavioural shortcuts impact the use of facts and evidence in policy-making.

Choice overload – humans have a finite amount of cognitive bandwidth, meaning that we can only process so much information at one time. When we are presented with too many choices, we either make suboptimal decisions or do not make a decision at all. With the constant barrage of competing evidence and opinions from studies and stakeholders, senior government officials are presented with too many choices around the use of evidence. As a result, policy-makers are more likely either to skip determining the best evidence to use, or to simply rely on intuition and not use evidence at all.

Present bias – humans place significantly more value on ‘things’ in the here-and-now at the expense of ‘things’ in the future (even the near future). This behavioural characteristic is a key reason that evidence isn’t used. Many evidence-based policy decisions involve ‘short-term pain for long-term gain’, which tends not to be popular with the citizenry and, as a result, with politicians. Let’s consider climate policy and transportation as an example.

The evidence concerning global warming and the actions we need to take is quite clear. But public transit can be a pain and electric vehicles are expensive, so people purchase and regularly drive cars that burn gasoline. Present bias causes us to value minor benefits in the now (e.g. convenience, cost-effectiveness) more than the substantially bigger rewards in the future (e.g. staving off a global catastrophe), because we discount future rewards – increasingly so the further away they are. We have evidence pointing to the impact of greenhouse gases, yet few governments have been able to act on it effectively.
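For readers who like to see the mechanics, here is a minimal numerical sketch of present bias using the quasi-hyperbolic (‘beta-delta’) discounting model that behavioural economists often use. The parameter values and dollar figures are illustrative assumptions I’ve chosen to show the pattern, not estimates from any particular study.

```python
# Illustrative sketch of present bias using quasi-hyperbolic (beta-delta) discounting.
# The parameters and payoffs below are made-up numbers chosen only to show the pattern.

def discounted_value(reward, years_away, beta=0.7, delta=0.97):
    """Perceived value today of a reward received `years_away` years from now.

    beta  < 1 applies an extra penalty to anything that is not immediate (present bias).
    delta < 1 is the ordinary year-over-year discount factor.
    """
    if years_away == 0:
        return reward
    return beta * (delta ** years_away) * reward

# A small convenience enjoyed today vs. a much larger climate benefit decades away.
convenience_today = discounted_value(100, years_away=0)
climate_benefit_2050 = discounted_value(5_000, years_away=25)

print(f"Value placed on $100 of convenience today:     {convenience_today:,.0f}")
print(f"Value placed on $5,000 of benefit in 25 years: {climate_benefit_2050:,.0f}")
# With these assumed parameters, the far-larger future benefit 'feels' worth
# roughly $1,600 today, which helps explain why it loses out to small immediate comforts.
```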

Confirmation bias – we all look for and interpret information in a way that confirms our own opinions. It is difficult for humans to change their minds once they’ve made a decision. So, if a decision-maker in government has decided on a policy direction, it can be difficult to sway that person even with strong evidence to counter it. ‘Policy-based evidence making’ is a term commonly used to refer to this phenomenon – “we know this is the right answer, so let’s figure out how to craft an argument to sell it to others.”

Loss aversion – people dislike losses much more than they like gains of an equal amount. Humans do not like the thought of losing things. From a government perspective, those who have responsibility for a program are very hesitant to see it cut, even when there is no evidence that it is driving a positive impact. Citizens are just as resistant to losing government supports, even if they’ll be replaced with something better. This resistance makes it very difficult to use evidence to eliminate programs and redirect resources to more effective interventions.
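Again, a rough sketch can make this concrete. The snippet below uses the prospect-theory value function with the often-quoted Tversky and Kahneman (1992) parameter estimates; treat the exact numbers as illustrative rather than definitive.

```python
# Rough sketch of loss aversion using the prospect-theory value function:
# v(x) = x**alpha for gains, -lam * (-x)**alpha for losses.
# alpha ~ 0.88 and lam ~ 2.25 are the commonly cited Tversky & Kahneman (1992) estimates.

def subjective_value(x, alpha=0.88, lam=2.25):
    """Perceived value of gaining (x > 0) or losing (x < 0) an amount x."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = subjective_value(100)    # how good a $100 gain feels
loss = subjective_value(-100)   # how bad a $100 loss feels

print(f"Felt value of gaining $100: {gain:+.1f}")
print(f"Felt value of losing  $100: {loss:+.1f}")
# The loss looms roughly twice as large as the equivalent gain, which is why cutting
# an existing program meets far more resistance than a comparable new benefit generates enthusiasm.
```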

Availability bias – the human brain mistakes the ease with which it recalls something that happened for the likelihood of that thing happening again. Availability bias ends up shaping public sentiment, which in turn drives government to create policy to deal with unlikely events. This can be observed after a crisis, when public outcry leads to additional government regulation. Policy should not be driven by what people are (irrationally) worried about; public sentiment always needs to be considered in good policy-making, but those perspectives should be weighed against what the evidence points to.

The true power of behavioural science is not only to understand decision-making and biases, but to design responses. Implementing interventions that ‘nudge’ people past these biases allows them to behave in ways that they themselves prefer and that are in their best interest. There are hundreds of examples where this has been applied successfully in public policy, in areas such as taxation, public health, voter turnout, income security, education, and the environment (for those of you interested in more on this subject, I’d suggest starting with the book Nudge).

This raises the question – if we can influence citizens to make decisions beneficial to them, could we also influence policy-makers to use evidence and therefore make decisions that are more beneficial to citizens?

I believe it is in everyone’s best interest for policy to be more evidence-based. This is why I’ve begun research into behavioural interventions that will support those in government who are interested in using more evidence in decision-making.

Reach out here if you’d like to learn more about this research program.

Be sure to sign up here to receive our next blogs and whitepapers as soon as they go live.


About the Author

Mike Davis is CEO at Davis Pier, advising on social innovation, government transformation, and evidence-based policy.

 
