The Risks Of Using Research-Based Evidence In Policymaking

December 6, 2023

Citing research-based evidence in policymaking more often is a no-brainer. After all, it stands to reason that decisions affecting the public should be well reasoned, using the latest objective information available. However, as research-based evidence appears more and more often in policy, we should acknowledge the risk that the research or ‘evidence’ used is unsuitable, or can be inadvertently misused, for a variety of reasons.

The use of published papers which have since been disproven or otherwise discredited

In 2010, a paper by Reinhart and Rogoff entitled “Growth in a Time of Debt” was published in the American Economic Review. Its findings were used to shape one of the highest profile economic policy questions of the time: whether a cut in public spending should be used to offset debt, or whether pump-priming an economic resurgence using the state would be more economically sensible. The paper went on to shape policy on austerity cuts in the US and the UK. The Overton database shows that, to date, it has been cited 320 times in policy documents from 94 sources across 23 countries.

Unfortunately, the critical findings of the paper were disproven by a 2013 paper from the University of Massachusetts Amherst, and the original work has since been widely discredited. What’s interesting about this particular incident is that the debunking and ensuing debate were quite high profile at the time, even outside academic circles, so we were surprised to see the paper still being cited in policy publications almost a decade later – as evidenced by citation activity in the Overton database, despite the release of the Amherst paper in 2013.

Figure 1: Citations by policy documents of “Growth in a Time of Debt” by Reinhart and Rogoff

One might argue that, on the face of it, a mere citation doesn’t say too much – for all we know, post-2013 publications may be disagreeing with or revising previous policy stances. Using the Overton database, we can check whether this is the case. To do this, we can search by DOI to find citations of the paper within policy documents, and then see the context of the mention within the text of the policy document, as shown in Figure 2 (a sketch of how such a query might be scripted follows the figure). This allows us to quickly identify whether the paper is being cited as a credible source of information post-critique, or in the knowledge that it has been debunked.

Figure 2: Search by DOI of the original paper reveals some policy documents which have cited the paper
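
For the technically minded, a lookup like this could be scripted. Below is a minimal Python sketch: the endpoint, query parameters and response fields are assumptions for illustration rather than Overton’s documented API, and the API key is a placeholder.

```python
import requests

# Assumed endpoint and parameter names, for illustration only;
# Overton's real API may differ.
API_URL = "https://app.overton.io/documents.php"
API_KEY = "YOUR_API_KEY"  # placeholder
RR_DOI = "10.1257/aer.100.2.573"  # "Growth in a Time of Debt" (2010)

def policy_documents_citing(doi: str) -> list[dict]:
    """Fetch policy documents that cite the given scholarly DOI."""
    resp = requests.get(
        API_URL,
        params={"api_key": API_KEY, "cited_doi": doi, "format": "json"},
        timeout=30,
    )
    resp.raise_for_status()
    # Assumed response shape: {"results": [{...document...}, ...]}
    return resp.json().get("results", [])

if __name__ == "__main__":
    for doc in policy_documents_citing(RR_DOI):
        # Field names ("published_on", "source", "title") are guesses.
        print(doc.get("published_on"), "|", doc.get("source"), "|", doc.get("title"))
```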

A different approach might be to look for policy documents which cite both the discredited paper and the Amherst paper – co-citation might indicate that a policy document’s authors are aware of the caveats of the 2010 paper and are revising their decisions or understanding of economic policy in light of the updated findings presented in the 2013 paper. To do this, we can look at the list of all scholarly articles a document cites to see whether both appear – or any other paper which contradicts Reinhart and Rogoff! Let’s take the top example from Figure 2: a policy document entitled “A Low Carbon Future for the Middle East and Central Asia: What are the Options?” by the IMF. We can see that it cites only the 2010 paper, with nothing amongst the papers it mentions that disproves it. This function also allows us to see the context in which the paper has been referenced in the policy document (a sketch of the co-citation check follows Figure 3).


Figure 3: A list of some journal articles and papers cited by a policy document entitled “A Low Carbon Future for the Middle East and Central Asia: What are the Options?” by the IMF. 
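
That co-citation check might look like the following, reusing policy_documents_citing() from the earlier snippet. It assumes each returned document record carries a “cited_dois” list of the scholarly works it references; the real field name, and the DOI shown for the 2013 critique, would need verifying before use.

```python
RR_DOI = "10.1257/aer.100.2.573"     # Reinhart & Rogoff (2010)
CRITIQUE_DOI = "10.1093/cje/bet075"  # Herndon, Ash & Pollin critique

def also_cites(doc: dict, other_doi: str) -> bool:
    """True if this policy document's reference list includes other_doi.

    Assumes each document record exposes a "cited_dois" list; the real
    field name may differ.
    """
    cited = {d.lower() for d in doc.get("cited_dois", [])}
    return other_doi.lower() in cited

docs = policy_documents_citing(RR_DOI)
aware = [d for d in docs if also_cites(d, CRITIQUE_DOI)]
unaware = [d for d in docs if not also_cites(d, CRITIQUE_DOI)]
print(f"{len(aware)} citing documents also cite the critique; "
      f"{len(unaware)} cite the 2010 paper alone.")
```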

The use of pre-published research

This article by Kat Hart originally appeared in the Overton Blog under the title “Research Cited By Policymakers: The Good, The Bad And The Speedy.” Overton is a searchable index of policy documents and guidelines that allows users to see where their work is cited and mentioned.

It is well established that one of the challenges of academic-policy engagement is the difference in pace: the ‘policy hemisphere’ often needs to respond quickly and decisively, whilst the ‘research hemisphere’ takes much longer to a) do the research and b) publish the results. There has been much discussion around the use of emerging research before it has been ‘officially’ published. In the event of, let’s say, a rapidly advancing pandemic, policymakers need to be able to source information on a topic for which nothing has yet been published because, well, it’s new to everyone.

This has advantages and disadvantages, as discussed in a new report from funders and publishers on the effects of open sharing commitments during the COVID-19 pandemic, which cites Overton’s data. Pre-prints can be valuable in ensuring rapid communication and triggering early, transparent discussion, but relying on them can make it difficult to ‘unlearn’ what we thought we knew if a finding is later rejected in peer review. Academics, who are used to weighing up evidence from different sources and understand the people, politics and publication venues involved, as well as the ins and outs of the peer review process, may account for these things automatically when they read a preprint – but policymakers don’t necessarily, and neither do members of the public looking for the evidence later.

Given that we know pre-prints can be of enormous value, and that peer review itself doesn’t always prevent inaccuracies from entering the scientific literature, we just need to find a way of integrating them responsibly into policy. One way forward might be to settle on a standard way of flagging, in citations, research that isn’t yet part of the “official” scholarly record. You could also imagine regularly going back over the citations of policy documents, as we did above, looking for preprints that turned out to be completely wrong – though it’s not clear what you’d do about them when you found them.
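
One building block for that kind of flagging already exists: for DOIs registered with Crossref, the public REST API reports whether a work is “posted content” (the type Crossref uses for preprints) and, where publishers have deposited the link, which peer-reviewed article it became. A minimal sketch:

```python
import requests

CROSSREF = "https://api.crossref.org/works/"

def crossref_record(doi: str) -> dict:
    """Fetch a work's metadata from the public Crossref API."""
    resp = requests.get(CROSSREF + doi, timeout=30)
    resp.raise_for_status()
    return resp.json()["message"]

def is_preprint(doi: str) -> bool:
    """Crossref registers preprints with the type "posted-content".

    Caveat: this only covers Crossref DOIs; preprints with DataCite
    DOIs (e.g. some arXiv content) would need a separate check.
    """
    return crossref_record(doi).get("type") == "posted-content"

def published_version_of(doi: str) -> str | None:
    """Return the DOI of the peer-reviewed version, if one is linked."""
    links = crossref_record(doi).get("relation", {}).get("is-preprint-of", [])
    return links[0].get("id") if links else None
```

Scanning a policy document’s reference list with a check like this would at least make the preprint citations visible; deciding what to do about superseded ones is, as noted above, a harder question.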

Rapid evidence reviews – a silver bullet?

As the sector looks for new mechanisms to improve the efficiency of academic-policy engagement, there has been quite a bit of interest in rapid evidence reviews. These offer an expedited version of a full systematic review, taking shortcuts where deemed appropriate, which can be helpful in an environment where policy professionals often need to respond quickly and effectively. The difficulty lies in how best to use those shortcuts whilst balancing thoroughness against speed, minimizing bias and optimizing transparency.

Shortcuts typically take the form of reducing the scope of the search – by source, year(s) published or geography. However, narrowing the range can exclude grey literature such as think tank reports, or lose crucial information simply because it wasn’t published in the right year, and there is as yet no shared model of best practice for how such shortcuts should be applied – ergo, they need to be chosen carefully to ensure that the research used is the ‘right’ research (the sketch below makes this concrete). We hope that our data can assist here by pulling everything into one searchable library – especially when looking for research which has already been used within other policy documents.
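
To make the trade-off concrete, here is an illustrative sketch of scope shortcuts expressed as filters over a searchable evidence library. The record structure is invented for the example; the point is that each narrowing step is explicit, and therefore reportable.

```python
from dataclasses import dataclass

@dataclass
class Record:
    """A simplified entry in an evidence library (fields invented)."""
    title: str
    year: int
    source_type: str  # e.g. "journal", "preprint", "think_tank_report"
    region: str

def rapid_scope(records: list[Record],
                since: int | None = None,
                source_types: set[str] | None = None,
                regions: set[str] | None = None) -> list[Record]:
    """Apply rapid-review shortcuts as explicit, reportable filters."""
    out = records
    if since is not None:
        # Speed gain, but risks omitting crucial older work.
        out = [r for r in out if r.year >= since]
    if source_types is not None:
        # Excluding e.g. "think_tank_report" drops grey literature.
        out = [r for r in out if r.source_type in source_types]
    if regions is not None:
        out = [r for r in out if r.region in regions]
    return out
```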

This is, of course, not an exhaustive list of the issues that can arise when using research-based evidence in policy, but we hope this piece has shown how a dataset like the one we have here at Overton can reveal insights about such papers – by tracking where they have (or haven’t) been cited.


Kat Hart is an analyst at Overton, where she helps find stories in data. Kat's background is in higher education management, and she is working towards a PhD at the University of Nottingham on river management policy.
