Evidence-Based Policy: Do Knowledge Brokers Help?

July 5, 2018

If you think about it, there have been knowledge brokers of one kind or another working successfully for some time.

There’s widespread and sustained interest in the role of evidence in policymaking. But because policymaking is inherently messy and complex, there’s no catch-all way of making sure evidence gets used. In this context, “knowledge brokers” are increasingly being recognized as a potential way to improve evidence-informed policymaking.

Knowledge brokers are individuals or organisations that bridge the gap between academic research and policymaking. They work to make sure that useful evidence arrives with the right people, in an appropriate format, at an opportune moment. Successful knowledge brokerage is based on building trusting relationships. This requires an intimate knowledge of both academia and policymaking, including their respective values, norms, and incentives. There’s a limited evidence base about knowledge brokers, but preliminary findings suggest that they do have the potential to improve the uptake of evidence.

This article by Sarah Quarmby originally appeared on LSE British Politics and Policy and the LSE Impact of Social Sciences blog as “Evidence-informed policymaking: does knowledge brokering work?” and is reposted under the Creative Commons license (CC BY 3.0).

How does knowledge brokerage work in practice?
For the last six months, I have been working as a research assistant at the Wales Centre for Public Policy, an independent research centre based at Cardiff University. The centre opened in October 2017, building and expanding upon the work of its predecessor, the Public Policy Institute for Wales, and is a member of the UK-wide network of What Works Centres. We work closely with Welsh Government Ministers and public service leaders to help them identify their evidence needs and then facilitate the provision of evidence. In practice, this means guiding a series of projects, each relating to different policy or public service topics, from the initial ideas stage to delivering a final product.

To take the example of the Welsh Government side of our work, projects usually begin by meeting up with ministers, their special advisers and/or policy officials (under the auspices of the Cabinet Office) to discuss potential areas of work. When we have agreed the kind of evidence they would find useful, we conduct a short review into what’s already known about the topic. From here we can decide whether to do the research in-house, or to commission it out to an external expert. If we’re outsourcing the project, we identify the most appropriate experts and liaise with them to see if they would be interested in working with us.

Each project is different, but our work often involves facilitating and managing relationships between the experts and the Welsh Government, as well as ensuring effective communication so that the final product meets expectations. The form that the evidence takes depends on the specifics of each project. It may be a report, an event or workshop, or simply a series of structured conversations between the expert and the Welsh Government.

In this way, we navigate the space between academic researchers and policymakers, who have long been thought of as separate communities. Nathan Caplan’s “Two-Communities” theory is still a useful tool for thinking about how to bridge the gap between academic research and policymaking. He suggests that the research and policymaking worlds operate according to such different value systems and timescales it is as though they were speaking different languages. Policymakers face political pressures and public scrutiny, and are looking for timely, practical input into policy matters, whereas academics are more interested in longer-term, theory-driven research and are under pressure to publish in academic journals. Caplan pointed to the need for intermediaries who are sympathetic towards both cultures and can mediate to best effect.

Why it’s important
Our work at the Centre puts into practice some of the latest research on encouraging evidence use in policymaking. A recent study from the Alliance for Useful Evidence looked at what can be done to put policymakers in a position where they are both able and motivated to make use of evidence, and identified six “mechanisms” to improve evidence uptake. Our approach focuses on four of these: fostering mutual understanding of evidence needs and policy questions; facilitating communication and access to evidence; facilitating interaction between decision-makers and researchers; and building the skillset required to engage with research.


Drawn from

This blog post draws on the presentation given by James Downe, Steve Martin, and Sarah Quarmby at the International Research Symposium on Public Management at the University of Edinburgh.


Indeed, the way we operate is informed by a wide range of academic literature on policymaking. For example, we’re currently exploring alternative approaches to presenting evidence. Research suggests that policymakers often respond to narratives and case studies which show how policies affect individuals’ everyday lives, and advocates of this line of thinking claim that evidence presented in this way is far more likely to be used. The problem is that this approach has implications for academic neutrality, and we don’t want to risk compromising the centre’s impartial status.

Knowledge brokerage is a work in progress, but research suggests that it’s important that the early trial-and-error approach to facilitating the use of evidence in policymaking doesn’t turn into an unsuitable longer-term strategy. For this reason, we’re refining our “theory of change”; i.e. what we want to make happen, how we’re going to do it, and how we’re going to measure whether it’s been done. Further down the line this will allow us to assess whether we have been effective, and what approaches have worked better or worse than others. We know our model works in our context and have a lot of examples from our work with the Welsh Government to support this. But the challenge is to systematise ways of working and collect clear examples of where and why we have been able to have “real-world” impact.


Sarah Quarmby is a research assistant at the Wales Centre for Public Policy.
