Interview

Jonathan Breckon On Knowledge Brokerage and Influencing Policy

(Photo: Photo Mix/Pixabay)

December 6, 2023

Overton spoke with Jonathan Breckon to learn about knowledge brokerage, influencing policy and the potential for technology and data to streamline the research-policy interface.

Jonathan has over 20 years’ experience as a knowledge broker, bridging research with policy and practice. He previously led the Alliance for Useful Evidence for nine years, championing the smarter use of evidence in government, NGOs and frontline practice. While at the Alliance, he created a training program for the UK Policy Profession, co-founded the Evidence Quarter in Whitehall, London, and was a founding board member of What Works Children’s Social Care.

Jonathan was also appointed a Policy Fellow with the ‘Capabilities in Academic Policy Engagement (CAPE)’ project within the Open Innovation Team, running rapid reviews and deep dives of research and expertise for policymakers. For light relief, he is part way through a part-time PhD on evidence-based practice and the professions. He is a Senior Associate at Transforming Evidence, Visiting Fellow at Campbell Collaboration UK and Ireland and a Fellow of the Academy of Social Sciences.

The influence framework

Overton: You’ve worked a lot with policy stakeholders, encouraging them to use research and helping them understand how to access good evidence. Have you learned any lessons from this that might be useful to researchers wanting to influence policy?

Jonathan: The main heuristic I’ve taken from this work is the COM-B framework, devised by the psychologist Susan Michie. It’s a framework for thinking about behavior change (the B in the acronym) that I find really valuable in the context of influencing policymaking.

This framework prioritizes three elements that you need together to influence policymakers. None of this is rocket science, but people tend to focus on only some of them, whereas it’s important to do all three in concert.

So, the C is for capability. This means asking whether your audience has the skills, knowledge and confidence to understand your research. We often take it for granted that these educated, influential people will get it. And sometimes they will! But often there’ll be a nuance that’s difficult for non-experts to grasp. So, an understanding of your target’s capability is crucial.

O is for opportunity. It’s perhaps obvious to point out that there are specific windows of opportunity in which to target policymakers with your research. You’ve got research going back to the 1980s around the so-called ‘policy window’ – the period when it’s possible to push or introduce an idea. But this is difficult for researchers because you have to be very fleet of foot. So often I hear “we need information on this in parliament tomorrow or it’s not going to happen,” and given all the pressures that academics are under that’s often incredibly difficult to achieve and rightly resisted. But that is the nature of the beast.

Policymaking moves incredibly quickly. We saw this especially during Covid, when we needed things not tomorrow, but yesterday. So researchers are starting to realize that responsiveness is key, but I think it needs stressing further. Everyone is so time-poor that opportunity is everything. Finding the right moment is a real art.

M is for motivation. Researchers should ask themselves why policymakers would bother to access the research. So many people still subscribe to the ‘deficit model’, in which decision makers are just waiting for academic evidence to be brought to them – empty vessels ready to be filled with knowledge. But in reality, most of the time this isn’t how policymakers think. Why do we assume that research would be privileged over, for example, the evidence that a campaigning charity or a similar body would bring with them? So, I think that’s a helpful reality check. But fortunately, you can encourage motivation; you can incentivize engagement.

So these three elements are crucial. It’s not just about packaging up a nice report. You have to ensure that the context is right.

Building motivation

Overton: These three pillars are a really helpful theoretical framework for understanding the influence landscape. Do you have any advice to researchers about how to encourage capability, motivation and opportunity?

Jonathan: There are a dozen things I would recommend, all of which you can find in the report “Using Evidence: What Works?” that I wrote while at Nesta. There’s no one mechanism, only a cocktail of measures, which are explored extensively in that document.

Motivation is a particularly interesting one, however. There are ways to encourage motivation, to incentivize engagement. For instance, there’s a really strong sense that showing you’re going to save money can motivate policymakers. Fear is also a powerful motivator – if you can make them understand that getting something wrong will have all these negative implications.

You can encourage motivation by stressing relatedness – if a policymaker is shown that someone else, who looks and feels like a peer, is engaging with the issue, then they’re more likely to engage themselves.

This interview with Jonathan Breckon originally appeared in the Overton Blog under the title “Jonathan Breckon’s Tips For Influencing Policy.” Overton is a searchable index of policy documents and guidelines that allows users to see where their work is cited and mentioned.

How to have impact

Overton: Do you have any practical tips for people just starting out? If someone is unfamiliar with the policy influence landscape, what would you say to them?

Jonathan: There are two major lessons I’ve taken from my work with policymakers, as well as my experience as a REF impact assessor.

The first is the value of using knowledge brokers, or intermediaries. These are people or groups within universities, or bodies linking different institutions (for example, the Universities Policy Engagement Network), who alert researchers to opportunities to engage with policy and help facilitate them.

Relationship building is so important. It’s where we focus most of our resources. So developing networks that can help you access the right people is essential. People in government or select committees get bombarded, so if you can get a bit of brokerage or have your evidence packaged up a bit better, then you’ll increase your chance of decision makers engaging.

My second tip is to diversify who and what you mean by policymakers. Our default understanding of this sphere in the UK is Westminster and Whitehall. But it’s important to remember that they are not the only center of power. So rethink who you can influence – think about arm’s-length bodies, regulators, even business bodies, because often they are the people driving an industry forward. Renewables is a good example of this – the industry is innovating and pushing the agenda far more than Whitehall. The government is playing catch-up.

It’s also important not to forget the frontline – the professionals who are actually doing the work, be it a police officer or a teacher. The devolution of power over the last few decades means that local bodies now have far more autonomy and influence. They can enact change in a way that central decision makers can’t. The center is constrained by things like the media in a way that lower-level decision makers aren’t – a local education authority doesn’t have to worry in the same way about what the Daily Mail will say, for example.

Where the rubber meets the road is in schools, hospitals and so on. Researchers wanting to effect change should focus more on the people delivering policy – they can be agile and responsive and truly embed the learnings that you bring to them, not the people in Whitehall making the speeches and writing documents and so on.

You can actually have more impact – in part because it can be easier to access these people, and easier to influence them because you’re speaking the same language and they have related capabilities. And this type of local influence can also be a pathway to bigger impact. But it is an end in itself, if what you actually want to do is to effect change and make a difference. In fact, there’s a growing recognition of the value of local impact among universities – for example, the Civic Network for Higher Education was set up to advance the agenda of universities’ civic duties to their local area.

Overton: I suppose this is central to the question of ‘what is impact?’ Is it effecting change and influencing hearts and minds? Or is it policy citations and the like?

Jonathan: Yes, it’s all those things. Anyone who’s ever participated in the REF (Research Excellence Framework) or the KEF (Knowledge Exchange Framework) will understand that it’s all connected, and it’s important and it’s not going away.

In terms of impact assessment, I think researchers should know that assessors aren’t snobby about it! There’s no one definition of impact, and – to come back to what I said earlier – it certainly doesn’t just mean getting cited in a publication from Westminster. There are so many types of policy impact, let alone influencing business and society in more frontline ways. The impact just has to be credible and be making some sort of difference. As long as you’re not over-claiming or pretending that you’re the only person having that type of impact – as long as what you’re saying is authentic – then the assessors will accept it.

Another thing I want to stress is how beneficial impact work is to the researcher, particularly the less centralized impact – influencing at a local level, or working with civil society and so on. So many academics that I speak with talk about how much they enjoy the process, and about how the kind of co-production you do when you work with people on the ground is more than a means to an end. They find it motivational because they can see the research taking effect in real time. But it also feeds directly into their scholarship – they learn from their partners, and it improves their own research process. It’s not a separate thing. One good thing to come from the recent focus on impact is that research, policy and practice are no longer compartmentalized in the way they used to be. And that absolutely improves the research that is produced.

But in terms of your more conceptual question around ‘what is research impact?’, I just want to refer back to some resources that already exist, because I think it’s important that we build on these conversations as opposed to having the same ones over and over again. There’s a paper by Kathryn Oliver and Paul Cairney on the evidence-policy gap which reflects on having impact – specifically the need for scientists to stop trying to maximize their own impact, and rather reframe their approach to understand the demand for evidence, as well as how it will be taken in, before they start producing evidence to try and ‘get into policy’.

Jonathan Breckon is a knowledge broker, consultant and policy adviser.

Synthesizing evidence to capitalize on opportunity

Overton: More recently you’ve been focusing on the use of rapid evidence reviews, as a means of getting research into policy. Can you talk us through that?

Jonathan: A rapid evidence review is a type of knowledge synthesis in which the traditional systematic review process is accelerated. This allows you to produce resources for stakeholders quickly and efficiently, by narrowing the scope, subject or geography of the traditional review. It means that you can very quickly get a broad selection of evidence that’s been reviewed by experts in the field.

Part of my work with Capabilities in Academic Policy Engagement (CAPE) involved testing and refining the process of rapid evidence reviews to ensure that they were a trustworthy way to synthesize evidence.

This relates to what I was saying about windows of opportunity – the O element of the COM-B framework and the importance of being timely. While there are occasionally times when you have a longer timescale, in general the policymaking process is frenetic, so it’s important to be rapid. Rapid reviews really took hold during Covid, when policymakers and academics alike realized the importance of getting access to reliable evidence as quickly as possible.

So, these systematic reviews are different from a literature review because we don’t cherry-pick studies. Individual experts are inherently biased – that’s a controversial opinion and academics hate to hear it, but there’s good evidence for it. And so we have to integrate every piece of evidence that we can in order to overcome the biases of any individual study. It’s a rigorous process.

This is incredibly important for policy making, particularly if you’re dealing with something controversial. You shouldn’t approach this like a journalist – finding the story in the data – but rather accept that the result might be quite a bland synthesis. You need to know everything, so you have to incorporate all the evidence, even the bland or inconclusive studies.

We’re used to using these techniques in health (as I mentioned, Covid really brought them into the mainstream) because you couldn’t possibly approach healthcare policy with the attitude of “I like this one drug study, but this other one contradicts it, so I won’t look at that result.” The need to be exhaustive in healthcare evidence review is well established, and I see no reason why it can’t become mainstream in other areas, particularly as it’s getting easier and easier with technology and machine learning.

How technology improves the research/policy pathway

Overton: Can you speak a little more on technology and AI and how that might be useful?

Jonathan: The main benefit is that technology allows us to pull together evidence in a much less laborious way, so it’s quicker. It’s now possible to use AI to automate the literature screening process, whereas previously this would have had to be done manually.
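To make the screening step concrete, here is a minimal, hypothetical sketch – not how Overton or any specific screening tool works – of the kind of task that gets automated: ranking candidate abstracts against a review question so a human reviewer screens the most promising studies first. The question, records and scoring rule below are purely illustrative; real tools typically rely on trained classifiers or active learning rather than simple keyword overlap.

```python
# Illustrative sketch only: prioritize abstracts for human screening by how
# many terms from the review question they mention. Real screening pipelines
# use machine-learning classifiers or active learning, not keyword counts.
import re
from collections import Counter


def tokenize(text: str) -> Counter:
    """Lowercase the text and count its word tokens."""
    return Counter(re.findall(r"[a-z]+", text.lower()))


def relevance_score(question: str, abstract: str) -> float:
    """Share of the question's terms that appear in the abstract."""
    q_terms = set(tokenize(question))
    a_terms = tokenize(abstract)
    if not q_terms:
        return 0.0
    return sum(1 for term in q_terms if term in a_terms) / len(q_terms)


def screen(question: str, records: list[dict], top_n: int = 2) -> list[dict]:
    """Return the top_n candidate records, most relevant first."""
    ranked = sorted(
        records,
        key=lambda record: relevance_score(question, record["abstract"]),
        reverse=True,
    )
    return ranked[:top_n]


if __name__ == "__main__":
    question = "Does mentoring reduce reoffending among young people?"
    records = [  # hypothetical candidate studies
        {"title": "Mentoring and youth reoffending: an RCT",
         "abstract": "A randomized trial of mentoring for young people at risk of reoffending."},
        {"title": "Crop yields under drought",
         "abstract": "Field data on wheat yields in dry conditions."},
        {"title": "Peer support in youth justice",
         "abstract": "Qualitative study of mentoring and peer support among young offenders."},
    ]
    for record in screen(question, records):
        print(record["title"])
```

In a rapid review, a step like this only reorders the queue; a human reviewer still makes the include/exclude decisions, which is the “interpreter” role described below.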

It’s always helpful for searching. Since the advent of Google there’s been a massive explosion of information – the so-called ‘infodemic’. The sheer weight of data emphasizes the need for synthesis. Policy makers can’t possibly get to grips with all the evidence that’s out there otherwise.

In fact, often we need to synthesize the synthesis! This is called an umbrella review, essentially a review of reviews. We have so much of these across subject areas that they need updating regularly. This is an area that technology can really help us with. At the moment, there’s still a human aspect – lots of people are fantasizing about total automation and we aren’t there yet – but currently the main value of technology is to help with searching. But you need an interpreter.

