At a Glance: The UK’s Twin-track Approach to Measuring Impact
Around the world, there’s a persistent belief that policy, practice and the public should all be better informed by evidence. Whether it’s immunisation programmes, literacy initiatives, or development banks, we hope decisions made by politicians, civil servants, doctors, teachers and others make use of the best available evidence (along with their professional judgement). Likewise, the public is expected to engage with, understand and respond to information that makes a range of different claims.
This should be good news for academic researchers, creating a growing demand for the valuable outputs they supply. However, the real world is never as simple as linear impact flowcharts suggest. The ‘evidence’ is often contested, unclear, incomprehensible, inaccessible, or absent. And that professional judgement can often be accompanied by other priorities, human error, biases, and sheer randomness. As for the public, the recent surge of populism suggests things other than ‘facts’ or ‘truth’ can drive them.
Evidently
In the UK, a growing number of intermediary organisations are springing up to facilitate the flow of evidence from researchers to decision makers: What Works Centres, the Behavioural Insights Team (or 'nudge unit'), the Policy Lab, Parliament's knowledge exchange team, the Cabinet Office Open Innovation team, and the Universities Policy Engagement Network (UPEN).
A wide range of research funding sources require either plans for, or evidence of, impact, e.g. Impact Acceleration Accounts and the Higher Education Innovation Fund (HEIF). The same focus on impact is also seen among funders outside government: the Wellcome Trust, the huge medical charity, for example, has a focus on research translation and commercialisation.
However, the two key policies for evaluating (and incentivising) UK universities to have an impact (beyond citations) are the venerable Research Excellence Framework (REF) and the upstart Knowledge Exchange Framework (KEF). The two are not directly comparable, as each framework focuses on different types of impact and uses different measurement approaches. Despite that, it's often the same people in universities who have to deal with both acronyms, and as the blossoming relationship between the two is still at the flirting stage, it's a good moment to look more closely at the particulars of this courtship:
‘Impact’ in REF and KEF
|     | REF | KEF |
| --- | --- | --- |
| 1.  | For REF2021 impact case studies, impact is defined as "an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia" | Aims to provide more useful, visible, accessible and comparable information about knowledge exchange activities [including different forms of impact] |
| 2.  | A research-specific beauty contest | A holistic assessment of impact, including (but also beyond) research and teaching |
| 3.  | Uses a case study approach, with each put forward by institutions | Uses a case study approach, with each put forward by institutions |
| 4.  | Largely qualitative peer review by 38 panels of subject-specific experts, though may feature quantitative indicators | Largely quantitative analysis of data, though 'evidence-based statements' are submitted for some areas |
| 5.  | Whole sector and first-past-the-post | Clustered and benchmarked |
| 6.  | Focused on wider impact – economic and social | Focused on proxies for economic impact, e.g. consultancy income, IP, or spin-outs |
| 7.  | Essential that impact is built on the institution's own research | Doesn't necessarily have to be built on the institution's research |
| 8.  | Focused on linear impact pathways | Can account for a wider range of pathways, by focusing on a handful of outcome metrics |
| 9.  | Subject-cluster level (units of assessment, or UoAs) | Institution-level (though split across seven 'perspectives') |
| 10. | A massive effort every six years by institutions | Hopefully only a little extra effort each year by institutions |
| 11. | UK-wide | England only at present |
| 12. | A quarter of REF2021's £2bn of funding will be for impact | £0 …for now, though likely to determine HEIF funding (at least) from 2020/21 |
Many around the world are watching these developments in the UK with interest, in particular asking what the impact and return on investment of the UK exercises are compared with other national approaches. But how do other countries do it? And how is learning shared between them? A future article will look at just those questions…