
At a Glance: The UK’s Twin-track Approach to Measuring Impact

March 18, 2019

Around the world, there’s a persistent belief that policy, practice and the public should all be better informed by evidence. Whether it’s immunisation programmes, literacy initiatives, or development banks, we hope decisions made by politicians, civil servants, doctors, teachers and others make use of the best available evidence (along with their professional judgement). Likewise, the public is expected to engage with, understand and respond to information that makes a range of different claims.

This should be good news for academic researchers, creating a growing demand for the valuable outputs they supply. However, the real world is never as simple as linear impact flowcharts suggest. The ‘evidence’ is often contested, unclear, incomprehensible, inaccessible, or absent. And that professional judgement can often be accompanied by other priorities, human error, biases, and sheer randomness. As for the public, the recent surge of populism suggests things other than ‘facts’ or ‘truth’ can drive them.

Evidently

In the UK, a growing number of intermediary organisations are springing up to facilitate the flow of evidence from researchers to decision makers: What Works Centres, the Behavioural Insights Team (or nudge unit), the Policy Lab, the Parliament knowledge exchange team, the Cabinet Office Open Innovation team, and the Universities Policy Engagement Network (UPEN).

A wide range of research funding sources require either plans for impact or evidence of it, e.g. Impact Acceleration Accounts and the Higher Education Innovation Fund (HEIF). The same focus is also seen among funders outside government: the Wellcome Trust, the huge medical charity, for example, emphasises research translation and commercialisation.

However, the two key policies for evaluating (and incentivising) UK universities to have an impact (beyond citations) are the venerable Research Excellence Framework (REF) and the upstart Knowledge Exchange Framework (KEF). The two are not directly comparable, as each framework focuses on different types of impact and uses different measurement approaches. Despite that, it is often the same people in universities who have to deal with both acronyms, and as the blossoming relationship between the two is still at the flirting stage, it is a good moment to look more closely at the particulars of this courtship:

‘Impact’ in REF and KEF

1. REF: For REF2021 impact case studies, impact is defined as “an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia”. KEF: Aims to provide more useful, visible, accessible and comparable information about knowledge exchange activities (including different forms of impact).
2. REF: A research-specific beauty contest. KEF: A holistic assessment of impact, including (but also beyond) research and teaching.
3. REF: Uses a case study approach, with each put forward by institutions. KEF: Uses a case study approach, with each put forward by institutions.
4. REF: Largely qualitative peer review by 38 panels of subject-specific experts, though may feature quantitative indicators. KEF: Largely quantitative analysis of data, though ‘evidence-based statements’ are submitted for some areas.
5. REF: Whole sector and first-past-the-post. KEF: Clustered and benchmarked.
6. REF: Focused on wider impact – economic and social. KEF: Focused on proxies for economic impact, e.g. consultancy income, IP, or spin-outs.
7. REF: Essential that impact is built on the institution’s own research. KEF: Doesn’t necessarily have to be built on the institution’s research.
8. REF: Focused on linear impact pathways. KEF: Can account for a wider range of pathways, by focusing on a handful of outcome metrics.
9. REF: Subject-cluster level (units of assessment, or UoAs). KEF: Institution-level (though split across seven ‘perspectives’).
10. REF: A massive effort every six years by institutions. KEF: Hopefully only a little extra effort each year by institutions.
11. REF: UK-wide. KEF: England only at present.
12. REF: A quarter of REF2021’s £2bn of funding will be for impact. KEF: £0 …for now, though likely to determine HEIF funding (at least) from 2020/21.

Many around the world are watching these developments in the UK with interest, in particular the impact and return on investment of the UK exercises compared with other national approaches. But how do other countries do it? And how is learning shared between them? A future article will look at just those questions…


Louis Coiffait is a commentator, researcher, speaker, and adviser focused on higher education policy, with a particular interest in impact and knowledge exchange. He has worked with Pearson, Taylor & Francis, SAGE Publishing, Wonkhe, think tanks, the Higher Education Academy, the National Foundation for Educational Research, the National Association of Head Teachers, the Teacher Training Agency, an MP, and a Minister. He has led projects on how publishers can support the impact agenda, the future of higher education (the Blue Skies series at Pearson), access to elite universities, careers guidance, enterprise education, and the national STEM skills pipeline (for the National Audit Office). He is also committed to volunteering, including over a decade as a school governor and chair of an eight-school federation in Hackney in East London, and recently as vice-chair of a school in Tower Hamlets. He spent three years as chair of Westminster Students’ Union. He studied at York, UCLA and Cambridge. Louis is an RSA Fellow, amateur photographer, “enthusiastic” sportsman, proud East London citizen and Yorkshireman (really).
