
With the REF, We Can Evaluate the Impact of Impact

December 18, 2014


There’s impact, and then there’s impact, if you know what I mean …

For the first time, the “impact” of academic research on the wider world has been included in a large-scale assessment of the quality of university research, which has just been published. One-fifth of the overall score awarded to each university research department that submitted academics for assessment in the Research Excellence Framework (REF) was based on the impact of their research. It is thought that this weighting will increase to at least a quarter for the next round of assessment in 2020.

Impact could cover the socio-economic or cultural effects of research, as well as its impact on quality of life. It had to be beyond the world of academia, have taken place between 2008 and 2013 and be linked to “internationally recognised” research.


This article by Anthony Kelly originally appeared at The Conversation, a Social Science Space partner site, under the title “The impact of impact on the REF”

The results showed that, across all the research submitted for assessment, 44 percent of the impact was rated as “outstanding”, or four-star, and another 40 percent was judged as “very considerable”, or three-star. These judgements were made by panels of academics and “end users” of research from across business, the public sector and charities.

Anecdotally, the impact aspect of the REF exercise seems to have run fairly smoothly within universities and across the assessment panels. Yet there are issues going forward: whether it is sustainable as a form of assessment, desirable given its unintended consequences, and beneficial overall.

Some subjects easier

Some of the subjects or “units of assessment” in the REF were more easily geared up for “impactfulness” than others. For example, one would expect education and clinical medicine to do well with little effort, while music and philosophy, say, might need a broader definition in order to score as highly.

This is not to say that areas like music and philosophy do not have impact – they clearly do – it’s just that how impact is defined has had to be broadened so that the funding councils can use the same assessment criteria across all disciplines.

Yet this may have unintended negative consequences for the more impactful subjects: as they come to be regarded as “practical” subjects, they may be pushed to the margins of theoretical academia. The greater the practical expectations placed on a subject, the fewer incentives academics have to develop theory.

Because impact is defined to exclude effects within the academic world itself, disciplines such as pure mathematics (now included within “mathematical sciences”) are in danger of being amalgamated and merged in order to make them more assessable. This is a classic problem in educational assessment generally: we are in danger of valuing most what can most easily be measured.

Dangers of short-term thinking

On a practical level, there is an issue with limiting impact to a given period and the encouragement this gives to short-termism. This is relatively easy and sensible to do for publications and research income metrics, but it is not so simple for impact.

For the REF 2014, research impact could only be claimed if it occurred during the period 2008-13, but the research that gave rise to it could go back more than a decade before that. Quite properly, that underpinning research must itself have been high quality, but an issue remains for subsequent REFs, starting in 2020: how will they deal with overlap, and with institutions claiming slightly different or extended impacts for the same underpinning research?

Game-playing

Impact within the REF 2014 was assessed through case studies. Broadly, one case study was required for every ten staff submitted, plus one: a department submitting 34 staff needed four case studies, while one submitting 76 needed nine.
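
The exact thresholds behind those numbers are not spelled out here, but as a rough illustration only, the banding implied by the two worked examples can be sketched in a few lines of Python. The cut-offs below are an assumption inferred from those examples, not a restatement of the official REF guidance:

```python
def required_case_studies(fte_submitted: float) -> int:
    """Rough sketch of the REF 2014 case-study requirement.

    Assumes a minimum of two case studies, plus one more for each
    further ten FTE submitted beyond roughly 15 FTE. These cut-offs
    are an illustration inferred from the examples in the text,
    not the official guidance.
    """
    if fte_submitted < 15:
        return 2
    return 2 + int((fte_submitted - 5) // 10)

# The worked examples from the article:
assert required_case_studies(34) == 4   # four case studies
assert required_case_studies(76) == 9   # nine case studies
```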

Needless to say, this resulted in a certain amount of game-playing, with departments adjusting the number of staff submitted against the number of case studies required. My experience suggests that very few departments increased the number of staff they submitted in order to accommodate an additional case study; any manipulation was in the downward direction, reducing the size of the staff returned to avoid having to find another case study.

My own experience of serving on the REF panel for education was that impact, as with the other aspects of the exercise, was taken very seriously and reviewed conscientiously. Moderation within and between the sub-panels assessing each subject area was frequent, thorough and necessary. This was time-consuming and costly, but if the Higher Education Funding Council for England, which runs the REF, decides to increase the impact component beyond its current 20 percent in 2020 – and all the signs are that it will – it will mean more extensive cross-checking to make sure it is fair across the board.

***
Anthony Kelly was a member of the Research Excellence Framework 2014 sub-panel for Education. He receives funding from the Engineering and Physical Sciences Research Council, the Jersey government, the Audit Commission and the Department of Education.


Anthony Kelly, the head of Southampton Education School, University of Southampton, is a theoretician specializing in educational effectiveness and improvement, in particular as it relates to educational leadership, governance and policy analysis; in adapting capability and game-theoretic concepts to schooling; and in developing innovative quantitative approaches and mathematical modelling techniques for use in educational research.

