
Can We Use Altmetrics to Measure Societal Impact?

April 9, 2019

Measuring sticks (Image: Frankieleon/Flickr)

Many benefits emerge from academic research, and they affect stakeholders in diverse ways. Impact is thus a multi-faceted phenomenon, which raises the question: what are the most informative tools for tracking these different outcomes?

This article by Lutz Bornmann and Robin Haunschild originally appeared on the LSE Impact of Social Sciences blog as “Are altmetrics able to measure societal impact in a similar way to peer review?” and is reposted under the Creative Commons license (CC BY 3.0).

When quantitative approaches to research evaluation were first trialed at the end of the 1980s, they mostly drew on publication and citation data (bibliometrics) and they mostly targeted academic impact – the impact of research on other academics. More highly-cited work was taken as an indicator of research ‘excellence’, which was widely pursued as a public policy goal. Academic research excellence remains important, but the policy agenda has shifted, notably since the introduction of societal impact considerations into the UK’s Research Excellence Framework (REF). However, assessing the nature, scale, and beneficiaries of research impact, especially quantitatively, remains a complex undertaking.

One potential way of quantitatively assessing societal impact has been through altmetrics – online indicators of research use. In a recent study, based on data from the UK REF, we therefore decided to examine the extent to which altmetrics are able to measure societal impact in a way similar to the peer review of case studies. We found a relationship, but not one that provides a firm indicator.

Fortunately, data for REF2014 are available for comprehensive studies. One key feature of the review process is that two distinct, and therefore comparable, types of publications are submitted: (i) evidence of academic achievement, based on four selected outputs per researcher, and (ii) evidence of socio-economic impact, based on case studies with up to six underpinning references.

Drawn from …
This blog post is based on the authors’ co-written article “Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK research excellence framework (REF)” in the Journal of Informetrics.

Our study focused on those items submitted to REF2014 that can be uniquely identified via DOIs (Digital Object Identifiers): essentially, journal papers rather than other output types. For journal papers, we can also acquire impact and attention data: citation counts and media mentions of various kinds. We anticipated that the impact of papers submitted as academic outputs would differ significantly from the impact of references cited in case studies: the former should be strong in academia but weak in other sectors of society, whereas the opposite should be true for the latter.

For our analysis, the test prediction was that papers submitted as evidence of academic achievement would be relatively well-cited compared to papers that supported case studies. By contrast, the papers supporting case studies might be relatively less well-cited but would have wider societal recognition, trackable through Altmetric.com data (sourced from Twitter, Wikipedia, Facebook, policy-related documents, news items, and blogs). If we discovered no difference between these publication sets in their bibliometric citations and altmetric mentions, then our ability to quantitatively distinguish between different kinds of impact would be brought into doubt.

In practice, we compared three publication categories, not two, because the pool of submitted outputs and case study references overlap to a substantial degree. There were 120,784 journal papers in the REF2014 database of submitted research outputs (PRO) and 11,822 journal papers among the case study references (PCS) which we were able to match with citation data via their DOI. 5,703 papers were submitted in 2014 as both PROs and PCSs (PRO/PCS). Intriguingly, the overlap was lower in basic research areas than in applied research areas.
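To make the three-way partition concrete, it can be sketched as simple set operations over DOIs. This is a minimal illustration, not the authors' actual code, and the DOI sets below are hypothetical placeholders rather than REF2014 records.

    # Minimal sketch: partitioning papers into pure PRO, pure PCS,
    # and the PRO/PCS overlap by DOI. All DOIs are hypothetical
    # placeholders, not actual REF2014 records.
    submitted_outputs = {"10.1000/a", "10.1000/b", "10.1000/c"}  # PRO DOIs
    case_study_refs = {"10.1000/c", "10.1000/d"}                 # PCS DOIs

    overlap = submitted_outputs & case_study_refs    # PRO/PCS
    pure_pro = submitted_outputs - case_study_refs   # PRO only
    pure_pcs = case_study_refs - submitted_outputs   # PCS only

    print(f"PRO only: {len(pure_pro)}, PCS only: {len(pure_pcs)}, "
          f"overlap: {len(overlap)}")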

Our study examined convergent (and discriminant) validity: do indicators of societal and academic impact vary in distinct ways between PCS and PRO articles? And do different approaches to societal impact (REF scores and altmetrics) produce comparable measures of a common construct in the case study data? If they measure the same construct, and are thus convergently valid, REF impact scores should correlate with altmetrics data.

We expected higher correlations (i) for PRO between REF output scores and citation impact and (ii) for PCS between REF impact scores (for case studies) and altmetrics. Lower correlations were expected for (i) REF output scores and altmetrics for PRO and (ii) REF impact scores (for case studies) and citation impact for PCS.

We found:

  • Average bibliometric citation impact is higher for PRO than for PCS.
  • Mentions of papers in policy-related documents (especially) and Wikipedia are significantly higher for PCS than for PRO; the result for news items is similar, though slightly smaller.
  • For Twitter counts, the PCS-PRO difference is close to zero, and the counts do not correlate with citations: tweets do not appear to reflect any serious form of impact.
  • The highest scores, across all indicators, were associated with the PRO/PCS overlap. These publications were cited as frequently as the pure PRO set and scored higher than the pure PCS set on altmetrics from every source.

We then correlated REF scores and metrics on the basis of UK research institutions, following the approach of comparing decisions in peer review with later citation impact (Bornmann, 2011). We found that REF scores on impact case studies correlated only weakly with altmetrics, thereby disqualifying arguments in favor of using altmetrics for societal or broader impact measurements. The weak relationship between peer assessments of societal impact and altmetrics data mirrors other studies (Thelwall & Kousha, 2015) and questions any application of altmetrics in measuring societal impact in research evaluation. Whereas peers can acknowledge a successful link between research and societal impacts (based on descriptions in case studies), altmetrics do not seem to be able to reflect that. Altmetrics may instead demonstrate distinct public discussions around certain research topics (which can be visualized, see Haunschild, Leydesdorff, Bornmann, Hellsten, and Marx, 2019).
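For readers curious about the shape of such an institution-level analysis, here is a minimal sketch. It is an illustration under stated assumptions, not the authors' code: the column names and toy values are invented, and Spearman rank correlation stands in for whatever exact statistics the study applied.

    # Minimal sketch of an institution-level correlation analysis in
    # the spirit of the study. All column names and values below are
    # hypothetical placeholders, not the REF2014 dataset, and Spearman
    # correlation is an assumed stand-in for the study's statistics.
    import pandas as pd
    from scipy.stats import spearmanr

    # One row per institution: mean REF impact score for its case
    # studies alongside aggregate metrics for its papers.
    institutions = pd.DataFrame({
        "ref_impact_score": [3.2, 2.8, 3.6, 2.1, 3.0],
        "mean_citations": [14.2, 9.8, 21.5, 6.3, 12.1],
        "policy_mentions": [4, 1, 7, 0, 3],
        "tweets": [55, 40, 61, 38, 47],
    })

    for metric in ["mean_citations", "policy_mentions", "tweets"]:
        rho, p = spearmanr(institutions["ref_impact_score"],
                           institutions[metric])
        print(f"REF impact vs {metric}: rho = {rho:.2f} (p = {p:.2f})")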

Perhaps the most interesting results here are the relatively high scores, across the board, for publications that were submitted by individual researchers and then also used to support case studies. Some outputs have an evident capacity for impact, whether among other researchers or in their potential for wider application. There is therefore no necessary gap between academic and societal value, a conclusion that has been recognized at least since Vannevar Bush's seminal Science, the Endless Frontier. Societal value can be expected from research that follows high academic standards.


Lutz Bornmann (pictured) is a habilitated sociologist of science and works at the Division for Science and Innovation Studies of the Max Planck Society. He received the Derek de Solla Price Memorial Medal in 2019. Robin Haunschild is at the Max Planck Institute for Solid State Research. A chemist, his current research interests include the study of altmetrics and the application of bibliometrics to specific fields of the natural sciences, e.g. climate change.

