
We Unintentionally Hit a Nerve When Bemoaning the State of Peer Review

July 26, 2022
You don’t have to be a rocket scientist to fix peer review. An astronomical number of ideas to repair peer review followed a tweet about the system. (Photo: NASA)

The peer review process is a cornerstone of modern scholarship. Before new work is published in an academic journal, experts scrutinize the evidence, research and arguments to make sure they stack up.

However, many authors, reviewers and editors have problems with the way the modern peer review system works. It can be slow, opaque and cliquey, and it runs on volunteer labor from already overworked academics.

Last month, one of us (Kelly-Ann Allen) expressed her frustration on Twitter at the difficulty of finding peer reviewers. Hundreds of replies later, we had a huge crowd-sourced collection of criticisms of peer review and suggestions for how to make it better.

The suggestions for journals, publishers and universities show there is plenty to be done to make peer review more accountable, fair and inclusive. We have summarized our findings below.

This article by Kelly-Ann Allen, Jonathan Reardon, Joseph Crawford and Lucas Walsh originally appeared on The Conversation, a Social Science Space partner site, under the title “The peer review system is broken. We asked academics how to fix it”

Three challenges of peer review

We see three main challenges facing the peer review system.

First, peer review can be exploitative.

Many of the companies that publish academic journals make a profit from subscriptions and sales. However, the authors, editors and peer reviewers generally give their time and effort on a voluntary basis, effectively performing free labor.

And while peer review is often seen as a collective enterprise of the academic community, in practice a small fraction of researchers do most of the work. One study of biomedical journals found that, in 2015, just 20 percent of researchers performed up to 94 percent of the peer reviewing.

Peer review can be a ‘black box’

The second challenge is a lack of transparency in the peer review process.

Peer review is generally carried out anonymously: researchers don’t know who is reviewing their work, and reviewers don’t know whose work they are reviewing. This provides space for honesty, but can also make the process less open and accountable.

The opacity may also suppress discussion, protect biases, and decrease the quality of the reviews.

Peer review can be slow

The final challenge is the speed of peer review.

When researchers submit a paper to a journal, and it is not rejected outright, they may face a long wait for review and eventual publication. It is not uncommon for research to be published a year or more after submission.

This delay is bad for everyone. For policymakers, leaders and the public, it means they may be making decisions based on outdated scientific evidence. For scholars, delays can stall their careers as they wait for the publications they need to get promotions or tenure.

Scholars suggest the delays are typically caused by a shortage of reviewers. Many academics report that heavy workloads discourage them from participating in peer review, and this has become worse since the onset of the COVID-19 pandemic.

Research has also found that many journals rely heavily on US and European reviewers, limiting the size and diversity of the reviewer pool.

Can we fix peer review?

So, what can be done? Most of the constructive suggestions from the large Twitter conversation mentioned earlier fell into three categories.

First, many suggested there should be better incentives for conducting peer reviews.

This might include publishers paying reviewers (the journals of the American Economic Association already do this) or giving some profits to research departments. Journals could also offer reviewers free subscriptions, publication fee vouchers, or fast-track reviews.

However, we should recognize that journals offering incentives might create new problems.

Another suggestion is that universities could do more to recognize peer review as part of the academic workload, and perhaps reward outstanding contributors to peer review.

Some Twitter commentators argued tenured scholars should review a certain number of articles each year. Others thought more should be done to support non-profit journals, given a recent study found some 140 journals in Australia alone ceased publishing between 2011 and 2021.

Most respondents agreed that conflicts of interest should be avoided. Some suggested databases of experts would make it easier to find relevant reviewers.

Use more inclusive peer review recruitment strategies

Many respondents also suggested journals could improve how they recruit reviewers and how they distribute the work. For example, reviewers could be selected for either methodological or content expertise, and asked to focus on that element rather than both.

Respondents also argued journals should do more to tailor their invitations to target the most relevant experts, with a simpler process to accept or reject the offer.

Others felt that more non-tenured scholars, Ph.D. researchers, people working in related industries, and retired experts should be recruited. More peer review training for graduate students and increased representation for women and underrepresented minorities would be a good start.

Rethink double-blind peer review

Some respondents pointed to a growing movement toward more open peer review processes, which may create a more human and transparent approach to reviewing. For example, Royal Society Open Science publishes all decisions, review letters, and voluntary identification of peer reviewers.

Another suggestion to speed up the publishing process was to give higher priority to time-sensitive research.

What can be done?

The overall message from the enormous response to a single tweet is that the peer review process needs systemic change.

There is no shortage of ideas for how to improve the process for the benefit of scholars and the broader public. However, it will be up to journals, publishers and universities to put them into practice and create a more accountable, fair and inclusive system.


The authors would like to thank Emily Rainsford, David V. Smith and Yumin Lu for their contribution to the original article, “Towards improving peer review: Crowd-sourced insights from Twitter.”

Kelly-Ann Allen (pictured) is an associate professor in the School of Educational Psychology and Counselling at the Faculty of Education at Monash University. Jonathan Reardon is a researcher at Durham University. Joseph Crawford is a senior lecturer of educational innovation at the University of Tasmania. Lucas Walsh is a professor and director of the Centre for Youth Policy and Education Practice at Monash University.

