A New Front in the Replication Wars: Economics

November 5, 2015

Replicating cookies: they might not be exactly identical, but shouldn’t they at least taste the same?

A sense of crisis is developing in economics after two Federal Reserve economists came to the alarming conclusion that economics research is usually not replicable.

The economists took 67 empirical papers from 13 reputable academic journals. Without assistance from the original researchers, they were able to get the same result in only a third of cases.

With the original researchers’ assistance, that figure increased to about half, suggesting reporting practices and requirements are seriously deficient.

This article by Andreas Ortmann originally appeared at The Conversation, a Social Science Space partner site, under the title “The replication crisis has engulfed economics”.

The replication crisis in psychology is well documented. Science recently published a stunning report by the Open Science Collaboration, in which almost 300 researchers tried to directly replicate the results of 100 papers published in 2008. This followed earlier exercises involving many labs (such as here, here and here).

The researchers did not succeed in the clear majority of cases. On average, they found the mean effect size to be only half of what was reported in the original studies. While the report has been questioned (here and here), there is growing concern that a cornerstone of the scientific edifice is in serious need of renovation.

What’s the problem?
Researchers are too often granted inappropriate degrees of freedom, and some are just fraudulent. That said, some of these distressing replication results arise because good science is messy: it involves hard work, and reasonable people can reasonably disagree on the various calls that have to be made.

A good illustration is this just-published study by Raphael Silberzahn and Eric Uhlmann. The researchers had previously engaged in methodological debates with the well-known data sleuth Uri Simonsohn.

Simonsohn questioned the results of an earlier study from the pair that suggested noble-sounding German names could boost careers. Re-running the analysis with a better analytical approach, Simonsohn did not confirm the effect. Silberzahn and Uhlmann eventually conceded the point in a joint paper with Simonsohn.

In their new study, the researchers provided a data set and asked more than two dozen teams of researchers to analyse it. They sought to determine, based on the data set, whether the skin color of soccer players from four major leagues (England, France, Germany, and Spain) influenced how often they were given a red card.

Somewhat shockingly, the answers were rather diverse. Of the 29 teams, 20 found a statistically significant correlation, with the median estimate suggesting dark-skinned players were 1.3 times more likely than light-skinned players to be sent off.

But the researchers reported:

“Findings varied enormously, from a slight (and non-significant) tendency for referees to give more red cards to light-skinned players to a strong trend of giving more red cards to dark-skinned players.”

Interestingly, this diversity of results survived even after the teams debated their methodological approaches with one another.

The upshot is that even under the best of circumstances – one data set, what seems like a straightforward question to answer, and an exchange of ideas on the best method – arriving at consensus can be extraordinarily difficult. And it surely becomes even more difficult with multiple data sets and many teams.
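As a rough illustration of how this can happen, here is a minimal sketch in Python. Everything in it is invented for illustration – the data are simulated, and the variable names and model choices are hypothetical rather than taken from the red-card study – but it shows how two defensible specifications of the same question can return quite different answers when a variable such as playing position is correlated with both skin tone and red cards.

```python
import numpy as np
import statsmodels.api as sm

# Toy data, invented for illustration only (not from the actual study).
rng = np.random.default_rng(42)
n = 20_000
skin_tone = rng.integers(0, 2, n)                  # 0 = light, 1 = dark
# In this simulation, skin tone is correlated with playing position...
position = rng.binomial(1, 0.3 + 0.4 * skin_tone)  # e.g. 1 = defender
# ...and position, not skin tone, is what mainly drives red cards.
log_odds = -4.0 + 1.2 * position + 0.1 * skin_tone
red_card = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

# Specification A: skin tone only.
spec_a = sm.Logit(red_card, sm.add_constant(skin_tone.astype(float))).fit(disp=False)

# Specification B: skin tone plus position as a control.
X_b = sm.add_constant(np.column_stack([skin_tone, position]).astype(float))
spec_b = sm.Logit(red_card, X_b).fit(disp=False)

print("Odds ratio for skin tone, spec A:", float(np.exp(spec_a.params[1])))
print("Odds ratio for skin tone, spec B:", float(np.exp(spec_b.params[1])))
# Both specifications are defensible, yet they yield noticeably different estimates.
```

In the many-analysts study the disagreements were of course subtler, spanning dozens of modelling decisions rather than a single control variable, but the mechanism is the same.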

Further scrutiny
That, of course, is hardly news to most social scientists, who largely accept that any single study is worth only so much. This is why replication efforts and meta-analyses are as important as the recent focus on publication bias and underpowered studies. There is tantalising evidence that many experimental economics studies are severely underpowered (although the evidence so far has been established only for a very simple class of games).
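For readers unfamiliar with the term, “underpowered” simply means the sample is too small to reliably detect an effect of the size being studied. A minimal sketch using statsmodels, with purely hypothetical numbers chosen for illustration, makes the point:

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical numbers for illustration: a two-group lab experiment with
# 30 subjects per group and a small true effect (Cohen's d = 0.2).
analysis = TTestIndPower()

power = analysis.power(effect_size=0.2, nobs1=30, alpha=0.05, ratio=1.0)
print(f"Power with 30 subjects per group: {power:.2f}")       # roughly 0.12

# Sample size per group needed to detect the same effect with 80% power.
n_needed = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05, ratio=1.0)
print(f"Subjects per group for 80% power: {n_needed:.0f}")    # roughly 390
```

Under these assumptions the experiment would detect a genuine effect of that size only about one time in eight, which helps explain how fragile findings make their way into the literature.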

It will be interesting to see the result of a current collaborative effort by economists to replicate 18 laboratory economics studies from 2011 to 2014.

It is not just the social sciences that are in the grip of replication crises. The extent and consequences of p-hacking and publication bias (studies that report no effect going unpublished) in science are well documented and have been known for a while.

So, where to from here? With a number of journals (including the Journal of the Economic Science Association, Experimental Economics, the Journal of Experimental Social Psychology, the Journal of Personality and Social Psychology, Psychological Science and Perspectives on Psychological Science) opening their doors to replication in various guises, we can expect more results that will seem to discredit the social sciences.

Hopefully, in the long run, this will up the ante on what it takes for a study to be considered reliable. Replication studies can inflict considerable damage on individuals’ productivity and reputation, so minimal reporting standards and acceptable replication etiquette need to be clarified – for example, whether original authors have to be invited or consulted. Journals should also become more serious about their data set collection efforts, where confidentiality does not prevent it.


Andreas Ortmann took up his current position of professor of experimental and behavioural economics in the School of Economics, UNSW Australia Business School, in 2009. Prior to his appointment at the Business School, he was the (Boston Consulting Group) professor of economics at CERGE-EI, a joint workplace of Charles University and the Academy of Sciences in Prague, Czech Republic. Prior to that appointment, he taught at Bowdoin and Colby colleges in the US state of Maine. He also was, for a year each, a visiting scholar at the Program on Non-Profit Organizations at Yale University, the Max-Planck Institute for Psychological Research in Munich, the Max-Planck Institute for Human Development in Berlin, and the Harvard Business School.
