
Can We Replicate the Reported Crisis in Psychology?

June 27, 2016

Modern psychology is apparently in crisis. This claim is nothing new. From phrenology to psychoanalysis, psychology has traditionally had an uneasy scientific status. Indeed, the philosopher of science Karl Popper viewed Freud’s theories as a typical example of pseudoscience because no test could ever show them to be false. More recently, psychology has feasted on a banquet of extraordinary findings whose scientific credibility has also been questioned.

Some of these extraordinary findings include Daryl Bem’s experiments, published in 2011, which seemed to show that future events can influence the past. Bem, an emeritus professor at Cornell University, reported that people are more likely to remember a list of words if they practice them after a recall test than if they practice them before it. In another study, he reported that people are significantly better than chance at guessing which of two curtains hides a pornographic image.


This article by Keith Laws originally appeared at The Conversation, a Social Science Space partner site, under the title “Is psychology really in crisis?”

Then there’s Yale’s John Bargh, who in 1996 reported that when people are unconsciously primed with an “elderly stereotype” (by unscrambling jumbled sentences containing words such as “Florida” and “bingo”), they subsequently walk more slowly. Add to this Roy Baumeister, who in 1998 presented evidence suggesting we have a finite store of willpower which is sapped whenever we resist temptations such as eating chocolates. Or, in the same year, Ap Dijksterhuis and Ad Van Knippenberg, who showed that performance on Trivial Pursuit is better after people list typical characteristics of a professor rather than those of a football hooligan.

These studies are among the most controversial in psychology, not least because other researchers have had difficulty replicating them. Such studies raise concerns not only about the methods psychologists use but also, more broadly, about psychology itself.

Do not repeat

A survey of 1,500 scientists published in Nature last month indicated that 24% had published a successful replication and 13% had published a failed one. Contrast this with over a century of psychology publications, where just 1% of papers attempted to replicate past findings.

Editors and reviewers have been complicit in a systemic bias that has resulted in high-profile psychology journals becoming storehouses for the strange. Many psychologists are obsessed with the “impact factors” of journals (as are the journals) – and one way to increase impact is to publish curios. Certain high-impact journals have a reputation for publishing curios that never get replicated but which attract lots of attention for the author and the journal. By contrast, confirming the findings of others through replication is unattractive, rare and relegated to less prestigious journals.

Despite psychology’s historical abandonment of replication, is the tide turning? In 2015, a crowd-sourced initiative – the Open Science Collaboration’s Reproducibility Project – attempted to replicate 100 published findings in psychology. The multinational collaborators replicated just over a third (36%) of the studies. Does this mean that psychological findings are unreliable?

Replication projects are selective, targeting studies that are cheaper and less technically complicated to replicate, or those that are simply unbelievable. Other projects, such as “Many Labs”, have reported a replication rate of 77%. All such initiatives are non-random, and headline replication rates reflect the studies that are sampled. Even if a random sample of studies were examined, we don’t know what would constitute an acceptable replication rate in psychology. This is not an issue specific to psychology. As John Ioannidis noted, “most published research findings are false”. After all, scientific hypotheses are our current best guesses about phenomena, not a simple accumulation of truths.
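To see how Ioannidis’ point can hold, consider a minimal back-of-the-envelope sketch of the positive predictive value of a literature. The prior probability, statistical power and significance threshold below are illustrative assumptions chosen for the example, not estimates for psychology.

```python
# A minimal sketch in the spirit of Ioannidis (2005). The three input values
# are illustrative assumptions, not measured properties of any field.

prior = 0.10   # assumed share of tested hypotheses that are actually true
power = 0.35   # assumed average statistical power of studies
alpha = 0.05   # conventional false-positive (significance) threshold

true_positives = prior * power          # true effects that come out significant
false_positives = (1 - prior) * alpha   # null effects that come out significant

ppv = true_positives / (true_positives + false_positives)
print(f"Share of 'significant' findings reflecting real effects: {ppv:.0%}")
# With these assumed numbers, fewer than half of the positive results are true,
# and questionable research practices (which inflate the effective alpha)
# would push that share lower still.
```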

Questionable research practices

The frustration of many psychologists is palpable because it seems so easy to publish evidence consistent with almost any hypothesis. A likely cause of both unusual findings and non-replicability is psychologists indulging in questionable research practices (QRPs).

In 2012, a survey of 2,000 American psychologists found that most had engaged in QRPs. Some 67% admitted selectively reporting studies that “worked”, while 74% failed to report all the measures they had used. The survey also found that 71% had continued to collect data until a significant result was obtained, and 54% had reported unexpected findings as if they had been expected. And 58% had excluded data after running their analyses. Astonishingly, more than one-third admitted they had doubts about the integrity of their own research on at least one occasion, and 1.7% admitted to having faked their data.
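The third of these practices, collecting data until a significant result appears, is easy to illustrate with a short simulation. The group sizes, peeking schedule and choice of a two-sample t-test below are hypothetical, and the sketch assumes there is no real effect at all.

```python
# Hypothetical illustration of "optional stopping": testing after every batch
# of participants and stopping at the first p < .05. Both groups are drawn
# from the same population, so every significant result is a false positive.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 2000   # simulated studies
max_n = 100            # participants per group if no early stop
peek_every = 10        # run a t-test after every 10 participants per group

false_positives = 0
for _ in range(n_experiments):
    a = rng.normal(size=max_n)
    b = rng.normal(size=max_n)
    for n in range(peek_every, max_n + 1, peek_every):
        if stats.ttest_ind(a[:n], b[:n]).pvalue < 0.05:
            false_positives += 1
            break

print(f"False-positive rate with repeated peeking: {false_positives / n_experiments:.0%}")
# A single fixed-sample test would be wrong about 5% of the time; peeking and
# stopping at the first significant result pushes the rate well above that.
```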

The problems associated with modern psychology are longstanding and cultural, with researchers, reviewers, editors, journals and news media all prioritizing and benefiting from the quest for novelty. This systemic bias, coupled with minimal agreement on fundamental principles in certain areas of psychology, means questionable research practices can flourish – consciously or unconsciously. Large-scale replication projects will not address these cultural problems, and may even exacerbate them by presenting replication as something special that we use to target the unbelievable. Replication – whether judged failed or successful – is a fundamental aspect of normal science, and it needs to be both more common and more valued by psychologists and psychology journals.


Keith Laws is professor of cognitive neuropsychology in the School of Life and Medical Sciences at the University of Hertfordshire. He completed a PhD at the Department of Experimental Psychology, University of Cambridge, and is the author of more than 100 papers and a recent book, Category-Specificity: Evidence for Modularity of Mind. His research focuses on cognitive function in a variety of disorders, including Alzheimer’s disease, schizophrenia and obsessive-compulsive disorder.


1 Comment
Bryan

Can this also be a link to the publish or perish attitude towards funding? I’ve seen people under pressure just to publish articles otherwise their funding is taken away. Maybe we do need to have a good look at ourselves and that replications should be required and taken more seriously. I know I’d trust something more if the effect could be found in more than one study, but they don’t get the funding, it’s always something new and different that needs to be the latest article. We MUST go back and give ourselves a good solid foundation to build on,…