
Can Greater Transparency Lead to Better Social Science?

September 25, 2014

This article, a guest post by political scientists Mike Findley, Nathan Jensen, Edmund Malesky and Tom Pepinsky, originally appeared on The Washington Post’s Monkey Cage blog under the title “Can greater transparency lead to better social science?” It is republished here with permission from the Monkey Cage editors.

***

Mind if we drop in and make things a bit clearer? (Photo of extreme window cleaning at the Spinnaker Tower, Portsmouth, UK © Graham Robson and licensed for reuse under this Creative Commons Licence)


The distinguishing feature of the scientific enterprise — what makes it different from art, or rhetoric — is that its standards and methodologies are public, contested, and replicable. This applies as much to the social sciences as it does to the hard or natural sciences. Scientific progress requires that scholars articulate their arguments, describe their methodologies, and produce the evidence that they use, and that others participate in this endeavor by questioning and critiquing each. This requires transparency. If an investigator is sufficiently transparent about her procedures, then anyone else should be able to verify the investigator’s work by following her exact steps.

But many social scientists and natural scientists alike have come to believe that the rigors and realities of the scientific publication process are actually standing in the way of scientific progress. Reviewers and editors of academic books and journals tend to privilege interesting or provocative findings, and are unable to witness the entire history of a research project. It stands to reason, then, that scholars may think strategically about what editors will want, and make particular design choices at the expense of others that could be more scientifically sound.

Moreover, scientists in all fields are often rewarded for producing publications, not for producing knowledge. Taken together, this means that “boring” findings, or findings that fail to support an author’s preferred hypotheses, are unlikely to be published — the so-called “file drawer problem.” More perniciously, it can incentivize scholars to hide known problems in their research, or even encourage outright fraud, as evinced by the recent cases of psychologist Diederik Stapel and acoustician Peter Chen.

The result is that scientists may have created an incentive system that stands in the way of producing credible science. It is very difficult to publish “non-findings” or findings of no effect or relationship, even though these are absolutely central to scientific progress in any field. And those findings that do survive the perilous peer review process may be of dubious validity.

Political scientists have been at the forefront of recent debates about transparency and reform in scientific research, and have accumulated compelling evidence that, at least in the social sciences, the publication process is distorting scientific progress (see e.g. here and here). Recently, Brendan Nyhan proposed a set of reforms that, if implemented, could ameliorate these problems. His recommendations include pre-registration of research plans, encouraging the publication of non-results and replication studies, and others (see the full white paper for more).

In cooperation with the editors of the journal Comparative Political Studies, we are implementing some of what Nyhan recommends. The core of our idea is that for one special issue of this journal, articles will be judged based on reviewers’ evaluations of what authors intend to do rather than what they report as their findings. Some authors will submit manuscripts with all mention of the results eliminated, which means that they will have to justify their theories and methods without reference to what they have found. Other authors will submit manuscripts with full descriptions of research projects that have yet to be executed, so the results are not even known to the authors themselves. In both cases, reviewers and editors must judge manuscripts solely on the coherence of their theories, the quality of their design, the appropriateness of their empirical methods, and the importance of their research question.

In an ideal world, embracing transparency in this fashion would eliminate the pernicious incentives that we identified above. Authors should have no incentive to manipulate their results (after all, if you have a guarantee that your findings will be published anyway, what does it matter what you find?). And if there are problems with their methodological choices or theoretical expectations, they can be identified ex ante rather than justified ex post.

Are there any possible costs? Skeptics worry that pre-registration will discourage creativity and discovery, that it will create a kind of science police, and that it will flood journals with null results, among other concerns.




Some of these worries are more reasonable than others. Our special issue will guard against frivolous submissions of trivial non-findings because there is no incentive to submit a frivolous manuscript in the first place. Only plausible arguments with plausible research designs have a chance at surviving the review process, and to the extent that reviewers view their designs as appropriate, we should welcome them even if they do produce inconsistent or null findings. (The added benefit is that there will be no temptation to justify frivolous findings just because they are findings.)

The worry that pre-registration and result-free peer review will produce a science police is a non-issue, from our perspective. Scientific progress requires community policing. Of course, the terms of that policing must be subject to debate, but the benefit of our special issue is that we will be able to see what happens when the model is actually implemented.

However, other worries are perhaps more reasonable. For one, we suspect that our special issue will attract mostly quantitative research on well-defined issues in political science. Exploratory, qualitative, historical, and case-based research is much harder to present in a results-free manner, and perhaps impossible to pre-register. This problem is not unique to the social sciences: case studies, such as those published in the New England Journal of Medicine as case records, could almost certainly never be pre-registered, nor could their results be blinded. Indeed, NEJM case studies are important precisely because they illustrate the complexity and uncertainty of actual medical practice. Our hunch is that the model of peer review we are proposing is more difficult in new areas of research than in more mature fields of inquiry. We hope, nonetheless, that the special issue may ignite discussion of how the principle of transparency could be applied in other areas of social science.

We also recognize that pre-registration may commit scholars to executing research plans that they quickly learn to be infeasible when implementing them. Our hope is that flexible registration procedures and the guarantee of publication will encourage scholars to report how reality forced them to change their approach, shedding light on actual scientific practice rather than disguising it.

These procedures also cannot eliminate all of the problems we identified above. We will still depend on authors to faithfully execute their research plans, and not to misreport their findings. For scholars working in established fields, the temptation to make findings consistent with existing results will still exist. So too will the temptation to produce novel findings. Results-free peer review and pre-registration should help to align scholars’ incentives with those of the scientific community at large, but complete transparency is impossible to achieve.

Rather than debate the costs and benefits of transparency-enhancing measures in abstract terms, though, our special issue will gather some evidence about how the process actually looks. And we pledge to describe as fully and transparently as possible all of the challenges we face, expected and unexpected, in implementing these procedures. We are grateful that a prominent outlet in political science, Comparative Political Studies, is helping social scientists to explore this issue by devoting a full issue to this idea.

We’ve received a terrific response from a number of senior and junior scholars in political science. Given some of the recent attention to the need for greater transparency, we will continue to accept submissions for the special issue through October 15. To submit a proposal, or to learn more about the special issue, you may e-mail the editors at transparency@ipdutexas.org.

