Announcements

NAS Takes Detailed Look at Reproducibility and Replicability

September 23, 2019

This Tuesday at 9 a.m., the National Academies of Sciences, Engineering, and Medicine will host a symposium in response to its recent report, Reproducibility and Replicability in Science. The 200-page report attempts to define reproducibility and replicability, explore the issues surrounding them, and investigate how they might affect public confidence in science.

Both the academic and popular media have voiced concerns about reproducibility and replicability in recent years. As The Atlantic noted for one discipline, “Psychology’s Replication Crisis Is Running Out of Excuses.” The attention isn’t a bad thing: reproducibility and replicability deserve the spotlight now more than ever, as new forums and media for data collection emerge and research methods are rapidly transformed by new technologies.

Scientific validity matters for its own sake, as a matter of knowing the world. But it matters even more when scientific studies become the basis of policy or otherwise directly affect human affairs and well-being. Think of products that were mis-marketed on the strength of flawed research (cigarettes), or legislation based on inconsistent findings. You most certainly wouldn’t want to ban night flights because one study of four pilots found that they became exceptionally tired between 2 and 3 a.m. First and foremost, the methodology would be questionable. A sample size of four? More importantly, individual studies that have not been replicated (or that prove non-replicable) are suspect vessels for informing legislation.

In seeking to improve the reproducibility and replicability of studies, the report begins by defining the two: “We define reproducibility to mean computational reproducibility—obtaining consistent computational results using the same input data, computational steps, methods, and code, and conditions of analysis; and replicability to mean obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data.”
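To make the computational half of that definition concrete, here is a minimal sketch of what a reproducibility check might look like in practice. The file name study_data.csv, its “condition” and “score” columns, and the analysis itself are hypothetical illustrations rather than anything drawn from the report; the point is only that the same input data, run through the same computational steps and code, should yield bit-for-bit identical results.

```python
import csv
import hashlib
import statistics
from collections import defaultdict


def run_analysis(csv_path):
    """The 'computational steps': read the data and compute a mean score per condition.

    Hypothetical example analysis; the CSV layout is assumed, not from the report.
    """
    groups = defaultdict(list)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            groups[row["condition"]].append(float(row["score"]))
    # Sort the conditions so the output is fully deterministic.
    return {condition: statistics.mean(scores) for condition, scores in sorted(groups.items())}


def fingerprint(results):
    """Hash the results so two independent runs can be compared exactly."""
    return hashlib.sha256(repr(results).encode("utf-8")).hexdigest()


if __name__ == "__main__":
    # Same input data, same code, same conditions of analysis: the report's
    # notion of computational reproducibility implies the fingerprints match.
    first = fingerprint(run_analysis("study_data.csv"))
    second = fingerprint(run_analysis("study_data.csv"))
    print("computationally reproducible:", first == second)
```

Replicability, by contrast, cannot be checked by re-running code at all: it requires a new study, with its own data, arriving at consistent results.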

The report offers researchers, academic journals, funding agencies, science foundations, and others procedures for ensuring the reproducibility and replicability of research. The report advises: “Researchers should take care to estimate and explain the uncertainty inherent in their results, to make proper use of statistical methods, and to describe their methods and data in a clear, accurate, and complete way.”

The inability to repeat a study can have widespread consequences: public confidence in the sciences may be undermined. Yet when a scientific effort fails to independently confirm the computations or results of a previous study, some see a lack of rigor, while others argue that such failures can presage new discovery.

The full report, Reproducibility and Replicability in Science, is available from the National Academies.

Augustus Wachbrit (or, if you’re intimidated by his three-syllable name, Gus) is the Social Science Communications Intern at SAGE Publishing. He assists in the creation, curation, revision, and distribution of various forms of written content primarily for Social Science Space and Method Space. He is studying Philosophy and English at California Lutheran University, where he is a research fellow and department assistant. If you’re likely to find him anywhere, he’ll be studying from a textbook, writing (either academically or creatively), exercising, or defying all odds and doing all these things at once.
