
The Research on Communicating Science in a Post-Truth Era

December 14, 2016

Truth seems to be an increasingly flexible concept in politics. At least that’s the impression Oxford Dictionaries gave recently, as it declared “post-truth” the 2016 Word of the Year. What happens when decisions are based on misleading or blatantly wrong information? The answer is quite simple: our airplanes would be less safe, our medical treatments less effective, our economy less competitive globally, and on and on.

Many scientists and science communicators have grappled with disregard for, or inappropriate use of, scientific evidence for years – especially around contentious issues like the causes of global warming, or the benefits of vaccinating children. A long-debunked study on links between vaccinations and autism, for instance, cost the researcher his medical license but continues to keep vaccination rates lower than they should be.


This article by Andrew Maynard and Dietram A. Scheufele originally appeared at The Conversation, a Social Science Space partner site, under the title “What does research say about how to effectively communicate about science?”

Only recently, however, have people begun to think systematically about what actually works to promote better public discourse and decision-making around what is sometimes controversial science. Of course, scientists would like to rely on evidence generated by research to gain insights into how to most effectively convey to others what they know and do.

As it turns out, the science on how to best communicate science across different issues, social settings and audiences has not led to easy-to-follow, concrete recommendations.

About a year ago, the National Academies of Sciences, Engineering and Medicine brought together a diverse group of experts and practitioners to address this gap between research and practice. The goal was to apply scientific thinking to the process of how we go about communicating science effectively. Both of us were a part of this group (with Dietram as the vice chair).

The public draft of the group’s findings – “Communicating Science Effectively: A Research Agenda” – has just been published. In it, we take a hard look at what effective science communication means and why it’s important; what makes it so challenging – especially where the science is uncertain or contested; and how researchers and science communicators can increase our knowledge of what works, and under what conditions.

Evidence for effective approaches
As we discovered, effective science communication – including listening to and engaging with audiences – is particularly complex, and far from simple to study. It’s highly dependent on what is being communicated, its relevance to who’s participating in the conversation and the social and media dynamic around the issues being addressed (especially if those issues or their policy implications are contentious). But it also depends on what people feel and believe is right and the societal or political contexts within which communication and engagement occur. And this makes getting it right and deriving lessons that can be applied across issues and contexts particularly challenging.

Because of this complexity, the practice of science communication (and there are many great practitioners) is currently more of an art than a science. Good communicators – whether reporters, bloggers, scientists or people active on social media and platforms like YouTube – typically learn from others, or through professional training, and often through trial and error. Unfortunately, the social sciences haven’t provided science communicators with concrete, evidence-based guidance on how to communicate more effectively.

Two earlier NAS meetings identified how diverse the areas of expertise are when it comes to research on science communication, spanning behavioral economics and sociology along with media and communication studies. Those meetings also began to map out what we do and don’t know about what works.

For instance, it’s becoming increasingly clear that the “deficit model” of science communication – the assumption that if we just “fill people up” with science knowledge and understanding, they’ll become increasingly rational decision-makers – simply does not work. This is not because people are irrational; rather, we all have our own built-in psychologies of how we make sense of information, and how we weigh different factors when making decisions.

We also know all of us are predisposed to accept, reject or interpret information based on a plethora of mental shortcuts, including a tendency to take at face value information that seems to confirm our worldview.

And we know how information is presented, or framed, can have a profound impact on how it is interpreted and used. The power of the “Frankenfood” frame, for example, used with genetically modified foods, has nothing to do with providing new information. Instead, the term subconsciously connects genetically modified organisms to mental concepts we all share – worrisome ideas about scientists creating unnatural organisms with unintended consequences – and raises moral questions about science going too far.

Decisions factor in more than facts
Science communication may involve communicating scientific consensus about, for instance, the benefits and risks of vaccines to patients. Or it may encompass much broader societal debates about the ethical, moral or political questions raised by science.

For example, our ability to edit the genetic code of organisms is developing at breakneck speed. Over the next decade, CRISPR and similar technologies will have a profound impact on our lives, from how we modify plants and animals and control disease, to how we produce our food, and even how we change our own genetic code as human beings.

But it will also present all of us with questions that cannot be answered with science alone. What does it mean to be human, for instance? Is it ethical to edit the genome of unborn embryos? If people involved in those decisions don’t have the opportunity to grasp the evidence-informed implications of the technology and make informed choices about its development and use, the future becomes little more than a lottery.

For those communicating the science, then, the endeavor comes with some degree of responsibility. Even deciding what information to share, and how to share it, involves personal values, beliefs and perspectives, and can potentially have far-reaching consequences.

There’s an especially high level of ethical responsibility associated with communication designed to influence opinions, behavior and actions. Scientists are well equipped to document the public health risks of lowered vaccination rates, for example. The question of whether we should mandate vaccinations or remove belief-based exemptions, however, is an inherently political one that scientists alone cannot answer.

Mapping out a better way
At some level, all science communication has embedded values. Information always comes wrapped in a complex skein of purpose and intent – even when presented as impartial scientific facts. Despite, or maybe because of, this complexity, there remains a need to develop a stronger empirical foundation for effective communication of and about science.

Addressing this, the National Academies draft report makes an extensive set of recommendations. A few in particular stand out:

  • Use a systems approach to guide science communication. In other words, recognize that science communication is part of a larger network of information and influences that affect what people and organizations think and do.
  • Assess the effectiveness of science communication. Yes, researchers try, but often we still engage in communication first and evaluate later. Better to design the best approach to communication based on empirical insights about both audiences and contexts. Very often, the technical risks that scientists think must be communicated have nothing to do with the hopes or concerns public audiences have.
  • Get better at meaningful engagement between scientists and others to enable that “honest, bidirectional dialogue” about the promises and pitfalls of science that our committee chair Alan Leshner and others have called for.
  • Consider social media’s impact – positive and negative.
  • Work toward better understanding when and how to communicate science around issues that are contentious, or potentially so.

Addressing these and other areas is going to take focused research efforts that draw on expertise across many different areas. It’s going to need strategic and serious investment in the “science” of science communication. It will also demand much greater engagement and collaboration between those who study science communication and those who actually do it. And it’ll require serious thinking about why we communicate science, and how we can work respectfully with audiences to ensure that the science we do communicate about is of value to society.

This will not be easy. But the alternative – slipping further into a post-truth world where disdain for evidence creates risks that could be avoided – gives us little option but to dig deeper into the science of science communication, so that science and evidence are more effectively incorporated into the decisions people make.


Andrew Maynard is a professor in the School for the Future of Innovation in Society at Arizona State University, and director of the Risk Innovation Lab. His research and professional activities focus on risk innovation, and the responsible development and use of emerging technologies. Dietram A. Scheufele is the John E. Ross Professor in Science Communication and Vilas Distinguished Achievement Professor at the University of Wisconsin-Madison and in the Morgridge Institute for Research. Since 2013, he’s also held an honorary professorship at the Dresden University of Technology in Germany.


As professional problem solvers, we need facts, data and the best evidence to guide action. Paradoxically, in the discussion of popular science, science education and the use or misuse of facts in “decision making” we have, effectively no research on our dependent variables – what ever they may be. Let take the simple statement: “Scientists must use all reasonable vehicles…” How is “reasonable” to be defined and measured? By whom and over what period of time? Another example: “There is strength in numbers, and more than 2,300 scientists have signed an open letter” OK, where are the experimental facts to… Read more »