When Science and Politics Collide: Panelists Respond
Earlier this month Social Science Space and the American Academy of Political and Social Science held their debut joint webinar, titled When Politics and Science Collide. The event featured moderator Liz Suhay of American University’s School of Public Affairs, along with Dan Kahan of Yale Law School, who explained how what people “believe” about climate change doesn’t reflect what they know but rather expresses who they are, and Francis Shen of the University of Minnesota Law School, who discussed how neuroscience moves from the lab to courts and legislatures, encountering obstacles along the way. Suhay is a co-editor of the March issue of The ANNALS of the American Academy of Political and Social Science, titled “The Politics of Science: Political Values and the Production, Communication, and Reception of Scientific Knowledge,” an issue that included articles by Kahan and Shen.
The recorded webinar is available above, including a lively question-and-answer session that followed the panelists’ prepared presentations. Not all the questions could be addressed in the hourlong session, so professors Shen and Kahan graciously agreed to answer some of the remaining queries below. Kahan’s answers also appear on the blog of the Cultural Cognition Project at Yale.
***
How do you reconcile the fact that left-wing/educated individuals accept scientific evidence about climate change yet reject vaccinations?
Asked specifically of Dan M. Kahan: Have you looked at GMOs or vaccines and seen similar results from the left that you’ve seen on the right?
The scientific literature suggests that awareness of a given bias in one or another direction tends to reduce, at least somewhat, the effect of that bias on the individual holding it (as they take it into account). Is there any evidence of this effect in the contexts discussed today?
Francis X. Shen: This is a timely question, as there is presently an active conversation in legal circles about implicit bias. Scholars and practitioners from law and science disciplines are collaborating to think about the nature of the issue and ways to address it. The issue of implicit racial bias has been of great interest in this dialogue, and neuroscientists have chimed in as well. There is evidence, albeit still preliminary in many regards, that in some contexts certain “de-biasing” techniques could be effective.
But there remain many unanswered questions, both scientifically and legally. For instance, who gets to decide, and on what grounds, which “biases” are morally or legally problematic? It will be interesting to see how these developments play out in the years and decades to come.
Dan M. Kahan: I put the first and second together because my answer to the first is based on the second.
There’s no need to “reconcile the fact that left-wing/educated individuals accept scientific evidence about climate change yet reject vaccinations” because it’s not true!
Same for the claim that fear of genetically modified foods is somehow connected to a left-leaning political orientation, or to a right-leaning one, for that matter.
The media and blogosphere grossly overstate the number of risk issues on which we see the sort of polarization that we do on climate change along with a handful of other issues (e.g., fracking, nuclear power, the HPV vaccine [at least at one time; not sure anymore]).
Consider these responses from a large, nationally representative sample, surveyed last summer:
I call the survey item here the “industrial strength risk perception measure,” or ISRPM. There’s lots of research showing that responses to the ISRPM correlate super highly with the responses people give to more specific questions about the identified risk sources (e.g., “is the earth heating up?” or “are humans causing global temperatures to rise?” in the case of the “global warming” ISRPM) and even with behavior with respect to personal risk-taking (at least if the putative risk source is one they are familiar with). So it’s an economical way to look at variance.
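To make the mechanics concrete, here is a minimal sketch, using simulated data and invented variable names (not the actual CCP survey data), of the kind of check that justifies treating a single ISRPM item as a stand-in for more specific belief questions:

# Minimal sketch with simulated data (invented names, not the actual CCP survey):
# check how strongly a single "global warming" ISRPM item tracks a more specific belief item.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1000

# ISRPM: "How much risk does global warming pose?" rated on a 0-7 scale.
isrpm_gw = rng.integers(0, 8, size=n).astype(float)

# Specific item: agreement that humans are causing global temperatures to rise,
# simulated here to covary with the ISRPM response plus noise.
belief_human_caused = np.clip(isrpm_gw + rng.normal(0, 1.5, size=n), 0, 7)

df = pd.DataFrame({"isrpm_gw": isrpm_gw, "belief_human_caused": belief_human_caused})

# A high Pearson correlation is what would justify using the single ISRPM item
# as an economical summary of the more specific responses.
print(df["isrpm_gw"].corr(df["belief_human_caused"]))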
You can see that climate change, fracking, and guns are pretty unusual in generating partisan divisions.
Well, here’s childhood vaccines and GM foods:
Definitely not in the class of issues—the small, weird ones, really—that polarize people.
A couple of other things.
First, to put the very tiny influence of political orientations on vaccine risks (and even smaller one on GM foods) in perspective, consider this (from a CCP report on vaccine risk perceptions):
Anyone who sees how tiny these correlations are and still wants to say that there is a meaningful connection between partisanship and either vaccine- or GM food-risk perceptions is making a ridiculous assertion.
Indeed, in my view, they are just piling on in an ugly, ignorant, illiberal form of status competition that degrades public science discourse.
Second, GM food’s ISRPM is higher than that of many other risk sources, it’s true. But that’s consistent with noise: people are all over the map when they respond to the question, and so the average ends up around the middle.
In fact, there’s no meaningful public concern about GM food risks in the general population—for the simple reason that most people have no idea what GM foods are. Serious public opinion surveys show this over and over.
Nonserious ones ignore this and pretend that we can draw inferences from the fact that people who don’t know what GM foods are, when asked if they are worried about them, say, “Oh, yes!” They also say ridiculous things, like that they carefully check for GM ingredients when they shop at the supermarket, even though in fact there are no general GM food labeling requirements in the US.
Some 80 percent of the foods in US supermarkets have GM ingredients. People don’t fear GM foods — they eat them, in prodigious amounts.
It’s worth trying to figure out why so many people have the misimpression that both GM foods and vaccines are matters of significant concern for any meaningful segment of the US population. The answer, I think, is a combination of bad reporting in the media and selective sampling on the part of those who are very interested in these issues and who immerse themselves in the internet enclaves where they are being actively debated.
There are serious dangers, moreover, in exaggerating the general level of concern over these risks and in the gross misconceptions people have about their partisan character.
Some sources to consider in that regard:
Kahan, D.M. A risky science communication environment for vaccines. Science 342, 53-54 (2013).
UPDATE: Since the initial question posited that it was “highly educated” left-leaning people who are concerned about vaccine risks, this is also probably relevant:
How do you think the collision of science and politics you are discussing will (or should?) change the way we educate children in K-12 science and social studies classes?
Considering that individual values and public perception/engagement with government differ widely around the world, to what degree do the speakers’ presentations (broadly related to culture & science policy or law & science policy) reflect political activity or engagement internationally (for individual nations or collaborative international policy)?
Francis X. Shen: Excellent question. The integration of science into policymaking will vary dramatically in light of the institutional context in which that integration takes place. In legal systems that do not employ an adversarial approach (e.g., many legal systems in Europe), expert evidence is handled quite differently. This means that emerging scientific evidence like neuroscience will also be handled differently. There is an active community of European neurolaw scholars exploring these and other related issues, and Europe is investing heavily in brain research. To date, however, there has been very little investigation of parallel developments in Asia and Africa. I hope that we see more international dialogue and collaboration on these issues going forward.
***
How does increasing science literacy – that is, knowledge about the scientific process – serve to influence people’s beliefs about science issues?
Francis X. Shen: In the realm of an emerging science like neuroscience, I think that improving general scientific literacy enables consumers to ask better, more critical questions. In evaluating scientific evidence for purposes of legal proceedings, it is important for judges and attorneys to understand the basic scientific process because the admissibility of expert evidence depends, in part, on how that evidence was generated. There are efforts underway to improve this scientific literacy, and this is a good thing because it puts judges and lawyers in a better position to evaluate the scientific evidence that is being considered for jury consumption.
Beyond the courtroom, I think the same goal should apply for relevant staffers at government administrative bodies and staff members for legislators (both state and federal). In many agencies, of course, this is a core part of their mission. The Food and Drug Administration (FDA), for instance, has significant expertise to evaluate the many drugs and devices that companies want to bring to the consumer market. In legislative chambers, however, there is still much room for improvement.
I think it’s worth noting too that learning about the scientific process is important not just for understanding findings from the natural sciences, but from the social sciences as well. For instance, when a state Department of Education releases a report that says, “Curriculum X raised test scores by 10%,” one ought to look carefully at the methodology used to generate the report. Too often in policy evaluation we are not rigorous enough in our methods, and not humble enough in our conclusions. Improved scientific literacy on legislative staffs would make them better consumers of such reports.
Dan M. Kahan: Where the sorts of dynamics that generate polarization exist, greater science comprehension (measured in any of a variety of ways, including standard science literacy assessments, numeracy tests, and critical reasoning scales) magnifies polarization. The most science-comprehending members of the population are the most polarized on issues like climate change, fracking, guns, etc.
Consider:
Here I’ve plotted in relation to science comprehension (measured with a scale that includes basic science knowledge along with various critical reasoning dispositions) the ISRPM scores of individuals identified by political outlook.
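For readers curious about the mechanics behind a plot like that, here is a rough sketch using simulated data and invented variable names (not the actual survey), showing ISRPM scores against a science-comprehension score, split by political outlook:

# Rough sketch with simulated data (invented names, not the actual survey):
# plot ISRPM scores against science comprehension, separately by political outlook.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n = 500

science_comp = rng.normal(0, 1, size=n)                    # standardized science-comprehension score
outlook = rng.choice(["liberal", "conservative"], size=n)  # simplified two-group political outlook

# Simulate the pattern described in the text: the groups diverge (polarize)
# as science comprehension increases.
slope = np.where(outlook == "liberal", 0.6, -0.6)
isrpm = np.clip(3.5 + slope * science_comp + rng.normal(0, 1, size=n), 0, 7)

fig, ax = plt.subplots()
for group, color in [("liberal", "tab:blue"), ("conservative", "tab:red")]:
    mask = outlook == group
    ax.scatter(science_comp[mask], isrpm[mask], s=10, alpha=0.4, color=color, label=group)
    b, a = np.polyfit(science_comp[mask], isrpm[mask], 1)  # slope, intercept of a simple trend line
    xs = np.linspace(science_comp.min(), science_comp.max(), 100)
    ax.plot(xs, a + b * xs, color=color)

ax.set_xlabel("Science comprehension (standardized)")
ax.set_ylabel("Global warming ISRPM (0-7)")
ax.legend()
plt.show()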
As mentioned above, partisan polarization on risk issues is the exception, not the rule.
But where it exists, it gets worse as people become better at making sense of scientific evidence.
Why?
Because now and again, for one reason or another, disputes that admit of scientific inquiry become entangled in antagonistic cultural meanings. When that happens, positions on them become badges of membership in, and loyalty to, cultural groups.
At that point, individuals’ personal stake in protecting their status in their group will exceed their personal stake in “getting the right answer.” Accordingly, they will then use their intelligence to form and persist in the positions that signify their group membership.
The entanglement of group identity in risks and other facts that admit of scientific investigation is a kind of pollution in the science communication environment. It disables the faculties that people normally use with great success to figure out what is known by science.
Improving science literacy won’t, unfortunately, clean up our science communication environment.
On the contrary, we need to clean up our science communication environment so that we can get the full value of the science literacy that our citizens possess.
Some sources:
Kahan, D. Why we are poles apart on climate change. Nature 488, 255 (2012).
***
Asked specifically of Francis X. Shen: As a researcher and a citizen, do you have any policy recommendations? In your opinion, what are the productive directions for policy?
Francis X. Shen: While some policymaking will have to remain wait-and-see until neuroscience provides a better understanding of our brains (and how to improve them), I think we ought to do the following three things right off the bat.
First, we ought to recognize that mental health requires brain health, and we should not stigmatize those whose brain health is suffering. Rather, we ought to more proactively promote brain health. Thanks to many folks across the country, we’re already starting to move in this direction. But we can do more. And I think an important part of this policy work is reshaping the policy narrative as one of brain health. I hope that my work contributes in some small way to these efforts in the years to come.
Second, the federal government should expand its support for interdisciplinary funding to address the need both for better science and for better integration of that science in applied settings. There are some instances where this is happening, but I would like to see more new grants developed that require combining cutting-edge science with applied law and policy goals. This should be a complement to, not a replacement of, investments in basic science.
Third, and specific to the criminal justice system, we ought to focus not on how neuroscience might do away with “guilt” (which can lead to political chasms, as we found in our study). Rather, we should focus on treatment by asking: what can advances in brain science do to help the legal system improve its rehabilitation of offenders? We can, and should, do better in the design and implementation of interventions for criminal offenders.