Communication

Harvesting the Opportunities in Psychology, Open Science and Government

May 20, 2016

[Ed. The views expressed are the author’s and do not necessarily represent the official views of the United States government or the government of the District of Columbia.]

There is a movement afoot to weave psychological science into the fabric of government. And by using the words “weave” and “fabric,” I mean to signal something unique: an attempt, now emerging from within government itself, to integrate the insights and experimental methods from the psychological sciences directly into day-to-day governance.

David Yokum

David Yokum will speak on the afternoon of Friday, May 27, at the 2016 APS Annual Convention in Chicago in an invited symposium on “Science of Behavior Change: Finding Mechanisms of Change in the Laboratory and the Field.”

My work at the federal level, for example, has been as part of the White House’s recently created Social & Behavioral Sciences Team (SBST). The SBST is a multidisciplinary group of applied behavioral scientists, most of whom are drawn from academia or other research entities and serve a fellowship tour of duty directly within government. (For instance, I was previously at the University of Arizona, studying in the psychology department and law school.) In the opening run of work, the SBST and partner agencies completed more than 15 randomized field experiments, designing and testing the impact of behaviorally informed interventions in domains spanning health, education, finance, and government operations (see the 2015 SBST report for details).

One of the projects examined the collection of a business fee (known as the industrial funding fee, or IFF) that relied on payers to self-report how much they owed. A psychological study by Shu et al. (2012) found that requiring people to sign a guarantee that the information on a form is correct before — rather than after — completing the form made accountability more salient and improved the accuracy of self-reported information. We applied and tested this insight by randomly assigning whether or not a signature box appeared at the top of the IFF online reporting form. The median self-reported sales amount was $445 higher (p < .05, 95% CI [$87, $803]) for payers who signed up front than for those who did not sign at all. This subtle, virtually cost-free intervention resulted in an additional $1.6 million in collections in a single quarter.
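
To see the flavor of such an analysis, here is a minimal sketch (simulated placeholder data, not the actual IFF records or the team’s exact code) of estimating a difference in medians between two randomized arms, with a bootstrap 95% confidence interval:

```python
# A minimal sketch of analyzing a two-arm field experiment on a skewed
# dollar outcome. All data below are simulated placeholders, not IFF records.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical self-reported sales for the two randomized arms.
control = rng.lognormal(mean=8.5, sigma=1.2, size=5000)  # no signature box
treated = rng.lognormal(mean=8.6, sigma=1.2, size=5000)  # sign-first form

def median_diff(a, b):
    return np.median(a) - np.median(b)

point_estimate = median_diff(treated, control)

# Nonparametric bootstrap: resample each arm with replacement and
# recompute the median difference to approximate its sampling distribution.
boot = np.array([
    median_diff(rng.choice(treated, treated.size, replace=True),
                rng.choice(control, control.size, replace=True))
    for _ in range(10_000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"median difference ${point_estimate:,.0f}, 95% CI [${lo:,.0f}, ${hi:,.0f}]")
```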

Psychological Scientists, Please Stand Up

On September 15, 2015, President Obama issued Executive Order 13707, “Using Behavioral Science Insights to Better Serve the American People.” The executive order emphasizes the applicability of psychological science to governance and directs agencies to “develop strategies for applying behavioral science insights to programs and, where possible, rigorously test and evaluate the impact of these insights.” Agencies are to “recruit behavioral science experts to join the Federal Government” and “strengthen agency relationships with the research community.”

This article by David Yokum first appeared in the April 2016 edition of Observer, the magazine of the Association for Psychological Science.

We psychological scientists are now being explicitly invited to — or perhaps more accurately, we’ve elbowed our way into — a prime spot on the governance stage. How will we respond to this opportunity?

There is momentum already. Applied research and public advocacy are in the bones of the psychology profession, running at least from Hugo Münsterberg’s work at the turn of the 20th century through the recent Perspectives on Psychological Science special issue imagining a Council of Psychological Science Advisors. The reasons why psychologists should and must engage have been persuasively argued. The APS Presidential Column series alone contains many jewels of thinking and leadership on how the profession should meet the opportunity.

I think psychological science has another unique advantage, one related to its leadership in developing and adopting open science practices. (See APS Executive Director Emeritus Alan Kraut’s December 2015 guest presidential column on open science efforts.) As it turns out, these practices might be the key to bypassing — or harnessing — one obstacle to the uptake of science into policy: politics.

Harnessing Open Science as Political Process

Let me spill some boring beans: Applied research, especially in government, involves politics. Shocking, right?

Importantly though, applied research must and should involve politics, in particular ways. For example, how big of an effect is needed to make an intervention worthwhile? How precise does our estimate of an effect need to be before we act on it? The answers depend on value judgments, such as deciding what counts as a cost or a benefit in cost–benefit analyses and balancing the inductive risks of accepting a false hypothesis or rejecting a true hypothesis. Government provides a process for coordinating and expressing such value-laden decisions. This is an imperfect process, to be sure, but there are textbook guideposts for how democratically elected or appointed officials decide (or delegate and supervise) the courses of government action, doing so in transparent ways that empower voters to hold them accountable.
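
To make that concrete, consider a toy break-even calculation. Every number below is invented; the point is that the threshold for a “worthwhile” effect comes from value judgments about costs and benefits, not from the data alone:

```python
# A toy break-even calculation illustrating that "how big an effect is
# worthwhile?" is answered by value judgments, not by statistics alone.
# All numbers are invented for illustration.
cost_per_recipient = 0.50     # e.g., redesigning and mailing a form
value_per_conversion = 40.00  # dollars recovered per additional compliant payer

# The intervention pays for itself only if it lifts the outcome rate by
# at least cost / value; any smaller effect is real but not worthwhile.
break_even_lift = cost_per_recipient / value_per_conversion
print(f"break-even lift: {break_even_lift:.2%}")  # 1.25%
```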

My pitch here is that the best practices from the open science movement, particularly developing and preregistering an analysis plan (before looking at the data), stand to do double duty as best practices for successfully engaging in evidence-informed governance. The key is not to remove politics from the process — neither possible nor desirable — but rather to shift when and how the value judgments occur. If we front-load the discussion and use the analysis plan to build and lock in consensus about the methods, we may be able to de-politicize reactions to the result. The dialogue manages expectations and, most importantly, builds buy-in for a method that can be controlled, rather than for a single hoped-for result that might or might not materialize.
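
As a purely illustrative sketch (mine, not SBST tooling), an analysis plan can even be locked in as code and archived before anyone sees outcome data. The outcome name, test, and decision rule below are all hypothetical:

```python
# A minimal sketch of a preregistered analysis plan expressed as code.
# The plan is written, agreed to, and archived before outcome data exist;
# the outcome, test, and threshold here are all hypothetical.
from scipy import stats

ANALYSIS_PLAN = {
    "outcome": "reported_sales",         # primary outcome, named in advance
    "test": "two-sided Mann-Whitney U",  # robust to skewed dollar amounts
    "alpha": 0.05,                       # significance threshold agreed up front
}

def prespecified_analysis(treated, control):
    """Run exactly the locked-in test; no post hoc alternatives."""
    u_stat, p_value = stats.mannwhitneyu(treated, control,
                                         alternative="two-sided")
    decision = "act" if p_value < ANALYSIS_PLAN["alpha"] else "iterate"
    return {"U": u_stat, "p": p_value, "decision": decision}
```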

I’ve lived this dynamic with government partners. We tested an intervention with the Center for Program Integrity (CPI), for example, that failed to work as expected, yielding a null result. When evidence fails to support an idea, there is a risk that people will hunker down and reactively defend it: pick apart why the evidence is irrelevant or inaccurate, or rejigger the analyses until they spit out the preconceived result. But in this case, the research team included CPI employees who were closely involved in defining the problem and approving the methodological details. We discussed at length what the study might uncover, including the possibility of a null result and what exactly, given the statistical power, such a result would mean. We adjusted the intervention to fit within regulatory constraints. We agreed ahead of time how the data would be analyzed. It took a lot of legwork. But as a consequence, the null result pivoted naturally, not to defensiveness or dismissiveness, but to brainstorming about how to innovate further to solve the problem. That pivot was one of the possibilities anticipated from the outset. (New interventions are now being tested in the field.)
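
That conversation about power can itself be made concrete. Here is a minimal sketch, with hypothetical sample sizes rather than CPI’s, of computing the minimum detectable effect, which tells partners in advance what a null result can and cannot rule out:

```python
# Minimum detectable effect (MDE) for a two-arm experiment, computed
# before launch. Sample sizes here are hypothetical, not CPI's.
from statsmodels.stats.power import tt_ind_solve_power

mde = tt_ind_solve_power(
    effect_size=None,  # leave blank to solve for the detectable effect
    nobs1=2000,        # per-arm sample size (hypothetical)
    alpha=0.05,        # significance level
    power=0.80,        # conventional 80% power
    ratio=1.0,         # equal-sized arms
    alternative="two-sided",
)
print(f"minimum detectable effect: {mde:.3f} standard deviations")
# A null result speaks only to effects at least this large; smaller true
# effects could easily have gone undetected.
```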

Imagine a political debate about whether to evaluate a particular program and, if so, what effect size (measured at what precision) would be required to fund the program at scale. Imagine town halls where stakeholders provided input on what outcomes should be measured or where they decided ahead of time what they would need to see in order to support or reject a proposal. There are many details to unpack here, but the arguments and methods behind the open science movement are ripe to be harnessed and adapted for purposes of driving evidence-informed government.

So What Now?

This is a call to action. Psychological scientists have a spot on the governance stage, and as a profession we need to mobilize to meet the opportunity, to fulfill the responsibility.

There are many things to be done — keep an ear open for more discussion at the 2016 APS Annual Convention — but I’ll end with a request that any psychological scientist can act on right away: Think locally. National problems with federal solutions receive a lot of attention, but the reality is that state and local governments have many more touch points with people. There is enormous opportunity for psychological science to improve governance at these citizen frontlines, and you’ll have easier access to local government practitioners. So roll up your sleeves and attend a town hall meeting, or visit city hall, to start a dialogue about how psychological science can improve governance directly in your community.

P.S. To stay in the loop on SBST activities, visit the SBST website.

***
References
Shu, L. L., Mazar, N., Gino, F., Ariely, D., & Bazerman, M. H. (2012). Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end. Proceedings of the National Academy of Sciences of the United States of America, 109, 15197–15200. doi:10.1073/pnas.1209746109


David Yokum is a behavioral scientist and Fellow on the White House Social & Behavioral Sciences Team, as well as director of the General Services Administration's Office of Evaluation Sciences. Yokum earned a Ph.D. in psychology, with dual specialization in cognition & neural systems and psychology, policy, & law, at the University of Arizona; a law degree from the James E. Rogers College of Law, where he also served as a writing fellow, instructor, and guest lecturer; and a master's in bioethics & medical humanities from the University of South Florida.
