Welcoming the American ‘Nudge Unit’

November 11, 2015

Members of the Social and Behavioral Sciences Team visit the Oval Office to brief President Barack Obama on their work. (Photo: Pete Souza/White House)

Back in September, President Barack Obama signed an executive order that marked a major turning point in the role that behavioral science plays in helping the federal government achieve policy goals.

The order, which directs federal agencies to incorporate insights from behavioral science into their programs, may turn out to be one of the most important acts of his second term. That’s certainly the view of Cass Sunstein, a Harvard legal scholar and coauthor of the bestselling book on behavioral economics, Nudge.

Considering that during the last year alone Obama got Iran to agree to limit its nuclear program and inked the biggest trade deal in decades, that’s a high bar to meet. But in fact we’re already beginning to see why this may turn out to be true.

This article by Dave Nussbaum originally appeared at The Conversation, a Social Science Space partner site, under the title “How the science of human behavior is beginning to reshape the US government”

Common sense pays

Obama’s executive order coincided with the release of the inaugural report by the White House’s one-year-old Social and Behavioral Sciences Team (SBST). The report documents the successes (and failures) of the team’s initial efforts to transform policymaking through a better understanding of how and why people act as they do.

It may seem like common sense that when you’re designing programs meant to serve people, you ought to include insights from experts in human behavior. But common sense doesn’t always come easy. Although behavioral insights are commonly used by companies in the private sector, introducing them into the federal government – particularly in a systematic and scientific way – can be very difficult.

But as Sunstein correctly points out, these insights have the potential to reshape government, making it more efficient and effective, increasing citizens’ welfare while preserving their ability to make their own choices. That’s the early lesson from the UK’s so-called Nudge Unit, which reports that it has earned back more than 20 times its original investment in two years by improving tax collection, curbing student dropout rates and moving more people off of benefits and into work.

The same thing is now starting to happen on this side of the Atlantic.

Bounded rationality

Until recently, most economists held firmly to a worldview that assumed that people are rational utility maximizers. That is, they always behave rationally and go about their lives making fully informed decisions.

That can be a useful simplification in trying to understand how markets function and how economies work, but people don’t actually behave that way.

When we ignore the limitations of human rationality and the systematic errors those limitations produce, we end up designing policies that are logical but don’t end up working well for the people they’re supposed to serve.

That’s not to say that people are irrational fools, only that human beings have limitations – or as Herb Simon (winner of the 1978 Nobel Prize in Economics) put it: our rationality is “bounded.” Thus significant insights can be gained by exploring the systematic ways that people’s behavior fails to rise to economists’ rational standards.

The behavioral science approach improves the effectiveness of public policy by recognizing these limitations and helping people overcome them, sometimes with approaches as simple as sending a reminder text message, altering the time an email is sent or changing the default setting on a printer from single- to double-sided.

The goal of behaviorally informed policy is to make it easier for people to make good decisions, while preserving their ability to freely choose.

Small tweaks, big results

The White House team, created and led by Maya Shankar, a cognitive neuroscientist, partnered with an array of government agencies including the Departments of Defense, Education and Agriculture, to turn behavioral insights into more effective policy. As she puts it:

It’s not enough to simply design good federal programs. We have to make sure that those programs effectively reach the very people they are designed to serve. Behavioral science teaches us that even small barriers to accessing programs, whether it is a complicated form or burdensome application process, can have disproportionate negative impacts on participation rates.

The trials documented in the report generally aimed to streamline access to existing government programs and improve efficiency. The focus was on projects in which minute, low-cost changes built on very basic psychological concepts could lead to immediate, quantifiable improvements in outcomes and produce large shifts in behavior.

In one such effort, the team worked with the Department of Defense to increase enrollment in a retirement program for service members. The SBST modified emails sent to members who weren’t enrolled, more clearly describing the steps required to sign up and emphasizing the benefits of saving even just a little bit each month. As a result, the number of service members who enrolled in the program increased by 67 percent.

Generally the team tried to identify areas in which there was a breakdown in the effectiveness of policies that could potentially be improved with behavioral insights. And although the interventions were based on existing findings in fields like psychology and behavioral economics, the SBST rigorously evaluated the outcomes using randomized controlled trials, allowing them to evaluate which ones produced their intended effect and how strong those effects actually were.

Other projects were a little more ambitious in the behavioral insights employed, although they still made only minimal tweaks to the way policies were implemented.

For instance, federal vendors – who pay a small fee of 0.75% to the government based on self-reported sales – were asked to sign at the beginning of their declaration form attesting that they were providing accurate information. Compared with vendors who did not sign (the status quo), those who signed reported slightly more sales (US$445 more on average). Although that may seem modest, the intervention was virtually costless and generated $1.59 million in revenue in the third quarter of 2014 alone.

Success through failure

Despite the impressive success of many of the trials in SBST’s first wave of interventions, perhaps even more encouraging were its failures.

Not all behavioral insight-driven interventions will work – that is, after all, why it’s critical to rigorously evaluate them. But how failures are handled will determine the team’s ultimate success. Notably, the SBST’s report was as candid about failures as successes.

One project involved trying to reduce the overprescription of certain drugs by informing doctors that they were prescribing them more than their peers. The technique has been successful in other contexts, such as curbing homeowners’ energy consumption by merely letting them know they used more than their neighbors. But with the doctors it had no discernible effect on prescription rates.

Although it can be tempting (and sometimes politically expedient, particularly in the short term) to highlight successes and sweep failures under the rug, it’s critical that we understand what works and what doesn’t, so that we learn from our failures rather than repeat them.

In some cases, interventions will fail because they’re fundamentally flawed, possibly because what worked in a carefully controlled lab environment gets washed out by the noise of the real world, or possibly because the intervention simply doesn’t work in a given context. In those cases, the interventions should be scrapped or replaced by other approaches.

But in other cases, a failed intervention is just a beginning. Squarely facing failure is the first step toward designing an intervention that works. Researchers can discover what the problems were and what makes the intervention work in some cases but not in others. This sort of learning will not only improve policies, it is also an enormously important contribution to the collaboration with the academic community.

Applying basic research in the real world

Although social psychology has its roots in tackling real-world problems, in recent decades its engagement with public policy has waned and applied work has become less prestigious than basic science.

But the two types of research – rigorously controlled laboratory research and evaluating outcomes in the field – can be symbiotic. There are encouraging signs that social psychologists and other behavioral scientists are moving in that direction.

At the White House, for now, the focus is on tweaking existing programs. As the evidence for the SBST’s programs continues to accumulate, the hope is that behavioral insights become as central in policymakers’ thinking as economic ones, helping us build effective policies from the ground up.

The Social and Behavioral Sciences Team has done an impressive job so far in using small, inexpensive changes to make federal policies better serve citizens.

The psychologist Barry Schwartz, who penned an op-ed in The Atlantic in 2012 calling for a Council of Psychological Advisors, summed it up well when he said: “It’s fantastic to actually have an agency in government who takes psychology seriously. There’s a long way to go before it becomes a sister to the Council of Economic Advisers, but if it proves itself to be helpful, I can imagine it.”


Dave Nussbaum is a social psychologist and adjunct associate professor of behavioral science at the Booth School of Business at the University of Chicago. He serves as the director of communications for the Behavioral Science & Policy Association, and blog editor for the Society for Personality and Social Psychology. Follow him @davenuss79

