New Think Tank Starts Policy Research Critique Series

April 13, 2017

A Germany-based social science think tank has released its first “Second Opinion” publication – on Brexit – as part of a move to review existing research that aims to influence policy decisions.

Founded in 2016 by scholars from Berlin’s Humboldt University, Social Science Works is an international think tank based in Potsdam, Germany. According to its organizers, “our focus is on making social science useful for modern democracies. Our intention is to help to provide policy-makers with the tools to use social science to help make better informed decisions.”

The non-profit project, which was first proposed in 2014 as a way to tap usable knowledge for the support of democracy, is supported by the ZukunftsAgentur Brandenburg and is part of the project Innovationen brauchen Mut (Innovation Needs Courage), paid for by the European Union and the state of Brandenburg.

In that vein, Social Science Works has published What We Can & Can’t Measure in a Brexit Deal, a white paper in which the authors argue that “a Brexit deal cannot be considered outside of the context of the aims of the European Union.” This includes consideration of the rights of European Union citizens inside Britain and of British citizens in the EU; the stability of the re-invented UK and EU; and security issues arising from the new configuration. “As social and political scientists and citizens of the EU and the UK,” the authors conclude, “we have a duty to help to shape how we understand these events.”

When developing the idea for Second Opinions, Social Science Works sought to address a common problem facing decision-makers in a world increasingly dominated by research: ambiguity about the quality and validity of the studies that aim to influence policy decisions. By offering a neutral, scientific assessment of existing research, the product can help improve the credibility of policy-makers and members of civil society who cite research to justify policy initiatives.

In its Second Opinions publications, Social Science Works takes research submitted by decision-makers and evaluates its quality across a set of thematic areas: the content of the research, its underlying assumptions, its research design, its transparency about funding, its acknowledgement and use of existing research, and its coherence.

The research at hand is evaluated in each of these areas against a set of criteria and ultimately receives a final score, translated into a color-numeric “Traffic Light” grade with a corresponding overall recommendation on whether the research should inform decision making. The final evaluation ranges from justifiable and permissible – a “Green Light” – to completely inadmissible because of serious problems in the research – a “Red Light.”

“From all of the policy makers we have talked to, it became very much apparent that a service like this is badly needed in the realm of public decision making,” said Nils Wadt, co-founder of Social Science Works and co-author – with Hans Blokland, Sarah Coughlan, and Patrick Sullivan – of the Brexit Second Opinion.

In the Brexit report, Social Science Works sought not only to add its voice to the debate, but to demonstrate the relevance of a methodology for examining existing research, particularly when it is part of key, ongoing policy debates.

The original paper asks how we can measure the success of a future Brexit deal for the UK. In doing so, it presents four economic tests relating to four thematic areas: public finances and the economy, fairness, openness, and control. The paper makes clear that it does not intend to offer a comprehensive framework for assessing a future Brexit deal, focusing only on economic considerations. As the authors note:

A Successful Brexit: Four Economic Tests was conducted by noted scholars from the UK and as such we have found the quality of the paper to be mostly very high. That said, there were some areas, particularly in the realms of assumptions and definitions, that have tilted its focus to looking at a future Brexit deal as an economic question. Our paper seeks to help to redress this.

In our conclusion, we argue that while A Successful Brexit: Four Economic Tests adds a valuable contribution to the ongoing debate on measuring the success of Brexit, it still omits many important criteria and indicators that go beyond the realm of economic data.
