Impact

Here Are the Blocks You Need to Tell Your Impact Story

September 22, 2020
Impact Evaluation Framework ‘Building Blocks’

The increasing pressure placed upon research organisations of all kinds (not just universities) to demonstrate impact is often passed along to researchers and research managers without the support of an accompanying cultural change within the institution.

With this in mind, Forest Research, the government research agency that supports the United Kingdom’s forestry sector, has committed to promoting a culture of knowledge exchange and interdisciplinarity in its research. This mission stemmed from a 2012 external impact evaluation which recommended directing attention toward helping researchers generate and record impacts. Alongside program activities such as training workshops and small-group conversations, the initiative identified a further challenge: demystifying and operationalizing impact evaluation in a coherent and accessible way that nonetheless allows for diversity, subtlety and uncertainty.

A new evaluation framework: building blocks

To meet this almost contradictory challenge, we developed a new evaluation framework and tested it across multiple projects in a pilot exercise. Working with the draft framework made it possible to produce a dozen full impact case studies from initial observations. Analysis of this exercise indicated that the components of the framework were sufficiently clear and comprehensive for researchers and stakeholders to understand and use in constructing a range of very different narratives.

We envisioned the framework as a set of building blocks which could be combined in multiple ways to tell almost any impact story, each shedding light on the complex interactions between research, policy and practice. In fact, when we gave a demo at the 2019 UK Knowledge Mobilisation Forum, we represented each component with different wooden tiles (fitting for work with a forest research agency!) and then used them to tell two contrasting impact case study narratives, showing how we could manipulate the tiles into different configurations to reveal the relationships between them. The audience seemed to appreciate our main messages of dynamics – in terms of multi-directional interactions unfolding in varying sequences – and heterogeneity – in terms of impact types, causes and stakeholders – as an alternative or complement to conventional linear logic models that tend to dominate the field of impact evaluation.

What are the building blocks?

The framework comprises five types of impact, five broad categories of stakeholders and eight causal factors, along with a set of over-arching questions.

Five impact types

From previous impact evaluations and other research, we have found that providing a range of impact types to consider often makes people feel ‘liberated’ and ‘authorized’ to go beyond the conventionally sought instrumental changes when searching for their impacts. The five impact types are:

1) Instrumental: changes to plans, decisions, behaviors, practices, actions, policies

2) Conceptual: changes to knowledge, awareness, attitudes, emotions

3) Capacity-building: changes to skills and expertise

4) Enduring connectivity: changes to the number and quality of relationships and trust

5) Culture/attitudes: changes to attitudes towards knowledge exchange, and towards research impact itself

There is no fixed hierarchy or linear sequence inherent in this typology, and all have value in their own right, although the last two, process-based impacts, may make other sorts of impacts more likely.

Asking ‘How do we know?’ leads towards the gathering of evidence, including the recognition of telling indicators of change.

This article by Laura Meagher and David Edwards originally appeared on the LSE Impact of Social Sciences Blog as “How to tell an impact story? The building blocks you need” and is reposted under the Creative Commons license (CC BY 3.0).

Five categories of stakeholders

To stimulate further definition (or mapping) for a particular situation, the framework offers five broad categories of stakeholders; these can of course be subdivided further, and stakeholders may influence, as well as be influenced by, the research process and its outputs.

1) Policy-makers: government agencies and regulatory bodies, local, national and international

2) Practitioners: public, private, NGOs

3) Communities: of place or of interest; the general public

4) Researchers: within and beyond the project and the institution

5) Other

Eight causal factors

Because impact generation is a dynamic process, it is important to ask: ‘Why/how did changes occur?’ The framework offers eight causal factors:

1) Problem-framing: Level of importance; tractability of the problem; active negotiation of research questions; appropriateness of research design.

2) Research management: Research culture; integration between disciplines and teams; promotion of research services; planning; strategy.

3) Inputs: Funding; staff capacity and turnover; legacy of previous work; access to equipment and resources.

4) Outputs: Quality and usefulness of content; appropriate format.

5) Dissemination: Targeted and efficient delivery of outputs to users and other audiences.

6) Engagement: Level and quality of interaction with users and other stakeholders; co-production of knowledge; collaboration during design, dissemination and uptake of outputs.

7) Users: Influence of knowledge intermediaries, e.g. ‘champions’ and user groups; incentives and reinforcement to encourage uptake.

8) Context: Societal, political, economic, biophysical, climate and geographical factors.

Reflective questions

The questions underlying the framework are: ‘Who or what changed (and how do we know)?’, ‘Why or how did changes occur?’ and ‘What lessons can be learned?’

The last of these is unpacked into two questions which encourage the framework’s users to step back and think about what lessons have been learned regarding impact identification and generation, so as to plan actions and improve future efforts:

1) What worked? What could (or should) have been done differently?

2) What could (or should) be done in the future?

The framework can be used to stimulate critical reflection by individual researchers or teams, ideally involving end-users and other stakeholders, throughout the project cycle, at the planning, mid-course and final stages.
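
For readers who like to see the structure made explicit, the sketch below models the building blocks as simple data structures in Python. It is purely illustrative: the class names, fields and the example case study are our own invention here, not part of the published framework.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List

    class ImpactType(Enum):
        INSTRUMENTAL = "instrumental"            # plans, decisions, policies
        CONCEPTUAL = "conceptual"                # knowledge, awareness, attitudes
        CAPACITY_BUILDING = "capacity-building"  # skills and expertise
        ENDURING_CONNECTIVITY = "enduring connectivity"  # relationships, trust
        CULTURE_ATTITUDES = "culture/attitudes"  # towards knowledge exchange and impact

    class Stakeholder(Enum):
        POLICY_MAKERS = "policy-makers"
        PRACTITIONERS = "practitioners"
        COMMUNITIES = "communities"
        RESEARCHERS = "researchers"
        OTHER = "other"

    class CausalFactor(Enum):
        PROBLEM_FRAMING = "problem-framing"
        RESEARCH_MANAGEMENT = "research management"
        INPUTS = "inputs"
        OUTPUTS = "outputs"
        DISSEMINATION = "dissemination"
        ENGAGEMENT = "engagement"
        USERS = "users"
        CONTEXT = "context"

    @dataclass
    class Block:
        """One building block: who or what changed, why/how it changed,
        and the evidence answering 'How do we know?'."""
        impact: ImpactType
        stakeholders: List[Stakeholder]
        causes: List[CausalFactor]
        evidence: str  # telling indicators of change

    @dataclass
    class ImpactStory:
        title: str
        blocks: List[Block] = field(default_factory=list)
        lessons: List[str] = field(default_factory=list)  # what worked, what next

    # A hypothetical fragment of a case study narrative:
    story = ImpactStory(title="Tree health guidance taken up by land managers")
    story.blocks.append(Block(
        impact=ImpactType.INSTRUMENTAL,
        stakeholders=[Stakeholder.PRACTITIONERS],
        causes=[CausalFactor.ENGAGEMENT, CausalFactor.DISSEMINATION],
        evidence="Guidance cited in revised woodland management plans",
    ))

Representing a case study as an ordered list of such blocks mirrors what we did with the wooden tiles: the same components can be rearranged into different configurations to tell very different stories.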

From simplicity to complexity

When assessing impacts, there can be a temptation to package notions of impact into readily countable and comparable units which come about as a result of a simple linear sequence of events. Instead, we tried to encourage researchers to see impact as complex, developing over often lengthy periods of time, in a dynamic context in which different causal factors, stakeholders and impacts themselves interact. Ultimately, impacts are highly heterogeneous and travel different developmental pathways. Their stories are multi-dimensional and based on complex interactions.

Forestry was an ideal field in which to develop a framework of this kind, as it draws together a ‘real world’ orientation with an interdisciplinary approach spanning the natural and social sciences. However, we see the framework as having a wider application and being flexible enough to meet the needs of other disciplines and research organisations.

The ‘building blocks’ (impact types, stakeholders and causal factors) we identified can be assembled in multiple combinations to construct nuanced impact narratives in a way that is helpful for accountability (e.g. for funders), communication (e.g. for stakeholders) and learning (e.g. for researchers/managers).

Whatever their organisation, sector or country, we encourage readers to try it out and judge for themselves!

Drawn from
David M. Edwards and Laura R. Meagher. 2019. ‘A framework to evaluate the impacts of research on policy and practice: A forestry pilot study.’ Forest Policy and Economics. https://doi.org/10.1016/j.forpol.2019.101975

Laura Meagher is the senior partner in the Technology Development Group, an honorary fellow at the University of Edinburgh and at the James Hutton Institute, and an associate at the Research Unit for Research Utilisation at the University of St Andrews. She has spent over 30 years working in the US and the UK with and within research and education institutions, along with industry and government, focussing on facilitation and evaluation of strategic change. David Edwards is an environmental social scientist with 25 years’ experience in the UK, Europe, Africa and South Asia. He is a member of the senior management team at Forest Research, the research agency of the Forestry Commission, where he is head of the Centre for Ecosystems, Society and Biosecurity.
