Friday, 01 May 2020 14:10

What have we learnt about impact?

We’ve been interviewing project coordinators about impact: how they define it, measure it, and assess it within their projects. We’ve been collecting and analysing their answers in an effort to design the platform to better suit their needs. You can read a little about the projects we’ve been investigating here: https://mics.tools/lab

We have learnt a great deal from speaking to project coordinators, but here are some of the highlights:

Lesson 1: Impact is hard to define
This will come as no surprise to those who have taken an interest in impact, but even the concept of “impact” itself is difficult to define. In some citizen science projects, “impact” isn’t mentioned at all, and the value of the project lies in knowledge creation. Whereas some coordinators consider any change resulting from their project to be an impact, others see impacts only as long-term, indirect effects. While it is not for us to say who is right or wrong, it’s important to take these views into account when developing the MICS platform.

Lesson 2: We measure impact to learn
As many of you involved in different citizen science projects will know, some projects have work packages and deliverables dedicated to impact, but many don’t. What we wanted to know was: why does measuring the impact of a project matter at all? We’ve learnt that there is a lot of crossover between a project’s impact and its evaluation: a thorough impact assessment allows projects to see how effective their methodologies are, where opportunities might lie, and how best to promote their activities. It also provides an opportunity for learning, so that future projects can be more impactful and successful.

Lesson 3: Projects use different impact assessment methods (with varying levels of success)
Methods of assessing impact vary hugely across the citizen science projects we interviewed: from having no in-house method and being unaware of any formal methodology in the literature; through having a formal in-house method that is too complicated to use in practice; to following a strict framework such as a Theory of Change. IHE Delft have been exploring published frameworks in an effort to better understand the literature; we’ll have more on that in the next newsletter. For now, it’s clear that a key task for MICS will be to develop a method that suits impact newbies and experts alike. We think we’re up to the challenge!

Lesson 4: Impact is important for our volunteers!
While formal impact assessments are often required by citizen science project funders, many project coordinators consider the citizen scientists themselves to be a key audience for a project’s impact. This makes sense in terms of sustaining volunteers’ engagement, and it is something for MICS to consider when designing the platform’s output: it has to capture the attention and interest of the average citizen scientist, as well as provide enough information to satisfy more formal funding bodies.

Lesson 5: An impact assessment output will have to be both qualitative and quantitative
We asked citizen science project coordinators, “What would a visual representation of impact look like to you?” and, as you can imagine, the answers were as varied as the projects themselves! Some people think of impact as a score out of 10; some think of it as a heat map; others think of it as more of a story, or a collection of experiences. At MICS, we are taking all of this feedback on board to design an output that will be useful to as many citizen-science projects, their coordinators, and their volunteers as possible.

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 824711.

