To know if citizen science is successful, measure it
As you read this, thousands of ordinary people across Europe are busy tagging, categorising and counting in the name of science. They may be reporting crop yields, analysing plastic waste found in nature or monitoring wildlife populations. This relatively new method of public participation in scientific enquiry is experiencing a considerable upswing in both the quality and scale of projects.
Of course, people have been sharing observations about the natural world for millennia – way before the term ‘citizen science’ appeared on the cover of sociologist Alan Irwin’s 1995 book ‘Citizen Science: A Study of People, Expertise, and Sustainable Development’.
Today, citizen science is on the rise with bigger projects which are more ambitious and better networked than ever before. And while collecting seawater samples and photographing wild birds are two well-known examples of citizen science, this is just the tip of the iceberg.
Citizen science is evolving thanks to new data collection techniques enabled by the internet, smartphones and social media. Increased connectivity is encouraging a wide range of observations that can be easily recorded and shared. The reams of crowd-sourced data from members of the public are a boon for researchers working on large-scale and geographically diverse projects. Often it would be too difficult and expensive to obtain this data otherwise.
Both sides win: scientists get help collecting much better data, and an enthusiastic public gets to engage with the fascinating world of science.
But success has been difficult to define, let alone to translate into indicators for assessment. Until now.
A group of EU researchers has taken on the challenge of building the first integrated and interactive platform to measure costs and benefits of citizen science.
Hundreds of questions
‘The platform will be very complex but able to capture the characteristics and the results of projects, and measure their impact on several domains like society, economy, environment, science and technology and governance,’ said Dr Luigi Ceccaroni, who is coordinating the Measuring Impact of Citizen Science (MICS) project behind the platform. Currently at the testing stage, the platform is slated to go live before the end of this year.
‘Imagine, we are working with more than 200 variables. So, you can understand the complexity and just how comprehensive this platform will be. It’s the first time a project is considering so many variables and so many domains in citizen science,’ he explained. ‘Basically, the platform captures the data through questions, and as I said, we have more than 200. Some are simple questions; others are more complex.’
Questions delve into the role and responsibilities of the public (citizen-scientists), and whether their participation in the project has influenced them in any way (changes in values, opinion, attitudes or perspectives).
Another series of questions explores a project’s impact on the various domains. For instance, project coordinators are asked whether the innovation stemming from their project has resulted in productivity and gross domestic product (GDP) growth. There are also questions exploring the level of trust among project participants and other stakeholders.
‘We rely on information provided by the project coordinators,’ added Ceccaroni. ‘Sometimes they measure these aspects in very concrete and scientific ways. Sometimes they think they do, but they don’t. The platform will help them start to measure what’s not being measured and to understand how to measure.’
For example, many of the multiple-choice answers on the platform offer the ‘Yes, but it is not measured’ option. By selecting this answer, project coordinators will be directed to a tool that will show them how to start measuring.
‘One thing MICS has taught me is that impact assessment is extremely complex,’ said Ceccaroni, who brings more than a decade of citizen science experience to the project.
‘With the platform, our aim is to make it useful before a project even starts – when there is time to introduce elements linked to impact and to shape the project in a way to make sure impact can be measured.’
Capturing and communicating impacts
Citizen science projects contribute to learning, skill development, scientific understanding, science awareness and enjoyment, according to Dr Raul Drachman. His observation is based on the findings of an international survey conducted by the CS-TRACK project that he coordinates. The survey gauged the experience of more than 1 000 participant volunteers in biodiversity citizen science projects in Europe, Australia and New Zealand.
‘We found out in our project – also via studies we made on the sustainable development goals (SDGs) – that the environmental subject is extremely important (especially, but not only, climate change) in the citizen science context,’ said Drachman. ‘There is no doubt that the high attention paid to the subject has an effect on related individual and social perceptions of, and attitudes to, the major problems involved.’
Researchers also explored subjects like education, healthcare and emerging challenges such as the COVID-19 pandemic. ‘Our research revealed a lot of information on gender, age and other parameters at the individual participant level,’ said Drachman. ‘The inclusion of diverse audiences in citizen science is of vast importance for the progress of the field and for creating real impact on science and society.’
Another crucial factor considered is the perceived quality of citizen science data. It’s important to show what comes out of research that involves citizens who are not scientists. ‘So, you saw a bird different from any other, and you shared this information. So what?’ said Drachman. ‘We need to be able to explain why this is important and to show what happened with this spotting. Of course, as researchers we are aware of the value of information gathered by citizens. It’s important to make this more widely known.’
By opening the process of knowledge creation beyond the limiting borders of academia and research institutions, citizen science enables the inclusion of local expertise and lay knowledge in the scientific process. It also enriches the research findings.
More than halfway through the three-year research project, Drachman has identified another key takeaway: the need for moderation. ‘Citizen science brings together many players – scientists and non-scientists – coming from different fields and expertise. It’s never certain they will understand each other. So the question is how to ensure interest from all sides is maintained and the project advances. The solution is to effectively moderate between all sides.’
Addressing the contrasting data needs and motivations of the different stakeholders (researchers, citizens, policymakers and business consultants) is vital. Reaching a common understanding of citizen science and its benefits for science and society is also crucial if projects are to continue to grow and address both local issues and global challenges. In turn, this will increase trust in, and acceptance of, citizen-generated data, helping to address grand challenges like the SDGs.