Systematic Review

Contribution of Citizen Science to SDG 4: A Systematic Review of the Evaluation of Learning Outcomes in Citizen Science Projects in Compulsory Education

by
Gloria Rodríguez-Loinaz
Department of Mathematics, Experimental and Social Sciences Education, University of the Basque Country (UPV-EHU), 48980 Leioa, Spain
Sustainability 2026, 18(2), 703; https://doi.org/10.3390/su18020703
Submission received: 17 September 2025 / Revised: 10 December 2025 / Accepted: 11 December 2025 / Published: 9 January 2026

Abstract

The potential contribution of including citizen science (CS) in the curriculum to SDG 4—Quality Education—has been widely recognized, as it fosters the convergence between Science Education and Education for Sustainable Development that is essential for addressing the sustainability challenges currently facing humanity. This recognition is driving the inclusion of CS in formal education. However, to ensure that the use of CS in formal education contributes to this objective, a systematic and rigorous evaluation of its benefits in terms of participants’ learning outcomes (LO) is necessary. This study presents a systematic review of the published literature on CS projects implemented in compulsory education to examine whether students’ LO from participation in CS projects are evaluated and, if so, how this evaluation is performed. The results indicate a lack of systematic evaluation of the LO from participating in CS projects. Moreover, although the evaluation reported positive results in 79% of the cases where some LO was evaluated, in most of them the results may have been influenced by the voluntary or mandatory nature of participation in the projects and by the design of the evaluation itself. This may bias the results, leading to an over-optimistic view of the contribution of CS to SDG 4. In order to obtain solid evidence of the benefits, or lack thereof, for learners of participation in CS activities, and to guide designers and educators in improving CS projects to maximize their educational and sustainability impacts, some recommendations for future studies are presented.

1. Introduction

Education is key to advancing towards a sustainable planet; therefore, SDG 4 aims to ensure quality education. Promoting scientific literacy in society is one of the objectives of quality education [1], as scientific literacy is essential for addressing many of the sustainability challenges humanity faces. However, students’ interest in science decreases as they progress through the educational system [2,3] due to different factors, such as the lack of mentors as well as the use of teaching methodologies that are not stimulating for students [4,5]. To address this problem, the Next Generation Science Standards [6] emphasize the need to rethink science teaching, stressing the need to integrate science content with the practice of doing science. The inclusion of Citizen Science (CS) in formal education could play a significant role in this regard, as it allows students to engage in the scientific process through participation in real scientific projects, and at the same time, learn scientific content [7].
Due to the technological advances to promote citizen participation in science, and the potential of CS to raise the scientific literacy of society [8], there has been a significant increase in CS activities in recent years in informal settings [9,10]. However, the implementation of CS projects in informal settings has a limitation, which is that they do not reach the entire population. CS projects in informal settings face inequality issues endemic throughout society, with innate barriers to participation for minorities and disadvantaged communities [11]. In addition, participants in CS projects in informal settings are often volunteers, so they are likely to have an intrinsic motivation and interest in science that may be lacking in other groups of citizens [11]. Therefore, there is a push to include CS in formal education, with the aim of bringing all the potential benefits of participation in CS to all citizens [12,13].

1.1. Potential of CS in Formal Education

The potential contribution of including CS in the curriculum to SDG 4—Quality Education—has been widely recognized [14,15,16,17], as it fosters the convergence between Science Education and Education for Sustainable Development that is essential for addressing the sustainability challenges currently facing humanity. Participation in CS can contribute to increasing students’ interest and motivation [7,18] because it gives students the opportunity to participate in scientific activities in authentic contexts, which allows them to understand the relationships between scientific concepts and everyday life [19,20]. Through this type of context-based learning activity, learning becomes more relevant for students because they observe phenomena immersed in their close context and contribute to the solution of real problems [21,22]. Moreover, the integration of CS in the curriculum can contribute to improving students’ understanding of the Nature of Science, as well as to the development of scientific skills [23,24]. Similarly, different studies claim that participation in CS is an excellent way to stimulate a positive attitude towards science [25] and critical thinking [26]. Finally, the contribution of CS to increasing environmental awareness and participation in environmental conservation has also been recognized [27,28].

1.2. Challenges for the Inclusion of CS in Formal Education

The integration of CS into the curriculum is a relatively new approach that poses a great challenge [10,11]. One of the biggest challenges for the inclusion of CS in formal education is making scientific goals converge with educational goals [12,29,30,31]. While the scientific community is motivated by data and concerned about data accuracy, teachers are concerned about time demands and the alignment of goals and activities with curriculum standards.
For the integration of CS activities into formal education, from a pedagogical perspective, CS interventions must be meaningful and engaging for students, feasible for teachers, and must pursue teaching objectives aligned with curriculum standards [12,29,31,32]. However, in many CS projects, while science objectives are clearly defined, learning objectives [33] and connections to the curriculum are not [34]. Furthermore, once learning objectives are established, quality assessment of learning outcomes (LO) should also be planned and designed [11,35]. Numerous authors argue that a systematic and rigorous evaluation of the benefits of CS in terms of the LO of participants is necessary, since without it, any conclusion about the impact on students of participation in CS activities is premature [10,30]. However, very few CS projects contemplate an evaluation of LO [33,36], because participants’ learning in many cases is not the focus of the project [10,37], and many of the commonly expected LO from participation in CS activities are difficult to evaluate [31,33,38].

1.3. Previous Reviews of the Use of CS in Formal Education

Recent literature reviews have examined the use of CS in formal education, but they present gaps in terms of the evaluation of students’ LO in compulsory education (CE). Pizzolato and Tsuji [34] analyzed 73 CS interventions in K-12 settings but omitted any analysis of the evaluation of students’ LO. Similarly, Lüsse and colleagues [30] reviewed several CS initiatives in schools and proposed potential LO (e.g., motivation, knowledge, and communication skills) but did not examine whether or how these LO were assessed; despite this, they called for a more systematic evaluation of LO in formal science education. Tsivitanidou and Ioannou [39] focused their review on the evaluation of students’ LO, but it only included CS projects facilitated by digital technologies. They found only 12 studies conducted in formal CE that evaluated the impact of the project on students’ LO and concluded that empirical evidence on the effects of technology-facilitated CS projects on students’ LO remains scarce. Hadjichambi and colleagues [40], in their review of CS projects in K-12 education, identified only eight initiatives implemented within formal education settings; although the study examined which LO were expected, it did not investigate whether these outcomes were actually evaluated, nor the methods employed for such assessment. A similar limitation is observed in another review of CS projects in K-12 education [41].
The same gaps have been revealed in other reviews not focused on formal CE. Abourashed and colleagues [42], in their review, mainly analyzed the quality of data and student participation in CS projects in life sciences in formal education. However, they reported that, of the 23 projects identified, only three addressed LO, and none were carried out in CE. Vance-Chalcraft and colleagues [10] reviewed the use of CS in post-secondary education and noted that only a few of the 15 studies they found reported on LO evaluations.
In summary, although numerous reviews exist, to the best of the author’s knowledge, none has focused on the evaluation of students’ LO from CS participation in CE. However, understanding and improving LO evaluation practices is crucial for enhancing the educational value of CS projects in CE and, therefore, their contribution to SDG 4. To address this need, the present study provides a systematic review of the published literature on CS projects implemented in CE to examine whether students’ LO are evaluated and, if so, how these evaluations are carried out. The goal is to identify possible limitations and good practices that can guide future designers of CS projects in CE towards evaluations of students’ LO that are as reliable as possible. The initial hypothesis of this study is that there is no systematic and rigorous assessment of students’ LO in CS projects.

2. Methods

This study presents a systematic review of the publications on the use of CS in CE. For this purpose, the PRISMA method was used (Figure 1).

2.1. Search Strategy

The search for articles was conducted in June 2023 and focused on articles published in English up to December 2022 in two important databases due to their wide coverage and the quality of their content: Web of Science and Scopus. Based on previous studies [10,34], the following combination of terms and Boolean operators, which include different ways of referring to CS, were used: (“citizen science” OR “public science” OR “community science” OR “public participation in science”) AND (School* OR Education* OR Class*) AND (Student* OR Pupil* OR Learn* OR Youth* OR Child*).
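For reproducibility, the Boolean combination above can be assembled programmatically. The sketch below (illustrative only; any database-specific field qualifiers required by Web of Science or Scopus are omitted) composes the same query string from the three term groups:

```python
# Illustrative sketch: compose the Boolean search string reported above.
# Field codes (e.g., title/abstract/keyword qualifiers) are omitted, as
# they differ between Web of Science and Scopus.

cs_terms = ['"citizen science"', '"public science"',
            '"community science"', '"public participation in science"']
setting_terms = ["School*", "Education*", "Class*"]
subject_terms = ["Student*", "Pupil*", "Learn*", "Youth*", "Child*"]

def or_group(terms):
    """Join a list of terms into a parenthesized OR group."""
    return "(" + " OR ".join(terms) + ")"

# The three OR groups are combined with AND, as in the reported query.
query = " AND ".join(or_group(g) for g in (cs_terms, setting_terms, subject_terms))
print(query)
```

Running this reproduces the exact search string given in the text, which can then be pasted into each database's advanced-search interface.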
The search led to the retrieval of 1161 articles. After eliminating duplicates (n = 393), 768 items remained to be analyzed.

2.2. Article Screening

The articles were screened in two phases based on exclusion and inclusion criteria. For the first screening, the following exclusion criteria were used:
  • Articles that did not present educational experiences
  • CS experiences outside formal education
  • No first-hand data (e.g., reviews)
  • CS experiences in higher or early childhood education
Many of the analyzed articles met the first exclusion criterion, as some had a theoretical approach, e.g., [44,45], while others presented CS projects in a general way, indicating that students could participate, e.g., [46], or presented tools, applications, techniques, etc., to facilitate data collection and interpretation in CS projects, e.g., [47]. Finally, some articles focused on analyzing the quality of data obtained through CS, e.g., [48]. In the first screening, 578 articles were eliminated.
For the second screening, an exhaustive review of the remaining 190 full articles was conducted, using the following inclusion criteria:
  • Articles that present CS experiences implemented within formal education
  • Articles that present CS experiences implemented within CE (for each paper, the educational system of the country was considered, including it as CE or not)
  • Articles that present an evaluation of the LO
All articles that met the first two criteria but made no reference to LO, or that mentioned LO without presenting their evaluation, were counted but not included in the subsequent analysis.
Of the 190 articles, six could not be analyzed because access to the full articles was unavailable, and 58 did not meet either of the first two inclusion criteria. The remaining 126 articles presented CS experiences implemented in formal CE, but only 45 presented an evaluation of the LO (see Appendix A).
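The counts reported across the two screening phases form a consistent PRISMA-style flow; a minimal arithmetic check (using only the figures stated in the text) is:

```python
# Arithmetic check of the screening flow, using the counts reported above.
retrieved = 1161          # records retrieved from Web of Science and Scopus
duplicates = 393          # duplicates removed
after_dedup = retrieved - duplicates            # records screened
excluded_first = 578      # removed in the first screening
full_text = after_dedup - excluded_first        # full-text articles reviewed
inaccessible = 6          # full texts unavailable
excluded_second = 58      # failed the first two inclusion criteria
in_compulsory_ed = full_text - inaccessible - excluded_second
evaluated_lo = 45         # subset presenting an evaluation of the LO

assert after_dedup == 768
assert full_text == 190
assert in_compulsory_ed == 126
```

Each intermediate value matches the figure given in Sections 2.1 and 2.2, confirming the internal consistency of the reported flow.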

2.3. Analysis and Coding

For the coding, a content analysis based on a full and exhaustive reading of each article was performed. The 45 articles were coded according to the following information: year of publication, country, CS project topic, type of CS project (contributory/collaborative/co-produced) [8], duration of the intervention, type of student participation (voluntary/mandatory), evaluated LO, basis of the evaluation (perception/self-reported data vs. objective data), and evaluation design (experimental/quasi-experimental/non-experimental) [49].
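The coding scheme can be represented as one structured record per article. The sketch below is illustrative only (field and value names are hypothetical, chosen to mirror the variables listed above, not taken from the study's actual coding instrument):

```python
# Illustrative sketch of the per-article coding record (hypothetical field
# names mirroring the variables listed in Section 2.3).
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CodingRecord:
    year: int
    country: str
    topic: str                     # e.g., "biodiversity", "pollution"
    project_type: str              # "contributory" | "collaborative" | "co-produced"
    duration: Optional[str]        # e.g., "1-4 sessions"; None if not extractable
    participation: Optional[str]   # "voluntary" | "mandatory"; None if not extractable
    evaluated_lo: list = field(default_factory=list)  # LO categories (Table 1)
    evidence_basis: str = "self-reported"             # or "objective"
    design: str = "non-experimental"                  # or "quasi-experimental" | "experimental"

# Hypothetical example record (not an actual coded article):
example = CodingRecord(
    year=2020, country="Spain", topic="biodiversity",
    project_type="contributory", duration="1-4 sessions",
    participation="mandatory",
    evaluated_lo=["knowledge about the study topic"],
    evidence_basis="objective", design="quasi-experimental",
)
```

Structuring the coding this way makes the later cross-tabulations (LO results by topic, participation type, design, and duration) straightforward to compute.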

3. Results

The systematic review identified 126 articles that present CS projects implemented in compulsory formal education and showed a significant increase in these projects since 2016 (Figure 2).
Of the 126 identified articles, only 45 presented an evaluation of the LO achieved by the students. Fifty of the articles made no reference to the LO achieved by the participating students; most of these focused on disseminating the results of the research, presenting the students as a mere data collection tool, e.g., [50,51], or on analyzing the reliability/rigor/usability of the data collected by students, e.g., [52,53]. Of the remaining 76 articles, 31 mentioned the LO that students would obtain, but only in a theoretical or hypothetical way, without conducting an evaluation of the LO, e.g., [54,55]. Some of these articles even indicated that in the future it would be necessary to design an LO evaluation system, e.g., [56,57].

3.1. Characteristics of the CS Projects Analyzed

Regarding the geographical distribution of the 45 articles analyzed, most of them present CS projects carried out in European countries (51%) and in the USA (31%), and the most common study topics were biodiversity (33%) and pollution (mainly water and air pollution) (33%).
As for the type of CS project, following the classification of Bonney and colleagues [8], the projects were classified as contributory (student participation is restricted to obtaining data for the research), collaborative (students participate in other phases of the project such as data analysis or communication of results), and co-produced (the projects are designed in collaboration with the participating students, who participate in most phases of the research). Of the 45 articles analyzed, in 62% of them student participation was contributory, with 31% collaborative, while only 7% presented co-produced projects.
Finally, with regard to the duration of the students’ participation in the CS projects, in 18% of the articles, it was not possible to extract this information. In most of the projects the duration of the students’ participation was short (1–4 sessions) (40%) or medium (1–2 months) (29%), with only six projects (13%) where the intervention took more than 2 months.

3.2. Evaluated LO

Following the LO classifications proposed by Friedman [58] and Phillips and colleagues [49], the evaluated LO were grouped into six categories (Table 1): (1) interest and attitude towards science and the environment, (2) knowledge and awareness of scientific topics, (3) knowledge of the Nature of Science, (4) competencies for scientific inquiry, (5) behavior, and (6) self-efficacy.
The most evaluated LO were cognitive and attitudinal. For cognitive LO, the most evaluated LO was ‘Knowledge about the study topic’ (73% of the articles), followed by ‘Awareness of the problem under study’ (24%). For attitudinal LO, ‘Interest in and/or positive attitude towards science’ (24%) was the most evaluated. The procedural LO related to the acquisition of skills for scientific inquiry, and the behavioral LO were much less evaluated. Among these, the ones that were evaluated the most were ‘Self-efficacy to participate in scientific activities’ (18%) and ‘Data analysis/graphs creation and interpretation skills’ (16%).
Regarding the results of the evaluation, in 79% of cases, the evaluation reported positive results, such as more knowledge about the study topic, e.g., [59,60,61], or the development of data analysis skills, e.g., [38,62]. Moreover, 14% of the evaluations showed a lack of effect of the intervention, and 7% showed a negative effect, such as a reduction in self-efficacy for participating in scientific activities [13].

3.3. Reported LO by Study Topic and Type of Project (Contributory/Collaborative/Co-Produced)

The comparison of the reported results of the LO evaluation as a function of the study topic does not show any effect of this factor. For both biodiversity and pollution projects, around 70% of the evaluations reported positive results and 30% reported negative results or no effect. The type of project (contributory/collaborative/co-produced) also shows no effect on the reported evaluations (see Appendix B).

3.4. Reported LO by Type of Participation (Voluntary vs. Mandatory)

As for the type of student participation, it was not possible to extract this information in 18% of the articles. In 18% of the studies, the CS project was presented as a complementary activity, and students’ participation was voluntary. In 64% of the studies, CS activity was presented as another class activity, and entire classes participated.
The analysis of the reported results of the LO evaluation shows that in projects where student participation was voluntary, the evaluation reported mainly positive results (94%). However, in the projects where student participation was mandatory, the evaluation reported negative results in 12% of the cases and no effect of the intervention in 18% of the cases (Figure 3).

3.5. LO Reported by Evaluation Design

Following the classification proposed by Phillips and colleagues [49] for LO evaluation designs in CS projects, the evaluations were classified into three groups: (a) non-experimental design, with only post-intervention testing (tests, interviews, papers, etc.) and no control group; (b) quasi-experimental design, with pre- and post-tests but no control group; and (c) experimental design, with a control group.
In the 45 analyzed articles, 36% of the evaluations of the LO presented a non-experimental design, 56% a quasi-experimental design, and only 8% an experimental design.
All three evaluation-design groups (non-experimental, quasi-experimental, and experimental) included projects on different topics, of different durations, and of different types (contributory/collaborative/co-produced) (see Appendix C).

3.5.1. Results of the Evaluation of LO with a Non-Experimental Design

With one exception, in which a lack of effect of the intervention on the students’ view of themselves as scientists was reported [67], all LO reported in the evaluations with a non-experimental design were positive, such as increased knowledge about the study topic, e.g., [63,64], or improved critical thinking, e.g., [65,66] (Figure 4a).
However, the evaluations had several limitations in most cases. In 50% of the cases, participation in the evaluation was voluntary, e.g., [67,68,69]. Furthermore, in many cases, evaluations were based on self-reported learning by students (Table 2), on perceptions of teachers, e.g., [67,70], or on informal tests or observations, e.g., [69,71].
These limitations were recognized in several papers:
‘When analyzing the participants’ questionnaires, the frequency of response and possible biases must be taken into account […] In this case, the frequency of response is around 18%. It must be assumed that some teachers who respond to the questionnaire also encourage their students to do so, resulting in a certain bias based on mutual experiences. It can also be assumed that those teachers who enjoyed participating are more likely to respond to the questionnaire and encourage their students to do the same.’ [67] (pp. 5–6)
‘The main limitation of our study was that we did not track actual learning outcomes but rather students’ perceptions and reports of what they had learned. As our study is based on the students’ self-described experiences, own interpretations, and self-reflection of what has happened, we are not able to assess the factual impact of our project in relation to their learning.’ [7] (p. 335)

3.5.2. Results of the Evaluation of LO with a Quasi-Experimental Design

The evaluation of LO with a quasi-experimental design showed a greater diversity of results, with positive results in 69% of the cases, showing no effect of the intervention in 19% of the LO reported, and reporting negative results in 12% of the evaluations (Figure 4b).
The most evaluated LO was “Knowledge about the study topic” (70% of the articles), with positive results in all cases, e.g., [5,75], and measured objectively except for one case where it was self-reported by the students [61]. The rest of the LO were evaluated by objective and/or previously validated questionnaires from other research, with the exception of three cases [13,61,76], where the evaluation was based on student self-reported or teacher-perceived learning. Excluding LO related to the acquisition of knowledge about the topic of study, the results presented show a greater balance between positive effects of interventions (58%) and neutral or negative effects (42%).
As for student participation in the evaluation, the percentage of cases in which participation was voluntary was less than in evaluations with a non-experimental design, falling to 26%, e.g., [77,78]. In the remaining cases in which this information could be extracted, all the students who took part in the projects participated in the evaluation, or at least entire classes in the projects in which the coverage was extremely large, e.g., [79].

3.5.3. Results of the Evaluation of LO with an Experimental Design

Of the 45 articles analyzed, only four presented an evaluation of LO with experimental design [80,81,82,83]. These studies report 14 LO evaluations, with the results being positive on half of the occasions and showing no extra beneficial effect of the intervention on the other half (Figure 4c).
Focusing only on the most evaluated LO, evaluations with a non-experimental design reported positive LO in all cases, whereas evaluations with more rigorous designs did not support many of those results (Table 3).
While the three evaluation designs supported the contribution of CS in increasing knowledge about the study topic, the studies with quasi-experimental and experimental design did not support an increase in interest or the development of a more positive attitude towards science, increase in the awareness of the problem under study, increase in self-efficacy to participate in scientific activities, or increase in interest in scientific careers, due to participation in CS projects.

3.6. Reported LO by Duration of the Intervention

The comparison of the reported results of the LO evaluation as a function of the duration of the interventions shows an unexpected result: the projects with longer interventions reported fewer positive LO than those with shorter interventions (Figure 5).
In order to interpret these results, it is worth pointing out that the use of more rigorous evaluation designs was more common in the longer projects than in the shorter ones (Figure 6).

4. Discussion

The potential social benefits for the participants in CS projects are driving the inclusion of CS activities in formal education, with the aim that these benefits reach the entire citizenry [12,13]. This review has evidenced this drive for CS in CE, having identified more than 100 articles published between 2016 and 2022 on CS projects conducted in CE.
Projects involving students in formal education must ensure that the participating students achieve LO aligned with the school curriculum [12,31,84]. Therefore, they should have well-defined educational objectives and rigorous evaluation systems to assess whether the project really contributes to the achievement of those objectives. However, this review has shown the lack of LO evaluations in most of the articles on CS activities in CE, confirming the initial hypothesis of this study. Many articles (n = 31) refer to the LO that students would achieve through participation in the CS project, without presenting any evidence. This is common in CS projects, which usually do not integrate an evaluation of their educational value, assuming that it is high [25]. However, some authors indicated that ‘Educational outcomes are an important component of school-based citizen science projects, so the project’s ability to deliver them should be formally evaluated rather than assumed’ [85] (p. 11). This argument is supported by different authors, such as Wichmann and colleagues [83], who reported a lack of effect of the intervention on environmental behavior and indicated that ‘The results suggest that a change in environmentally favorable behavior cannot be expected from participation alone’ [83] (p. 1). Other authors [5] also reported a lack of effect of the intervention on several of the expected LO and indicate that ‘Citizen science projects should not be assumed to increase scientific self-efficacy, attitudes toward Science, or interest in Science careers among students’ [5] (p. 1048).
Although there seems to be a consensus on the benefits for students of participating in CS activities, these benefits often remain more assumed than empirically demonstrated, since, as this review has shown, the publications on the use of CS in CE provide little rigorous data on students’ LO. Other authors, such as Vance-Chalcraft and colleagues [10], have reported similar results. These authors, in their review on the use of CS in postsecondary education, found that most of the articles reported many LO, on the basis of anecdotal evidence and the perceptions of the students and instructors, with only two articles that evaluated any LO beyond student perceptions or reflections. Tsivitanidou and Ioannou [39], in their review of technology-enabled CS projects involving school students, also found only 12 projects implemented in formal education that reported any LO, some of which were not supported by meaningful measures to authenticate the results.
To increase the empirical evidence for these assumptions, it is crucial to formally and rigorously evaluate the LO of student participation in CS projects [25,83]. However, the results of this review have shown that in most of the articles that evaluated LO, the results reported may have been influenced by the voluntary or mandatory nature of participation in the projects and by the design of the evaluation itself.
SDG 4—Quality Education—emphasizes the need to foster scientific literacy, critical thinking, and active citizenship [86], all of which align closely with the pedagogical potential of CS. Robust evaluation of LO in CS projects is therefore essential to ensure that they contribute meaningfully to the broader educational targets established under SDG 4.

4.1. Effects of the Type of Student Participation on LO Evaluation Results

This review has shown that projects in which students had volunteered reported a positive effect of the intervention in almost all cases. However, these results may be due to the bias produced by the type of participating students [7].
The contribution of participation in CS activities to the achievement of LO should not be evaluated only with students who have volunteered to participate in a project, as these students may not be representative of the whole class. They may be more motivated students who have a personal interest in learning or science [13], which the rest of the students who do not volunteer may lack. This may condition the benefits of the intervention [5,13], as student motivation may contribute to greater learning [87]. This would explain why, in projects in which whole classes participated, a lack of effect or even negative results were reported on many occasions. Therefore, taking into account that activities carried out in CE should bring benefits to all students, and not only to those with a predisposition to learn, the suitability of CS projects for implementation in CE should be evaluated through projects that involve whole classes, ensuring the inclusion of students with different characteristics (e.g., social class, culture, learning skills, attitudes, and interests).
The evaluation should also involve all students, since the voluntary or mandatory nature of participation in evaluation processes can also affect the results. It is possible that the students who voluntarily participate in the evaluation are those who have been motivated by the project and, therefore, have been more involved in it. This can contribute to greater real or perceived learning. The students who do not participate in the evaluation are possibly those who have not been as motivated by the project and, therefore, have not obtained or perceived the same learning.
Regarding the phases of the research in which students participate, student participation may be restricted only to data collection (contributory project), may extend to other phases of the research, such as data analysis or communication of results (collaborative project), or may even encompass all of them (co-produced project). Authors such as Roche and colleagues [11] recommend that new CS projects should be co-produced, as these projects are more likely to achieve the scientific and educational goals of the project. However, this review has not shown an effect of project type on the reported LO evaluations, with all three types of projects reporting very similar frequencies of positive results, negative results, or no effect.
Finally, as for the duration of the participation in the CS project, it might be expected that projects of longer duration would achieve more LO than those of shorter duration, as there are LO that take longer to achieve. However, this review has shown the opposite result, with a lower proportion of positive evaluations in long projects than in those of shorter duration. This could be explained by the evaluation design, with more rigorous evaluation designs (quasi-experimental/experimental) being used more frequently in the longer-duration projects.

4.2. Effects of the Evaluation Design on LO Evaluation Results

In addition to the inclusion of all students in the evaluation of LO, the design of the evaluation can also condition the results, as this review has evidenced. Evaluations based only on post-intervention tests do not allow for determining objectively whether there has been an improvement, because there is a lack of diagnosis of the actual baseline situation. Regarding the tool used, although a rigorous analysis of the specific tools used to evaluate the different LO has not been carried out, as this was beyond the scope of this study, this review has shown that evaluations are often not based on robust or validated instruments. Frequently, evaluations rely on students’ self-reported learning or teacher perceptions, e.g., [67,70]. These evaluations may be biased due to social desirability [88,89]; that is, students’ responses may be influenced by what they believe the teachers or researchers expect them to respond. This highlights the need for future research focused on the development, standardization, and validation of assessment tools to evaluate different LO in the context of CS projects.
Therefore, the most rigorous evaluation of the LO of participation in such projects is one with an experimental design based on objective data. Only an experimental design can establish that an innovative methodology, such as the use of CS, results in higher LO than other, more conventional methodologies. Evaluations with a quasi-experimental design (pre- and post-test) based on objective data can determine whether the expected LO have been achieved following the intervention, but cannot establish whether this was due to the methodology or simply to having worked on the subject [5,25,90].
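As a minimal sketch of this reasoning, the hypothetical example below (invented scores, plain Python) illustrates how an experimental design separates the effect of the methodology from the effect of simply having worked on the subject: the control group's pre/post gain is subtracted from the CS group's gain, a difference-in-differences estimate.

```python
# Hypothetical illustration: why a pre-test, post-test, AND control group matter.
# A quasi-experimental design only measures the treatment group's gain; an
# experimental design compares that gain against a control group's gain,
# isolating the contribution of the CS methodology itself.

def mean(xs):
    return sum(xs) / len(xs)

def learning_gain(pre, post):
    """Average post-test score minus average pre-test score."""
    return mean(post) - mean(pre)

# Invented test scores (0-100) for two whole classes.
cs_pre,  cs_post  = [42, 50, 47, 55], [68, 71, 66, 74]   # class using the CS project
ctl_pre, ctl_post = [44, 49, 46, 53], [61, 63, 60, 65]   # class taught conventionally

gain_cs  = learning_gain(cs_pre, cs_post)    # gain from intervention + working on the topic
gain_ctl = learning_gain(ctl_pre, ctl_post)  # gain from working on the topic alone

# Difference-in-differences: the part of the gain attributable to the CS methodology.
did = gain_cs - gain_ctl
print(f"CS gain: {gain_cs:.2f}, control gain: {gain_ctl:.2f}, methodology effect: {did:.2f}")
```

With these invented numbers, both classes improve, but only the comparison against the control group reveals how much of the improvement the CS methodology itself contributed; a post-test-only or pre/post-only evaluation could not make that distinction.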

4.3. Evaluated LO

Finally, this review has shown a clear bias of evaluations towards cognitive LO, with ‘knowledge about the study topic’ and ‘awareness of the problem under study’ being the most frequently evaluated. However, evaluations should go beyond merely assessing effects on knowledge acquisition and awareness of an issue [13,91], as scientific literacy extends further: it includes the ability to apply scientific knowledge to solve problems, identify questions, generate new knowledge, explain phenomena, and draw evidence-based conclusions about science-related issues, as well as valuing science and holding a positive attitude towards and interest in scientific topics [92]. Thus, to ensure that participation in CS activities contributes to students’ comprehensive scientific literacy, deeper LO must also be evaluated [33,35], such as the development of scientific inquiry skills, a deeper understanding of the Nature of Science [14,93,94], or changes in the attitudes and behaviors of participating learners [95]. However, this review has shown that the evaluation of this type of LO is uncommon. In the vast majority of cases, and regardless of the participating students and the evaluation design, the level of knowledge about the study topic increased, so it can be said that there is evidence of the contribution of CS to increasing student knowledge. For other types of LO, however, more rigorous evaluation systems have reported mixed results; the available evidence of the effects of CS projects on deeper learning therefore remains limited, which offers a clear avenue for future research.

5. Conclusions and Recommendations

The use of CS in CE has proliferated in recent years; however, the lack of systematic and rigorous evaluation of LO may lead to an overly optimistic view of its positive effects on learners and, therefore, of its contribution to SDG 4—Quality Education. This review has identified two main trends: (a) many studies assume LO without evidence, and (b) many others base their evidence on projects and evaluations whose design may bias the results. In order to obtain solid evidence of the beneficial effects, or lack thereof, for students of participation in CS activities, evidence that can guide designers and educators in improving CS projects to maximize their educational and sustainability impacts, future studies should formally evaluate student LO in a rigorous manner, for which it is recommended that:
  • The CS project has clear and well-defined learning objectives;
  • The project design includes an LO evaluation system;
  • Whole classes participate in the project;
  • All participating students take part in the evaluation;
  • The evaluation has an experimental design with pre-test, post-test, and control group;
  • The evaluation excludes self-reported LO.
Given that SDG 4 emphasizes not only knowledge acquisition but also the development of scientific competencies and problem-solving skills [86], the absence of robust evaluation frameworks limits our ability to determine whether CS projects meaningfully support these broader educational goals. Strengthening LO assessments in CS projects is therefore essential not only for improving educational practice but also for verifying and enhancing their potential contribution to the targets of SDG 4 [93,96].

Funding

The author gratefully acknowledges financial support from the BEZ-EKOFISKO Scientific Groups (IT1648-22).

Acknowledgments

The author has reviewed and edited the output and takes full responsibility for the content of this publication.

Conflicts of Interest

The author declares no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CS   Citizen Science
LO   Learning outcomes
CE   Compulsory education

Appendix A

Year | Authors | Journal | Volume (Issue) | Pages | Title
2012 | Zoellick, B.; Nelson, S.J.; Schauffler, M. | Frontiers in Ecology and the Environment | 10(6) | 310–313 | Participatory science and education: bringing both views into focus.
2014 | Hiller, S.E.; Kitsantas, A. | School Science and Mathematics | 114(6) | 302–311 | The effect of a horseshoe crab citizen science program on middle school student science performance and STEM career motivation
2014 | Nicosia, K.; Daaram, S.; Edelman, B.; Gedrich, L.; He, E.; McNeilly, S.; Shenoy, V.; Velagapudi, A.; Wu, W.; Zhang, L.; et al. | Ecological Economics | 104 | 145–151 | Determining the willingness to pay for ecosystem service restoration in a degraded coastal watershed: A ninth grade investigation
2016 | Grasser, S.; Schunko, C.; Vogl, C.R. | Journal of Ethnobiology and Ethnomedicine | 12 | 46 | Children as ethnobotanists: Methods and local impact of a participatory research project with children on wild plant gathering in the Grosses Walsertal Biosphere Reserve, Austria.
2016 | Ruiz-Mallen, I.; Riboli-Sasco, L.; Ribrault, C.; Heras, M.; Laguna, D.; Perie, L. | Science Communication | 38(4) | 523–534 | Citizen science: Toward transformative learning
2016 | Seifert, V.A.; Wilson, S.; Toivonen, S.; Clarke, B.; Prunuske, A. | Journal of Microbiology & Biology Education | 17(1) | 63–69 | Community partnership designed to promote Lyme disease prevention and engagement in citizen science.
2016 | Tosh, J.; James, K.; Rumsey, F.; Crookshank, A.; Dyer, R.; Hopkins, D. | Botanical Journal of the Linnean Society | 181(4) | 711–722 | Is DNA barcoding child’s play? Science education and the utility of DNA barcoding for the discrimination of UK tree species.
2017 | Ballard, H.L.; Dixon, C.G.; Harris, E.M. | Biological Conservation | 208 | 65–75 | Youth-focused citizen science: Examining the role of environmental science learning and agency for conservation
2017 | Weyhenmeyer, G.A.; Mackay, M.; Stockwell, J.D.; Thiery, W.; Grossart, H.; Augusto-Silva, P.B.; Baulch, H.M.; de Eyto, E.; Hejzlar, J.; Kangur, K.; et al. | Scientific Reports | 7 | 43890 | Citizen science shows systematic changes in the temperature difference between air and inland waters with global warming.
2017 | Zarybnicka, M.; Sklenicka, P.; Tryjanowski, P. | PLOS Biology | 15(1) | e2001132 | A webcast of bird nesting as a state-of-the-art citizen science.
2018 | Kelemen-Finan, J.; Scheuch, M.; Winter, S. | International Journal of Science Education | 40(17) | 2078–2098 | Contributions from citizen science to science education: An examination of a biodiversity citizen science project with schools in Central Europe
2019 | Andersson-Sundén, E.; Gustavsson, C.; Hjalmarsson, A.; Jacewicz, M.; Lantz, M.; Marciniewski, P.; Ziemann, V.; Barker, A.; Lundén, K. | Nuclear Physics News | 29(2) | 25–28 | Citizen science and radioactivity
2019 | Ellenburg, J.A.; Williford, C.J.; Rodriguez, S.L.; Andersen, P.C.; Turnipseed, A.A.; Ennis, C.A.; Basman, K.A.; Hatz, J.M.; Prince, J.C.; Meyers, D.H.; et al. | Atmospheric Environment: X | 4 | 100048 | Global ozone (GO3) project and AQTreks: Use of evolving technologies by students and citizen scientists to monitor air pollutants
2019 | Fujiwara, Y.; Hite, R.; Wygant, H.; Paulsen, S. | Childhood Education | 95(2) | 53–59 | Engaging students in global citizen science: A U.S.-Japan collaborative watershed project.
2019 | Kermish-Allen, R.; Peterman, K.; Bevc, C. | Cultural Studies of Science Education | 14(3) | 627–641 | The utility of citizen science projects in K-5 schools: Measures of community engagement and student impacts
2019 | Tarter, K.D.; Levy, C.E.; Yaglom, H.D.; Adams, L.E.; Plante, L.; Casal, M.G.; Gouge, D.H.; Rathman, R.; Stokka, D.; Weiss, J.; et al. | Journal of the American Mosquito Control Association | 35(1) | 11–18 | Using citizen science to enhance surveillance of Aedes aegypti in Arizona, 2015–17.
2019 | Walkinshaw, L.P.; Hecht, C.; Patel, A.; Podrabsky, M. | Journal of School Health | 89(8) | 653–661 | Training high school student “citizen scientists” to document school water access: A feasibility study.
2020 | Aivelo, T.; Huovelin, S. | Environmental Education Research | 26(3) | 324–340 | Combining formal education and citizen science: a case study on students’ perceptions of learning and interest in an urban rat project
2020 | Anderson, D.; Buntting, C.; Coton, M.; Luczak-Roesch, M.; Doyle, C.; Pierson, C.; Li, Y.J.; Glasson, B.; Brieseman, C.; Boucher, M.; Christenson, D. | Curriculum Matters | 16 | 38–59 | Using online citizen science to develop students’ science capabilities
2020 | Gustavsson, C.; Andersson-Sunden, E.; Barker, A.; Hjalmarsson, A.; Lantz, M.; Lunden, K.; Pomp, S. | EPJ Web of Conferences | 239 | 25001 | Citizen science in radiation research
2020 | Kocman, D.; Števanec, T.; Novak, R.; Kranjec, N. | Sustainability | 12(23) | 10213 | Citizen science as part of the primary school curriculum: A case study of a technical day on the topic of noise and health
2020 | Maicas, S.; Fouz, B.; Figàs-Segura, À.; Zueco, J.; Rico, H.; Navarro, A.; Carbó, E.; Segura-García, J.; Biosca, E.G. | Frontiers in Microbiology | 11 | 564030 | Implementation of antibiotic discovery by student crowdsourcing in the Valencian Community through a service learning strategy
2020 | Queiruga-Dios, M.; Lopez-Inesta, E.; Diez-Ojeda, M.; Consuelo Saiz-Manzanares, M.; Vazquez Dorrio, J.B. | Sustainability | 12(10) | 4283 | Citizen science for scientific literacy and the attainment of sustainable development goals in formal education.
2020 | Schneiderhan-Opel, J.; Bogner, F.X. | Sustainability | 12(5) | 2036 | The relation between knowledge acquisition and environmental values within the scope of a biodiversity learning module.
2020 | Spicer, H.; Nadolny, D.; Fraser, E. | Citizen Science: Theory and Practice | 5(1) | 14 | Going squirrelly: Evaluating educational outcomes of a curriculum-aligned citizen science investigation of non-native squirrels.
2020 | Walter, A.; Klammsteiner, T.; Gassner, M.; Heussler, C.D.; Kapelari, S.; Schermer, M.; Insam, H. | Sustainability | 12(22) | 9574 | Black soldier fly school workshops as means to promote circular economy and environmental awareness.
2021 | Antunes, P.; Novais, C.; Novais, A.; Grosso, F.; Ribeiro, T.G.; Mourao, J.; Perovic, S.U.; Rebelo, A.; Ksiezarek, M.; Freitas, A.R.; et al. | FEMS Microbiology Letters | 368(4) | fnab016 | MicroMundo@UPorto: An experimental microbiology project fostering student’s antimicrobial resistance awareness and personal and social development.
2021 | Araujo, J.L.; Morais, C.; Paiva, J.C. | Journal of Baltic Science Education | 20(6) | 881–893 | Students’ attitudes towards science: The contribution of a citizen science project for monitoring coastal water quality and (micro)plastics
2021 | Boaventura, D.; Neves, A.T.; Santos, J.; Pereira, P.C.; Luis, C.; Monteiro, A.; Cartaxana, A.; Hawkins, S.J.; Caldeira, M.F.; Ponces de Carvalho, A. | Frontiers in Marine Science | 8 | 675278 | Promoting ocean literacy in elementary school students through investigation activities and citizen science
2021 | Bopardikar, A.; Bernstein, D.; McKenney, S. | Journal of Biological Education | 57(3) | 592–617 | Designer considerations and processes in developing school-based citizen-science curricula for environmental education
2021 | Carson, S.; Rock, J.; Smith, J. | Frontiers in Education | 6 | 674883 | Sediments and seashores: A case study of local citizen science contributing to student learning and environmental citizenship
2021 | Castell, N.; Grossberndt, S.; Gray, L.; Fredriksen, M.F.; Skaar, J.S.; Høiskar, B.A.K. | Frontiers in Climate | 3 | 639128 | Implementing citizen science in primary schools: Engaging young children in monitoring air pollution
2021 | Fagan-Jeffries, E.P.; Austin, A.D. | Zootaxa | 4949(1) | 79–101 | Four new species of parasitoid wasp (Hymenoptera: Braconidae) described through a citizen science partnership with schools in regional South Australia.
2021 | Lewis, R.; Carson, S. | New Zealand Journal of Educational Studies | 56(1) | 101–110 | Measuring science skills development in New Zealand high school students after participation in citizen science using a DEVISE evaluation scale.
2021 | Mnisi, B.E.; Geerts, S.; Smith, C.; Pauw, A. | Biological Conservation | 257 | 109087 | Nectar gardens on school grounds reconnect plants, birds and people
2021 | Tucker, C.S.; Trepanier, J.C.; Blanchard, P.B.; Bush, E.; Jordan, J.W.; Schafer, M.J.; Nyman, J.A. | Bulletin of the American Meteorological Society | 102(6) | E1275–E1282 | Using tree-ring research to introduce students to geoscience fieldwork.
2021 | Varaden, D.; Leidland, E.; Lim, S.; Barratt, B. | Environmental Research | 201 | 111536 | “I am an air quality scientist”—using citizen science to characterise school children’s exposure to air pollution.
2021 | Williams, K.A.; Hall, T.E.; O’Connell, K. | Environmental Education Research | 27(7) | 1037–1053 | Classroom-based citizen science: Impacts on students’ science identity, nature connectedness, and curricular knowledge.
2022 | Araujo, J.L.; Morais, C.; Paiva, J.C. | Chemistry Education Research and Practice | 23(1) | 100–112 | Student participation in a coastal water quality citizen science project and its contribution to the conceptual and procedural learning of chemistry
2022 | Kane, F.; Abbate, J.; Landahl, E.C.; Potosnak, M.J. | Sensors | 22(3) | 1295 | Monitoring particulate matter with wearable sensors and the influence on student environmental attitudes
2022 | Oturai, N.G.; Pahl, S.; Syberg, K. | Science of the Total Environment | 806 | 150914 | How can we test plastic pollution perceptions and behavior? A feasibility study with Danish children participating in “the mass experiment”
2022 | Rodríguez-Loinaz, G.; Ametzaga-Arregi, I.; Palacios-Agundez, I. | Journal of Biological Education | 58(3) | 609–625 | ICT Tools and Citizen Science: A Pathway to Promote Science Learning and Education for Sustainable Development in Schools.
2022 | Scaini, C.; Peresan, A.; Tamaro, A.; Poggi, V.; Barnaba, C. | International Journal of Disaster Risk Reduction | 69 | 102755 | Can high-school students contribute to seismic risk mitigation? Lessons learned from the development of a crowd-sourced exposure database.
2022 | Tarin-Pello, A.; Suay-Garcia, B.; Marco-Crespo, E.; Galiana-Rosello, C.; Bueso-Bordils, J.I.; Perez-Gracia, M. | Frontiers in Microbiology | 13 | 959187 | Evaluation of knowledge about antibiotics and engagement with a research experience on antimicrobial resistance between pre-university and university students for five school years (2017–2021).
2022 | Wichmann, C.S.; Fischer, D.; Geiger, S.M.; Honorato-Zimmer, D.; Knickmeier, K.; Kruse, K.; Sundermann, A.; Thiel, M. | Marine Policy | 141 | 105035 | Promoting pro-environmental behavior through citizen science? A case study with Chilean schoolchildren on marine plastic pollution.

Appendix B

Figure A1. Percentage of learning outcomes evaluation results reported by study topic.
Figure A2. Percentage of learning outcomes evaluation results reported by type of project.

Appendix C

Percentage of studies by evaluation design, grouped under Study Topic (Biodiversity, Pollution, Others), Type of Project (Contributory, Collaborative, Co-Produced), and Duration (Short, Medium, Long, No Data):

Evaluation Design | Biodiversity | Pollution | Others | Contributory | Collaborative | Co-Produced | Short | Medium | Long | No Data
Non-experimental | 19% | 38% | 43% | 69% | 25% | 6% | 50% | 31% | 13% | 6%
Quasi-experimental | 40% | 32% | 28% | 60% | 32% | 8% | 40% | 36% | 8% | 16%
Experimental | 50% | 50% | --- | 50% | 50% | --- | 25% | --- | 50% | 25%

References

  1. OECD. Organization for Economic Co-Operation and Development. In PISA 2018 Assessment and Analytical Framework, PISA; OECD Publishing: Paris, France, 2019; p. 305. [Google Scholar]
  2. George, R. A cross-domain analysis of change in students’ attitudes toward science and attitudes about the utility of science. Int. J. Sci. Educ. 2006, 28, 571–589. [Google Scholar] [CrossRef]
  3. Potvin, P.; Hasni, A. Analysis of the decline in interest towards school science and technology from grades 5 through 11. J. Sci. Educ. Technol. 2014, 23, 784–802. [Google Scholar] [CrossRef]
  4. Osborne, J.; Simon, S.; Collins, S. Attitudes towards science: A review of the literature and its implications. Int. J. Sci. Educ. 2003, 25, 1049–1079. [Google Scholar] [CrossRef]
  5. Williams, K.A.; Hall, T.E.; O’Connell, K. Classroom-based citizen science: Impacts on students’ science identity, nature connectedness, and curricular knowledge. Environ. Educ. Res. 2021, 27, 1037–1053. [Google Scholar] [CrossRef]
  6. National Research Council. Next Generation Science Standards: For States, By States; The National Academies Press: Washington, DC, USA, 2013. [Google Scholar]
  7. Aivelo, T.; Huovelin, S. Combining formal education and citizen science: A case study on students’ perceptions of learning and interest in an urban rat project. Environ. Educ. Res. 2020, 26, 324–340. [Google Scholar] [CrossRef]
  8. Bonney, R.; Ballard, H.; Jordan, R.; McCallie, E.; Phillips, T.; Shirk, J.; Wilderman, C.C. Public Participation in Scientific Research: Defining the Field and Assessing Its Potential for Informal Science Education. A CAISE Inquiry Group Report; Center for Advancement of Informal Science Education (CAISE): Washington, DC, USA, 2009; p. 58. [Google Scholar]
  9. Science Europe. Science Europe Briefing Paper on Citizen Science; Science Europe: Brussels, Belgium, 2018; p. 31. [Google Scholar]
  10. Vance-Chalcraft, H.D.; Hurlbert, A.H.; Styrsky, J.N.; Gates, T.A.; Bowser, G.; Hitchcock, C.B.; Reyes, M.A.; Cooper, C.B. Citizen science in postsecondary education: Current practices and knowledge gaps. BioScience 2022, 72, 276–288. [Google Scholar] [CrossRef]
  11. Roche, J.; Bell, L.; Galvão, C.; Golumbic, Y.N.; Kloetzer, L.; Knoben, N.; Laakso, M.; Lorke, J.; Mannion, G.; Massetti, L.; et al. Citizen Science, Education, and Learning: Challenges and Opportunities. Front. Sociol. 2020, 5, 613814. [Google Scholar] [CrossRef]
  12. Harlin, J.; Kloetzer, L.; Patton, D.; Leonhard, C. Turning students into scientists. In Citizen Science: Innovation in Open Science, Society and Policy; Hecker, S., Haklay, M., Bowser, A., Makuch, Z., Vogel, J., Bonn, A., Eds.; UCL Press: London, UK, 2018; pp. 410–428. [Google Scholar]
  13. Carson, S.; Rock, J.; Smith, J. Sediments and seashores-A case study of local citizen science contributing to student learning and environmental citizenship. Front. Educ. 2021, 6, 674883. [Google Scholar] [CrossRef]
  14. Bela, G.; Peltola, T.; Young, J.C.; Balázs, B.; Arpin, I.; Pataki, G.; Hauck, J.; Kelemen, E.; Kopperoinen, L.; Van Herzele, A.; et al. Learning and the transformative potential of citizen science. Conserv. Biol. 2016, 30, 990–999. [Google Scholar] [CrossRef]
  15. Brossard, D.; Lewenstein, B.; Bonney, R. Scientific knowledge and attitude change: The impact of a citizen science project. Int. J. Sci. Educ. 2005, 27, 1099–1121. [Google Scholar] [CrossRef]
  16. Dunkley, R.A. The Role of Citizen Science in Environmental Education. In Analyzing the Role of Citizen Science in Modern Research; Ceccaroni, L., Piera, J., Eds.; IGI Global: Hershey, PA, USA, 2017; pp. 213–230. [Google Scholar]
  17. Shulla, K.; Leal Filho, W.; Sommer, J.H.; Salvia, A.L.; Borgemeister, C. Channels of collaboration for citizen science and the sustainable development goals. J. Clean. Prod. 2020, 264, 121735. [Google Scholar] [CrossRef]
  18. Trautmann, N.M.; Shirk, J.L.; Fee, J.; Krasny, M.E. Who Poses the Question? Using Citizen Science to Help K-12 Teachers Meet the Mandate for Inquiry. In Citizen Science: Public Collaboration in Environmental Research; Dickinson, J.L., Bonney, R.R., Eds.; Cornell University Press: Ithaca, NY, USA, 2012; pp. 179–190. [Google Scholar]
  19. Bennett, J. Teaching and Learning Science: A Guide to Recent Research and Its Applications; Bloomsbury Academic: London, UK, 2003; p. 289. [Google Scholar]
  20. Slingsby, D.; Barker, S. Making connections: Biology, environmental education and education for sustainable development. J. Biol. Educ. 2003, 38, 4–6. [Google Scholar] [CrossRef]
  21. Rose, D.E. Context-based learning. In Encyclopedia of the Sciences of Learning; Seel, N., Ed.; Springer: New York, NY, USA, 2012; pp. 799–802. [Google Scholar]
  22. Yu, K.C.; Fan, S.C.; Lin, K.Y. Enhancing students’ problem-solving skills through context-based learning. Int. J. Sci. Math. Educ. 2015, 13, 1377–1401. [Google Scholar] [CrossRef]
  23. Stepenuck, K.F.; Green, L.T. Individual-and community-level impacts of volunteer environmental monitoring: A synthesis of peer-reviewed literature. Ecol. Soc. 2015, 20, 19. [Google Scholar] [CrossRef]
  24. Trumbull, D.J.; Bonney, R.; Bascom, D.; Cabral, A. Thinking scientifically during participation in a citizen-science project. Sci. Educ. 2000, 84, 265–275. [Google Scholar] [CrossRef]
  25. Vitone, T.; Stofer, K.; Steininger, M.S.; Hulcr, J.; Dunn, R.; Lucky, A. School of ants goes to college: Integrating citizen science into the general education classroom increases engagement with science. J. Sci. Commun. 2016, 15, A03. [Google Scholar] [CrossRef]
  26. Shah, H.R.; Martinez, L.R. Current approaches in implementing citizen science in the classroom. J. Microbiol. Biol. Educ. 2016, 17, 17–22. [Google Scholar] [CrossRef]
  27. Lewandowski, E.J.; Oberhauser, K.S. Butterfly citizen scientists in the United States increase their engagement in conservation. Biol. Conserv. 2017, 208, 106–112. [Google Scholar] [CrossRef]
  28. Toomey, A.H.; Domroese, M.C. Can citizen science lead to positive conservation attitudes and behaviors? Hum. Ecol. Rev. 2013, 20, 50–62. [Google Scholar]
  29. Bopardikar, A.; Bernstein, D.; McKenney, S. Designer considerations and processes in developing school-based citizen-science curricula for environmental education. J. Biol. Educ. 2021, 57, 592–617. [Google Scholar] [CrossRef]
  30. Lüsse, M.; Brockhage, F.; Beeken, M.; Pietzner, V. Citizen science and its potential for science education. Int. J. Sci. Educ. 2022, 44, 1120–1142. [Google Scholar] [CrossRef]
  31. Zoellick, B.; Nelson, S.J.; Schauffler, M. Participatory science and education: Bringing both views into focus. Front. Ecol. Environ. 2012, 10, 310–313. [Google Scholar] [CrossRef]
  32. Bonney, R.; Cooper, C.B.; Dickinson, J.; Kelling, S.; Phillips, T.; Rosenberg, K.V.; Shirk, J. Citizen science: A developing tool for expanding science knowledge and scientific literacy. BioScience 2009, 59, 977–984. [Google Scholar] [CrossRef]
  33. Phillips, T.; Porticella, N.; Constas, M.; Bonney, R. A framework for articulating and measuring individual learning outcomes from participation in citizen science. Citiz. Sci. Theory Pract. 2018, 3, 3. [Google Scholar] [CrossRef]
  34. Pizzolato, L.A.; Tsuji, L.J. Citizen science in K-12 school-based learning settings. Sch. Sci. Math. 2022, 122, 222–231. [Google Scholar] [CrossRef]
  35. Jordan, R.C.; Ballard, H.L.; Phillips, T.B. Key issues and new approaches for evaluating citizen-science learning outcomes. Front. Ecol. Environ. 2012, 10, 307–309. [Google Scholar] [CrossRef]
  36. Turrini, T.; Dörler, D.; Richter, A.; Heigl, F.; Bonn, A. The threefold potential of environmental citizen science—Generating knowledge, creating learning opportunities and enabling civic participation. Biol. Conserv. 2018, 225, 176–186. [Google Scholar] [CrossRef]
  37. National Academies of Sciences, Engineering, and Medicine (NASEM). Learning Through Citizen Science: Enhancing Opportunities by Design; National Academies Press: Washington, DC, USA, 2018; p. 183. [Google Scholar]
  38. Ballard, H.L.; Dixon, C.G.; Harris, E.M. Youth-focused citizen science: Examining the role of environmental science learning and agency for conservation. Biol. Conserv. 2017, 208, 65–75. [Google Scholar] [CrossRef]
  39. Tsivitanidou, O.; Ioannou, A. Citizen Science, K-12 science education and use of technology: A synthesis of empirical research. J. Sci. Commun. 2020, 19, 1–22. [Google Scholar] [CrossRef]
  40. Hadjichambi, D.; Hadjichambis, A.C.; Adamou, A.; Georgiou, Y. A systematic literature review of K-12 environmental Citizen Science (CS) initiatives: Unveiling the CS pedagogical and participatory aspects contributing to students’ environmental citizenship. Educ. Res. Rev. 2023, 39, 100525. [Google Scholar] [CrossRef]
  41. Solé, C.; Couso, D.; Hernández, M.I. Citizen science in schools: A systematic literature review. Int. J. Sci. Educ. Part B 2024, 14, 383–399. [Google Scholar] [CrossRef]
  42. Abourashed, A.; Doornekamp, L.; Escartin, S.; Koenraadt, C.J.; Schrama, M.; Wagener, M.; Bartumeus, F.; van Gorp, E.C. The potential role of school citizen science programs in infectious disease surveillance: A critical review. Int. J. Environ. Res. Public Health 2021, 18, 7019. [Google Scholar] [CrossRef] [PubMed]
  43. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  44. Jenkins, L.L. Using citizen science beyond teaching science content: A strategy for making science relevant to students’ lives. Cult. Stud. Sci. Educ. 2011, 6, 501–508. [Google Scholar] [CrossRef]
  45. Rautio, P.; Tammi, T.; Aivelo, T.; Hohti, R.; Kervinen, A.; Saari, M. “For whom? By whom?”: Critical perspectives of participation in ecological citizen science. Cult. Stud. Sci. Educ. 2022, 17, 765–793. [Google Scholar] [CrossRef]
  46. Cartwright, L.A.; Cvetkovic, M.; Graham, S.; Tozer, D.; Chow-Fraser, P. URBAN: Development of a citizen science biomonitoring program based in Hamilton, Ontario, Canada. Int. J. Sci. Educ. Part B 2015, 5, 93–113. [Google Scholar] [CrossRef]
  47. Hachaj, T.; Bibrzycki, Ł.; Piekarczyk, M. Recognition of cosmic ray images obtained from CMOS sensors used in mobile phones by approximation of uncertain class assignment with deep convolutional neural network. Sensors 2021, 21, 1963. [Google Scholar] [CrossRef]
  48. Jiménez, M.; Triguero, I.; John, R. Handling uncertainty in citizen science data: Towards an improved amateur-based large-scale classification. Inf. Sci. 2019, 479, 301–320. [Google Scholar] [CrossRef]
  49. Phillips, T.B.; Ferguson, M.; Minarchek, M.; Porticella, N.; Bonney, R. User’s Guide for Evaluating Learning Outcomes in Citizen Science; Cornell Lab of Ornithology: New York, NY, USA, 2014; p. 53. [Google Scholar]
  50. Martay, B.; Pearce-Higgins, J.W. Using data from schools to model variation in soil invertebrates across the UK: The importance of weather, climate, season and habitat. Pedobiologia 2018, 67, 1–9. [Google Scholar] [CrossRef]
  51. Honorato-Zimmer, D.; Kruse, K.; Knickmeier, K.; Weinmann, A.; Hinojosa, I.A.; Thiel, M. Inter-hemispherical shoreline surveys of anthropogenic marine debris—A binational citizen science project with schoolchildren. Mar. Pollut. Bull. 2019, 138, 464–473. [Google Scholar] [CrossRef]
  52. Galloway, A.W.E.; Tudor, M.T.; Vander Haegen, W.M. The reliability of citizen science: A case study of Oregon white oak stand surveys. Wildl. Soc. Bull. 2006, 34, 1425–1429. [Google Scholar] [CrossRef]
  53. Peckenham, J.M.; Peckenham, S.K. Evaluation of quality for middle level and high school student-generated water quality data. J. Am. Water Resour. Assoc. 2014, 50, 1477–1487. [Google Scholar] [CrossRef]
  54. Klütsch, C.F.; Aspholm, P.E.; Polikarpova, N.; Veisblium, O.; Bjørn, T.A.; Wikan, A.; Gonzalez, V.; Hagen, S.B. Studying phenological phenomena in subarctic biomes with international school pupils as citizen scientists. Ecol. Evol. 2021, 11, 3501–3515. [Google Scholar] [CrossRef] [PubMed]
  55. Saunders, M.E.; Roger, E.; Geary, W.L.; Meredith, F.; Welbourne, D.J.; Bako, A.; Canavan, E.; Herro, F.; Herron, C.; Hung, O.; et al. Citizen science in schools: Engaging students in research on urban habitat for pollinators. Austral Ecol. 2018, 43, 635–642. [Google Scholar] [CrossRef]
  56. Sanden, T.; Spiegel, H.; Wenng, H.; Schwarz, M.; Sarneel, J.M. Learning science during teatime: Using a citizen science approach to collect data on litter decomposition in Sweden and Austria. Sustainability 2020, 12, 7745. [Google Scholar] [CrossRef]
  57. Young, A.M.; van Mantgem, E.F.; Garretson, A.; Noel, C.; Morelli, T.L. Translational science education through citizen science. Front. Environ. Sci. 2021, 9, 800433. [Google Scholar] [CrossRef]
  58. Friedman, A.J. (Ed.) Framework for Evaluating Impacts of Informal Science Education Projects; Report from a National Science Foundation Workshop; National Science Foundation: Alexandria, VA, USA, 2008; p. 117. [Google Scholar]
  59. Castell, N.; Grossberndt, S.; Gray, L.; Fredriksen, M.F.; Skaar, J.S.; Høiskar, B.A.K. Implementing citizen science in primary schools: Engaging young children in monitoring air pollution. Front. Clim. 2021, 3, 639128. [Google Scholar] [CrossRef]
  60. Kane, F.; Abbate, J.; Landahl, E.C.; Potosnak, M.J. Monitoring particulate matter with wearable sensors and the influence on student environmental attitudes. Sensors 2022, 22, 1295. [Google Scholar] [CrossRef]
  61. Nicosia, K.; Daaram, S.; Edelman, B.; Gedrich, L.; He, E.; McNeilly, S.; Shenoy, V.; Velagapudi, A.; Wu, W.; Zhang, L.; et al. Determining the willingness to pay for ecosystem service restoration in a degraded coastal watershed: A ninth grade investigation. Ecol. Econ. 2014, 104, 145–151. [Google Scholar] [CrossRef]
  62. Scaini, C.; Peresan, A.; Tamaro, A.; Poggi, V.; Barnaba, C. Can high-school students contribute to seismic risk mitigation? Lessons learned from the development of a crowd-sourced exposure database. Int. J. Disaster Risk Reduct. 2022, 69, 102755. [Google Scholar] [CrossRef]
  63. Anderson, D.; Buntting, C.M.; Coton, M.; Luczak-Roesch, M.; Doyle, C.; Pierson, C.; Li, Y.J.; Glasson, B.; Brieseman, C.; Boucher, M.; et al. Using online citizen science to develop students’ science capabilities. Curric. Matters 2020, 16, 38–59. [Google Scholar] [CrossRef]
  64. Andersson-Sundén, E.; Gustavsson, C.; Hjalmarsson, A.; Jacewicz, M.; Lantz, M.; Marciniewski, P.; Ziemann, V.; Barker, A.; Lundén, K. Citizen science and radioactivity. Nucl. Phys. News 2019, 29, 25–28. [Google Scholar] [CrossRef]
  65. Araujo, J.L.; Morais, C.; Paiva, J.C. Student participation in a coastal water quality citizen science project and its contribution to the conceptual and procedural learning of chemistry. Chem. Educ. Res. Pract. 2022, 23, 100–112. [Google Scholar] [CrossRef]
  66. Rodríguez-Loinaz, G.; Ametzaga-Arregi, I.; Palacios-Agundez, I. ICT Tools and Citizen Science: A Pathway to Promote Science Learning and Education for Sustainable Development in Schools. J. Biol. Educ. 2022, 58, 609–625. [Google Scholar] [CrossRef]
  67. Gustavsson, C.; Andersson-Sunden, E.; Barker, A.; Hjalmarsson, A.; Lantz, M.; Lunden, K.; Pomp, S. Citizen science in radiation research. EPJ Web Conf. 2020, 239, 25001. [Google Scholar] [CrossRef]
  68. Fujiwara, Y.; Hite, R.; Wygant, H.; Paulsen, S. Engaging students in global citizen science: A U.S.-Japan collaborative watershed project. Child. Educ. 2019, 95, 53–59. [Google Scholar] [CrossRef]
  69. Grasser, S.; Schunko, C.; Vogl, C.R. Children as ethnobotanists: Methods and local impact of a participatory research project with children on wild plant gathering in the Grosses Walsertal Biosphere Reserve, Austria. J. Ethnobiol. Ethnomed. 2016, 12, 46. [Google Scholar] [CrossRef]
  70. Tarter, K.D.; Levy, C.E.; Yaglom, H.D.; Adams, L.E.; Plante, L.; Casal, M.G.; Gouge, D.H.; Rathman, R.; Stokka, D.; Weiss, J.; et al. Using citizen science to enhance surveillance of Aedes aegypti in Arizona, 2015–2017. J. Am. Mosq. Control Assoc. 2019, 35, 11–18. [Google Scholar] [CrossRef]
  71. Kocman, D.; Števanec, T.; Novak, R.; Kranjec, N. Citizen science as part of the primary school curriculum: A case study of a technical day on the topic of noise and health. Sustainability 2020, 12, 10213. [Google Scholar] [CrossRef]
  72. Walkinshaw, L.P.; Hecht, C.; Patel, A.; Podrabsky, M. Training high school student “citizen scientists” to document school water access: A feasibility study. J. Sch. Health 2019, 89, 653–661. [Google Scholar] [CrossRef]
  73. Antunes, P.; Novais, C.; Novais, A.; Grosso, F.; Ribeiro, T.G.; Mourao, J.; Perovic, S.U.; Rebelo, A.; Ksiezarek, M.; Freitas, A.R.; et al. MicroMundo@UPorto: An experimental microbiology project fostering student’s antimicrobial resistance awareness and personal and social development. FEMS Microbiol. Lett. 2021, 368, fnab016. [Google Scholar] [CrossRef] [PubMed]
  74. Maicas, S.; Fouz, B.; Figàs-Segura, À.; Zueco, J.; Rico, H.; Navarro, A.; Carbó, E.; Segura-García, J.; Biosca, E.G. Implementation of antibiotic discovery by student crowdsourcing in the valencian community through a service learning strategy. Front. Microbiol. 2020, 11, 564030. [Google Scholar] [CrossRef]
  75. Zarybnicka, M.; Sklenicka, P.; Tryjanowski, P. A webcast of bird nesting as a state-of-the-art citizen science. PLoS Biol. 2017, 15, e2001132. [Google Scholar] [CrossRef] [PubMed]
  76. Lewis, R.; Carson, S. Measuring science skills development in New Zealand high school students after participation in citizen science using a DEVISE evaluation scale. N. Z. J. Educ. Stud. 2021, 56, 101–110. [Google Scholar] [CrossRef]
  77. Ellenburg, J.A.; Williford, C.J.; Rodriguez, S.L.; Andersen, P.C.; Turnipseed, A.A.; Ennis, C.A.; Basman, K.A.; Hatz, J.M.; Prince, J.C.; Meyers, D.H.; et al. Global ozone (GO3) project and AQTreks: Use of evolving technologies by students and citizen scientists to monitor air pollutants. Atmos. Environ. X 2019, 4, 100048. [Google Scholar] [CrossRef]
  78. Varaden, D.; Leidland, E.; Lim, S.; Barratt, B. “I am an air quality scientist”—Using citizen science to characterise school children’s exposure to air pollution. Environ. Res. 2021, 201, 111536. [Google Scholar] [CrossRef]
  79. Spicer, H.; Nadolny, D.; Fraser, E. Going squirrelly: Evaluating educational outcomes of a curriculum-aligned citizen science investigation of non-native squirrels. Citiz. Sci. Theory Pract. 2020, 5, 14. [Google Scholar] [CrossRef]
  80. Araujo, J.L.; Morais, C.; Paiva, J.C. Students’ attitudes towards science: The contribution of a citizen science project for monitoring coastal water quality and (micro)plastics. J. Balt. Sci. Educ. 2021, 20, 881–893. [Google Scholar] [CrossRef]
  81. Hiller, S.E.; Kitsantas, A. The effect of a horseshoe crab citizen science program on middle school student science performance and STEM career motivation. Sch. Sci. Math. 2014, 114, 302–311. [Google Scholar] [CrossRef]
  82. Mnisi, B.E.; Geerts, S.; Smith, C.; Pauw, A. Nectar gardens on school grounds reconnect plants, birds and people. Biol. Conserv. 2021, 257, 109087. [Google Scholar] [CrossRef]
  83. Wichmann, C.S.; Fischer, D.; Geiger, S.M.; Honorato-Zimmer, D.; Knickmeier, K.; Kruse, K.; Sundermann, A.; Thiel, M. Promoting pro-environmental behavior through citizen science? A case study with Chilean schoolchildren on marine plastic pollution. Mar. Policy 2022, 141, 105035. [Google Scholar] [CrossRef]
  84. Silva, C.; Monteiro, A.J.; Manahl, C.; Lostal, E.; Schäfer, T.; Andrade, N.; Brasileiro, F.; Mota, P.; Serrano Sanz, F.; Carrodeguas, J.; et al. Cell Spotting: Educational and motivational outcomes of cell biology citizen science project in the classroom. J. Sci. Commun. 2016, 15, A02. [Google Scholar] [CrossRef]
  85. Soanes, K.; Cranney, K.; Dade, M.C.; Edwards, A.M.; Palavalli-Nettimi, R.; Doherty, T.S. How to work with children and animals: A guide for school-based citizen science in wildlife research. Austral Ecol. 2020, 45, 3–14. [Google Scholar] [CrossRef]
  86. UNESCO. Education for Sustainable Development Goals: Learning Objectives; UNESCO: Paris, France, 2017; p. 62. [Google Scholar]
  87. Filgona, J.; Sakiyo, J.; Gwany, D.M.; Okoronka, A.U. Motivation in learning. Asian J. Educ. Soc. Stud. 2020, 10, 16–37. [Google Scholar] [CrossRef]
  88. Gignac, F.; Solé, C.; Barrera-Gómez, J.; Persavento, C.; Tena, È.; López-Vicente, M.; Júlvez, J.; Sunyer, J.; Couso, D.; Basagaña, X. Identifying factors influencing attention in adolescents with a co-created questionnaire: A citizen science approach with secondary students in Barcelona, Spain. Int. J. Environ. Res. Public Health 2021, 18, 8221. [Google Scholar] [CrossRef] [PubMed]
  89. Harrington, K.F.; Kohler, C.L.; McClure, L.A.; Franklin, F.A. Fourth graders’ reports of fruit and vegetable intake at school lunch: Does treatment assignment affect accuracy? J. Am. Diet. Assoc. 2009, 109, 36–44. [Google Scholar] [CrossRef] [PubMed]
  90. Pomeranz, D. Methods of evaluation. Harv. Bus. Sch. 2011, 10, 1–12. [Google Scholar]
  91. Schaefer, T.; Kieslinger, B.; Brandt, M.; van den Bogaert, V. Evaluation in citizen science: The art of tracing a moving target. In The Science of Citizen Science; Vohland, K., Land-Zandstra, A., Ceccaroni, L., Lemmens, R., Perelló, J., Ponti, M., Samson, R., Wagenknecht, K., Eds.; Springer: Cham, Switzerland, 2021; pp. 495–514. [Google Scholar]
  92. Singh, S.; Singh, S. What is scientific literacy: A review paper. Int. J. Acad. Res. Dev. 2016, 1, 15–20. [Google Scholar]
  93. Bonney, R.; Phillips, T.B.; Ballard, H.L.; Enck, J.W. Can citizen science enhance public understanding of science? Public Underst. Sci. 2016, 25, 2–16. [Google Scholar] [CrossRef]
  94. Queiruga-Dios, A.M.; Lopez-Inesta, E.; Diez-Ojeda, M.; Saiz-Manzanares, C.M.; Vazquez Dorrio, J.B. Citizen science for scientific literacy and the attainment of sustainable development goals in formal education. Sustainability 2020, 12, 4283. [Google Scholar] [CrossRef]
  95. Newman, G.; Wiggins, A.; Crall, A.; Graham, E.; Newman, S.; Crowston, K. The future of citizen science: Emerging technologies and shifting paradigms. Front. Ecol. Environ. 2012, 10, 298–304. [Google Scholar] [CrossRef]
  96. OECD. An OECD Learning Framework 2030. In The Future of Education and Labor; Bast, G., Carayannis, E.G., Campbell, D.F.J., Eds.; Springer: Cham, Switzerland, 2019; pp. 23–35. [Google Scholar]
Figure 1. Steps and flowchart of the systematic review [43].
Figure 2. Evolution of the number of publications on citizen science projects implemented in formal compulsory education in the Web of Science and Scopus databases.
Figure 3. Percentage of LO evaluation results reported by type of student participation.
Figure 4. Percentage of LO evaluation results by type of evaluation design: (a) non-experimental, (b) quasi-experimental, and (c) experimental.
Figure 5. Percentage of LO evaluation results reported by duration of the intervention.
Figure 6. Distribution of LO evaluation design by duration of interventions.
Table 1. Frequency of evaluated LO and evaluation results.

| Category | Learning Outcomes | No. Evaluations | + * | 0 * | − * |
|---|---|---|---|---|---|
| Interest/attitude towards science and the environment | Interest in and/or positive attitude towards science | 11 | 8 | 2 | 1 |
| | Positive attitude towards living beings | 5 | 4 | 1 | -- |
| | Interest in environmental conservation | 4 | 3 | 1 | -- |
| | Interest in scientific careers | 8 | 4 | 4 | -- |
| | Highlight of scientific activity | 4 | 3 | -- | 1 |
| | Interest in learning about scientific topics | 4 | 3 | -- | 1 |
| | Connection with nature | 2 | 1 | 1 | -- |
| Knowledge and awareness of scientific issues | Awareness of the problem under study | 11 | 8 | 2 | 1 |
| | Knowledge about the study topic | 33 | 32 | 1 | -- |
| Knowledge of the Nature of Science | Knowledge about the Nature of Science | 6 | 6 | -- | -- |
| Behavior | More sustainable behaviors | 4 | 2 | 1 | 1 |
| | Participation in community projects and/or actions (intention) | 1 | 1 | -- | -- |
| Skills for scientific inquiry | Data collection | 2 | 1 | 1 | -- |
| | Data analysis/graph creation and interpretation skills | 7 | 6 | 1 | -- |
| | Critical thinking | 3 | 3 | -- | -- |
| | Raising research questions | 1 | 1 | -- | -- |
| | Formulating hypotheses | 2 | 1 | 1 | -- |
| | Communication/argumentation | 4 | 4 | -- | -- |
| | Research design | 3 | 3 | -- | -- |
| | Use of dichotomous keys | 1 | 1 | -- | -- |
| Self-efficacy | To solve problems | 3 | 1 | 1 | 1 |
| | To participate in scientific activities | 8 | 5 | 1 | 2 |
| | To learn/study science | 1 | 1 | -- | -- |
| Others | Less stereotypical image of scientists | 2 | 2 | -- | -- |
| | View of oneself as a scientist | 2 | -- | 1 | 1 |
| TOTAL | | 132 | 104 | 19 | 9 |

* Result of the evaluation: positive (+), no effect (0), negative (−).
Table 2. Examples of questions in the post-intervention questionnaires to obtain self-reported LO by students.

| Question | Article |
|---|---|
| Do you think your knowledge of Excel has improved? Y/N | [62] |
| To what extent did this project help you develop research skills? | [72] |
| At what level was the project useful for acquiring the following competencies (a lot, quite, moderately, a little, nothing): Understanding of scientific and biosafety protocols | [73] |
| Express your degree of agreement with the following sentences: “Now I understand better the problem of antibiotic resistance”; “Now I think I know more things than I knew before participating in the project” | [74] |
| Did you know before the project how radioactivity is measured? Yes/No/Don’t know. Do you after the project know how radioactivity is measured? Yes/No/Don’t know. | [64] |
Table 3. Results of the evaluation of the most reported LO by the type of evaluation design.

| Learning Outcome | Non-Experimental (+ / 0 / −) * | Quasi-Experimental (+ / 0 / −) * | Experimental (+ / 0 / −) * |
|---|---|---|---|
| Interest and/or attitude towards science | 6 / -- / -- | 1 / 2 / 1 | 1 / -- / -- |
| Interest in scientific careers | 2 / -- / -- | 2 / 1 / -- | -- / 3 / -- |
| Awareness of the problem under study | 4 / -- / -- | 3 / 2 / 1 | 1 / -- / -- |
| Knowledge about the study topic | 15 / -- / -- | 15 / -- / -- | 2 / 1 / -- |
| Self-efficacy to participate in scientific activities | 3 / -- / -- | 1 / -- / 2 | 1 / 1 / -- |

* Result of the evaluation: positive (+), no effect (0), negative (−).