Article

Citizen Science: Is It Good Science?

1 Department of Science Communication, University of Otago, Dunedin 9054, New Zealand
2 Department of Marketing, University of Otago, Dunedin 9054, New Zealand
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(5), 4577; https://doi.org/10.3390/su15054577
Submission received: 21 December 2022 / Revised: 24 February 2023 / Accepted: 2 March 2023 / Published: 3 March 2023

Abstract

Citizen science projects, which entail scientific work undertaken by members of the public, have increased substantially over the last three decades. However, the credibility of such science has been questioned, especially with respect to its prospects for producing peer-reviewed publications, the principal means by which science is communicated and validated. We conducted a meta-analysis of 895 citizen science projects launched between 1890 and 2018. Three-quarters (674) did not produce a single peer-reviewed paper. The remaining 221 projects produced 2075 publications, although just five projects accounted for nearly half the publications. The average time from project launch to first publication was 9.15 years. Projects in health and medicine and astronomy were most likely to produce publications. Projects in biology (65.8% of all projects), computer science, and social sciences were least likely to publish their results. In conclusion, the “science” element of most citizen science projects is largely irrelevant as it is never validated or communicated. We propose reclassifying citizen science projects into two types: (i) Citizen Science, where the focus is on science, and participants essentially function as sampling devices; and (ii) Citizen Engagement, where the value lies more in citizen engagement than it does in citizen science.

1. Introduction

The concept of citizen science became popular in the mid-1990s. Rick Bonney, an American ornithologist, defined citizen science projects as scientific research that involves the participation of amateurs in the data collection process, such as the submission of records by birdwatchers [1]. By contrast, Irwin [2], in the book Citizen Science: A Study of People, Expertise, and Sustainable Development, claimed that the primary goal of citizen science was to link science and the general public more closely, rather than merely asking the public to collect and submit scientific data. The term “citizen science” first appeared in the Oxford English Dictionary in 2014, where it is defined as “scientific work undertaken by members of the general public, often in collaboration with or under the direction of professional scientists and scientific institutions” [3,4]. In essence, two aspects of citizen science are emphasized: (i) the scientific aspects of citizen science projects; and (ii) the participation of the public—mostly amateurs—in the science. These two aspects are now widely acknowledged as defining characteristics of citizen science [5,6].
In the 21st century, citizen science projects that involve both professional scientists and the general public have become more prevalent, particularly for studies based upon large temporal or spatial scales [5,6,7]. On Zooniverse, the world’s largest citizen science platform, as of 15 September 2021, 283 ongoing projects in 11 different fields were posted, with more than one million people having participated in and provided data for these projects [8]. Citizen science projects can also be found on other platforms. For example, the US citizen science platform citizenscience.gov listed 491 citizen science projects [9]. A majority of citizen science projects are in the field of biology [5,6], from bioinformatic science [10] to biology, ecology, and conservation of one or more taxa [11,12,13]. Follett and Strezov [14] reviewed 888 published peer-reviewed articles based on citizen science projects and found that birds were the dominant species that were the focus of these articles. Since 2010, the number of active citizen science projects has increased dramatically [5], with more taxa and more science areas represented, including geography [15], astronomy [16], health and medical science [17], computer science [18], and multidisciplinary sciences [5].
The evaluation of citizen science projects has included both their impacts on science and their impacts on participants. With regard to the latter, participating in citizen science projects can lead to changes in attitudes towards science, knowledge gains, and potential behavioural changes [19,20]. With regard to the science, there have been questions about data quality, including their credibility and validation [21]; and most particularly, about the likelihood of peer-reviewed academic outcomes [5,22]. While there are other potential outcomes, such as any economic and political implications that could arise [22,23], for the most part, evaluation of citizen science projects has focused on engagement of participants and the scientific outcomes [24].
Peer-reviewed publications are considered a benchmark for both the quality of science and the most effective way to communicate the outcomes of scientific research projects [25,26]. The North American Breeding Bird Survey, one of the earliest citizen science projects, was launched in 1967 and has generated at least 178 published articles—making it by far the most productive citizen science project from a scientific output perspective [5]. Follett and Strezov [14] stated that the number of peer-reviewed papers arising from citizen science had increased significantly since 2010, in parallel with an increase in the number of active citizen science projects. However, they only retrieved articles and relevant citizen science projects from academic databases (i.e., Web of Science and Scopus), ignoring the reality that a high proportion of citizen science projects are not accepted and acknowledged by the scientific community and, therefore, do not feature in such databases. For example, Freitag and Pfeffer [27] found that 58% of 19 long-term citizen science projects in the field of environmental science had not published any articles at all in the scientific literature, even though they had been running for an average period of 12 years. Based on a sample of 388 citizen science projects in the field of biodiversity, Theobald et al. [22] found that only 12% of the projects had published peer-reviewed papers. Similarly, Kullenberg and Kasperowski [5] found that only 16% of 490 citizen science projects, covering 20 different science areas and disciplines, produced any scientific output (i.e., peer-reviewed publications). However, Kullenberg and Kasperowski [5] only included projects in their study if they had been referred to by existing peer-reviewed articles. 
Given the large number of citizen science projects on platforms such as Zooniverse [8] and the limited number of published articles arising from citizen science projects [11,22,28], it is clear that many citizen science projects are posted on academic and non-academic platforms but go unmentioned in the scientific literature. Consequently, it is timely for an up-to-date analysis of the scientific outcomes of citizen science projects that is based upon a larger sample size which encompasses projects on both academic and non-academic platforms and across more scientific disciplines.
This research should help identify any issues with existing citizen science projects in terms of their scientific validity, thereby encouraging improvements in scientific methods for citizen science projects (e.g., design and data quality). Many results from citizen science projects are considered unsuitable for publication because they are not definitive or are of poor reliability [27]. This may be because of potential weaknesses in their methodologies and a reliance upon input from volunteers of varying expertise [21,29,30,31].
Therefore, in order to evaluate the scientific contribution of citizen science projects, as measured by peer-reviewed publications, we set out to answer three specific research questions:
  • RQ1: What are the characteristics (e.g., launch years, regions, disciplines, and data accessibility) of existing citizen science projects?
  • RQ2: How many peer-reviewed articles are produced by citizen science projects, and does this differ depending upon duration of the project, discipline, study region, and data accessibility of citizen science projects?
  • RQ3: Once launched, how many years does a project take to publish its first academic article, and is this affected by the launch year, discipline, study region, and data accessibility?

2. Methods

2.1. Citizen Science Projects

Citizen science projects are defined here as scientific research projects that invite the general public to collect and/or analyze data [5,22]. The first phase of this research was to generate a list of both active and completed citizen science projects. This was achieved using two different approaches: (i) by searching and scanning review articles about citizen science on Web of Science (WoS), we extracted names of citizen science projects [5,11,22,32,33]; and (ii) we identified a number of global and regional citizen science portal sites on the internet which aggregate information about global or regional citizen science projects [22]. The online portal sites involved in our data collection are listed in Table 1. Extracted citizen science projects from Web of Science and the online portal sites were then combined into one dataset. Any duplicates of projects were removed by manual inspection of the generated list of citizen science projects. Information collated about each project in the dataset included the project name, its scientific discipline, the start (launch) year, its duration, the regions or countries involved, and the level of data accessibility available to the public.
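The merging and de-duplication step described above was performed by manual inspection; a first automated pass might look like the following sketch (all project records below are hypothetical, and manual review is still required, since similarly named projects in different countries must be kept separate):

```python
import re

def normalize(name: str) -> str:
    # Lowercase and drop all non-alphanumeric characters so that
    # variants such as "Galaxy Zoo" and "Galaxy Zoo!" compare equal.
    return re.sub(r"[^a-z0-9]", "", name.lower())

def merge_project_lists(*sources):
    # Combine project records from several sources (e.g., WoS reviews
    # and online portals), keeping the first record seen per name.
    seen = {}
    for source in sources:
        for project in source:
            seen.setdefault(normalize(project["name"]), project)
    return list(seen.values())

# Hypothetical records from the two collection approaches
wos_projects = [{"name": "Galaxy Zoo", "source": "WoS"}]
portal_projects = [{"name": "Galaxy Zoo!", "source": "portal"},
                   {"name": "eBird", "source": "portal"}]

projects = merge_project_lists(wos_projects, portal_projects)
```

A pass like this only flags exact-name duplicates; cases such as FrogWatch (Canada) versus Frog Watch (India), which are distinct projects, must be resolved by hand.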
Categories for the scientific disciplines of sampled projects included biology, phenology, environmental science, astronomy, climatology, geology, computer science, health and medicine, social sciences, multidisciplinary sciences, and “other”. As biology is the dominant area for citizen science projects [14], we further divided the field of biology into a series of subcategories: ornithology, terrestrial invertebrates, terrestrial plants, terrestrial mammals, marine biology, herpetology, biodiversity (multi-taxa), and “other”. Data accessibility was defined by whether the general public was able to view or download the collective data (original data, processed data, or a summary of results in reports that were not peer-reviewed articles) without creating specific user accounts, paying a fee, or contacting the project managers. Apart from putting the project on online citizen science portal sites, most citizen science projects also have their own websites which detail backgrounds, tasks, results, and impacts. Hence, where applicable, data were also extracted from the websites associated with the sampled projects.

2.2. Topic Searches

Based on the above dataset, a meta-analysis was conducted to identify any peer-reviewed articles produced from each citizen science project. The names of the projects were used as search strings. Searches were conducted on 15 September 2021 in the WoS Core Collection. We used Topic Search because it matches search strings in titles, abstracts, author keywords, and the WoS Keywords Plus field. The number of articles and the year in which the first article was published were extracted from the search results for each project. For different projects with similar names (e.g., NatureWatch (Canada) and NatureWatch (New Zealand); Frog Watch (India), FrogWatch (Canada), and FrogWatch (USA)), we manually examined the title, abstract and, if necessary, the full text of each article to match articles with the correct project.
The titles, abstracts and, if necessary, the full texts of the articles were then scanned manually to ensure that they represented scientific outputs from the projects (i.e., they presented and/or analysed data arising from the project). Articles that were not scientific outputs of a citizen science project were excluded from the results. Both qualitative descriptions and quantitative statistics (chi-squared and ANOVA comparisons) were used to analyse and compare the characteristics of the citizen science projects and their levels of performance in terms of scientific output. Statistical comparisons were conducted with SPSS Version 24.0.
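The comparisons were run in SPSS; an equivalent one-way ANOVA in Python, using scipy and made-up durations for two hypothetical groups of projects, might look like:

```python
from scipy import stats

# Hypothetical project durations in years for two groups, mirroring the
# kind of comparison made here (projects with vs. without publications)
with_pubs = [25, 18, 30, 12, 20, 15, 22]
without_pubs = [10, 8, 12, 15, 9, 11, 7]

f_stat, p_value = stats.f_oneway(with_pubs, without_pubs)
# A small p-value indicates the group means differ significantly
```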

3. Results

3.1. Characteristics of Sampled Projects

A total of 1116 citizen science projects were extracted from WoS and the online citizen science portals. However, 221 of them were duplicate projects or educational activities that did not include any scientific processes (i.e., data collection or analysis). These projects were removed from the dataset and our analysis is therefore based upon 895 citizen science projects.
The 895 citizen science projects had launch years that ranged from 1890 to 2018. The longest-running projects included the Cooperative Observer Program (COOP), which has been run by the National Weather Service and the National Centers for Environmental Information of the United States since 1890, as well as the Audubon Christmas Bird Count, launched in 1900. In our sample, 4.9% of the projects were launched before 1990 (i.e., between 1890 and 1989), before the concept of citizen science had even been defined and recognised. A further 11.3% of the sampled projects were launched from 1990 to 1999. Thereafter, the percentage of projects launched increased dramatically, with 24.7% between 2000 and 2009 and 59.1% from 2010 to 2018. The average age of projects was 12.88 years (SD = 12.89, N = 793; launch years were not available for 102 projects). It should also be noted that most (83.9%) of the projects in our sample were still active at the time of data collection for this research; hence, the average duration of citizen science projects in our sample represents a conservative snapshot of the lifespan of citizen science projects.
The countries or regions of the sampled projects were categorised according to continents. North America had the largest proportion of projects (49.1%), most of which (85.4%) focused on citizen science in the United States. Those in European countries made up 13.0% of the citizen science projects, followed by Oceania (11.4%), Asia (3.2%), and a small percentage on other continents (2.0%). The remaining 21.3% of the projects were worldwide in nature, meaning that the project covered different countries on multiple continents and that volunteers throughout the world were able to participate in the projects.
Biology (65.8%) was the dominant scientific discipline represented in our sample of citizen science projects, followed by environmental science (14.2%), astronomy (5.9%), and multidisciplinary sciences (3.9%), with the remaining categories each accounting for less than 3% (Figure 1a). Of the biological projects (Figure 1b), those involving birds (mainly bird surveys and counts) were most prevalent, though terrestrial invertebrates (mainly insects), marine organisms, and terrestrial plants each accounted for around one-tenth of the citizen science projects about biology, with more than a fifth of such projects concerning themselves with multiple taxa (biodiversity).
In terms of public access to the data amassed by citizen science projects, 57% of the projects (n = 510) made their data available to the public. This included unstructured or structured datasets to which the public had access, shared maps showing data, and summarized results in reports posted on the internet (usually the project’s website).

3.2. The Publication of Peer-Reviewed Articles

A total of 2075 peer-reviewed articles attributable to the citizen science projects in our sample were retrieved from the WoS Core Collection. These articles arose from just 221 of the citizen science projects. In other words, 674 (75.3%) of citizen science projects in our sample had not produced a single scientific output. Projects publishing one to five peer-reviewed articles constituted just 18.2% (n = 163) of all projects, while a further 4.7% (n = 42) had published six to twenty articles, and only 1.8% (n = 16) had published more than 20 articles. In sum, a relatively small proportion of citizen science projects produce the vast bulk of any scientific outputs, with just 58 (6.5%) of the 895 projects producing 85.9% (n = 1782) of the peer-reviewed scientific articles arising from them. The top five projects, representing 0.6% of the citizen science projects in our sample, produced nearly half (44.5%) of all the scientific outputs arising from citizen science projects (Table 2).
Of the citizen science projects producing peer-reviewed publications, most (n = 175, 79.2%) published their first article in 2010 or later. Only 15 projects (6.8%) published articles before 2000. The earliest publication to use data from a citizen science project drew on the Audubon Christmas Bird Count (launched in 1900) and appeared in the journal American Biology Teacher in 1977 [34]. Projects with at least one published article had been operating for longer (average = 16.98 years) than those without any scientific publications (average = 11.35 years; ANOVA, F = 31.17, p < 0.001).
The likelihood of a citizen science project producing a peer-reviewed scientific publication varied with its spatial scale and location. Worldwide projects were much more likely to publish at least one peer-reviewed article than regional projects (Pearson chi-square = 60.27, p < 0.001; see Figure 2).
The likelihood of citizen science projects producing peer-reviewed articles was also significantly influenced by their scientific discipline (Pearson chi-square = 40.30, p < 0.001). Health and medicine had the largest percentage (60.0%) of projects with at least one published article (Figure 3). Additionally, almost half the citizen science projects in climatology and astronomy produced at least one scientific publication. By contrast, projects based in computer science, social sciences, and biology had relatively low outcomes when it came to producing publications, with at least four out of five citizen science projects not producing any peer-reviewed articles. Furthermore, the low prospect of citizen science projects in biology producing any scientific outputs did not vary significantly, irrespective of the taxa that were the focus of the projects (Pearson chi-square = 6.66, p = 0.47).
Citizen science projects that made their data available to the public were much more likely to have published at least one peer-reviewed article than those that did not (Pearson chi-square = 14.38, p = 0.001). Overall, 29.4% of 510 projects that made their data accessible published one or more article(s) compared to only 19.1% of the 262 projects that did not.
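As an illustration, the comparison in this paragraph can be approximated from the reported percentages. The counts below are our rounded reconstruction, not the authors' raw data, and this 2 × 2 sketch will not reproduce the reported χ² = 14.38 exactly, which may reflect a finer categorisation of accessibility:

```python
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported percentages:
# rows = data publicly accessible / not accessible
# cols = published at least one article / published none
table = [[150, 360],   # ~29.4% of 510 accessible projects published
         [50, 212]]    # ~19.1% of 262 other projects published

chi2, p, dof, expected = chi2_contingency(table, correction=False)
# p < 0.01 here as well: accessible-data projects publish more often
```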

3.3. How Long Did It Take to Publish?

For the 221 citizen science projects that produced at least one peer-reviewed article, the average time from launch of the project to first publication was 9.15 years. The project that took the longest was the Cooperative Observer Program (climatology), which launched in 1890 and had its first journal article published in 2005. The shortest was Backyard Worlds: Planet 9 (astronomy), which began on 15 February 2017 and had its first academic article published on 24 May 2017. Overall, for citizen science projects that produced scientific publications, there was a strong negative relationship between the year a project was launched and the time taken to produce its first publication (Pearson r = −0.914, p < 0.001, n = 221), with more recently launched projects publishing more quickly (Figure 4).
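A correlation of this kind can be computed with scipy's pearsonr; the data below are hypothetical, chosen only to mimic the reported negative trend between launch year and time to first publication:

```python
from scipy.stats import pearsonr

# Hypothetical (launch year, years to first publication) pairs
launch_years = [1890, 1967, 1990, 2000, 2010, 2017]
years_to_first_pub = [115, 38, 15, 9, 4, 1]

r, p = pearsonr(launch_years, years_to_first_pub)
# r is strongly negative: newer projects publish sooner
```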
The scientific discipline of citizen science projects affected the time from launching the project to first publication when comparing 198 projects in the fields of biology, environmental science, climatology, health and medicine, and astronomy that produced scientific papers (sample sizes in the other disciplines were too small to make comparisons) (ANOVA, F = 2.52, p = 0.043, n = 198; Figure 5). Projects in health and medicine (mean = 4.67 years) and astronomy (mean = 5.91 years) took the shortest time to produce a publication, with health and medicine projects being significantly faster than the average time (mean = 9.15 years) taken by the other disciplines (Student's t-test; t = −6.12, p < 0.001). However, the time to publication for those projects that produced peer-reviewed papers did not vary by study region (ANOVA: F = 1.57, p = 0.14) or by whether projects made their data accessible to the public (ANOVA: F = 1.88, p = 0.16).

4. Discussion

This is the most thorough meta-analysis of citizen science projects to date, and it demonstrates clearly that the vast majority of citizen science projects are not conducting meaningful science.
Scholarly publication is the means by which science is both communicated and validated [35], with peer-review, whereby the findings are critically examined by experts before publication, being its almost universally practiced mechanism for quality-control [36]. While there are arguably inconsistencies and issues with the peer review process [37], publication in peer-reviewed journals or books nevertheless remains the accepted method for communicating science and determining its merit [38]. Science projects that do not produce any peer-reviewed publications therefore fail to disseminate the results of that project and avoid having the results scrutinized and their worth judged. Publication in peer-reviewed journals, then, is an essential part of any science project in order for its scientific outcomes to be considered worthwhile.
While the term citizen science suggests that science is being undertaken, as we have shown here, three-quarters of such projects never publish any peer-reviewed papers. This means that the “science” element of most citizen science projects is largely irrelevant and meaningless. Even for the relatively small percentage of citizen science projects that produce at least one peer-reviewed publication, the average time of over 9 years from launch of the project to publication suggests that even when engaged in doing real science, most citizen science projects do not do so efficiently. Indeed, as our analysis shows, it is a very small proportion of citizen science projects that provides the bulk of the scientific publications emanating from them.
The likelihood that a project will produce a peer-reviewed scientific output depends upon the discipline of the project. The two standout disciplines are health and medicine and astronomy, where approximately half of the citizen science projects in those areas produce at least one peer-reviewed publication, doing so in roughly half the time taken by projects centered in other disciplines. Indeed, Kullenberg and Kasperowski [5] noted a shift in papers produced by citizen science projects after the mid-2000s away from those that rely on bird counts to those that rely on employing digital platforms for observation, collection, and processing of data, which is a characteristic of many citizen science projects in astronomy. That is, the “citizens” are little more than sampling devices that observe and count when presented with data in digital form.
Like those in astronomy, citizen science projects in health and medicine typically use human participants as sampling machines for data collection and data analysis [39]. The same is true in the biological sciences with respect to bird counts, which, not coincidentally, is the one area of bioscience that regularly produces peer-reviewed papers: a product of the massive amount of data collected by citizens over decades [5].
By contrast, almost all other citizen science projects based in biology, which account for approximately two-thirds of all citizen science projects, have very low likelihoods of producing any scientific outputs, and even then, they typically take a long time to materialise. What is more, we found that this pattern was consistent across all taxa studied, suggesting that citizen science projects in the biological sciences, if not producing science, may well be creating value in other ways that account for their existence and increasing popularity.
Citizen science projects that are worldwide in their scale (such as those in astronomy and health and medicine) are more likely to be scientifically productive than regionally based projects, the latter being a hallmark of most citizen science projects in biology. This points to the likelihood that the actual value of many citizen science projects, especially those in biology, lies more in science engagement with the public rather than producing worthwhile science per se.
There is evidence that participants in citizen science projects experience increased engagement with science [40,41] and that this occurs frequently at a local scale and is most often associated with biological projects, especially those in the outdoors [42]. In other words, these types of citizen science projects are more about providing benefits to the participants than benefits to science.

5. Conclusions

In conclusion, our study points to a need to reclassify citizen science projects into two different types of projects:
  • Citizen Science Projects: a minority of current citizen science projects that conduct actual science and produce peer-reviewed scientific outputs, in which the participants are often little more than anonymous sampling bots.
  • Citizen Engagement Projects: the large majority of current citizen science projects that are really aimed at citizen engagement with science, in which conducting science is a secondary function and scientific outputs are unlikely.
Wiggins et al. [17] argued for changing the ways we evaluate citizen science projects to account for the value of most citizen science projects arising from something other than the production of science and the proliferation of scientific knowledge, a view echoed by Kasperowski and Kullenberg [43]. However, using alternative measures of merit and validity simply to legitimise citizen science is bound to increase criticisms and wariness that many scientists already harbour about the scientific inadequacies of citizen science [44,45].
It is far better to treat the two types of projects separately and, rather than being apologetic or making excuses for perceived weaknesses of citizen science, to play to each type of project’s strengths.

5.1. The Outlook for Citizen Science Projects

Our research identified an increasing proportion of citizen science projects that are legitimately involved in the pursuit of science. While only a quarter of the 895 citizen science projects in our study published at least one peer-reviewed scientific paper, this is up from less than 16% in Kullenberg and Kasperowski's 2016 study [5]. Perhaps the most telling result from our analysis is the highly significant negative correlation between the year projects were launched and the time to first publication (Figure 4): citizen science projects are getting faster and more proficient at generating peer-reviewed scientific outputs. That bodes well for projects, especially international ones, that can leverage the power that comes from having many participants recording data, such as those in health and medicine and astronomy.

5.2. The Outlook for Citizen Engagement Projects

Evidence is accumulating that the real value of most projects conducted under the guise of doing citizen science arises from the participants’ engagement with science [46] and science education [42]. This is especially true for projects conducted at local or regional scales and those concerned with aspects of biology. Involvement with such projects can have positive effects on participants’ well-being [47], connectedness to nature [48], development of scientific literacy [49], and involvement with societal issues and policy concerning science [50] while also promoting engagement with the public concerning the outcomes of the project [51].
In particular, citizen engagement projects have the potential to affect participants, changing their attitudes and bringing about behavioural changes that can facilitate conservation outcomes [52]. They can also facilitate mobilisation of citizen initiatives that can be used to drive policy changes, such as those affecting climate change [50]. However, the most exciting and powerful potential of citizen engagement projects is that they can be used to change participants’ attitudes and knowledge about science, thereby resulting in changed human behaviour towards the environment [53] and other “wicked problems” facing society.
The real value of citizen engagement projects, then, is not the science they produce, but the good that they can do.

5.3. Evaluation and Future Research

The big problem with most projects conducted under the umbrella of citizen science to date has been “overly simplistic” evaluations of impacts [54]. As Van Brussel and Huyse [55] did for their large-scale citizen science project on air quality in Antwerp, it is possible to evaluate the impacts of citizen science in terms of the science produced, the engagement of participants, and any influences a project may have on public policy. In their study, the dominant outcome was derived from the engagement by citizens in the project. This accords with the results of our study, given that science cannot be the primary benefit for at least three-quarters of so-called citizen science projects as they produce no meaningful science, and therefore, the benefits must lie elsewhere. Indeed, our results are in line with Irwin’s [2] original concept for citizen science as being a way to link the public with science rather than being a way to do science.
To appropriately evaluate the effects of citizen engagement in such projects, the use of surveys and mixed-method approaches is advocated [56]. When used, they show overwhelmingly that “citizen science” has become a form of public engagement with science [57]. Furthermore, by surveying the attitudes of participants, it is possible to detect changes in attitudes that occur from participating in such projects [54]. This can not only reveal changes in attitudes, but also changes in behaviour (e.g., participants in a project about the Wabash River Watershed in Indiana became motivated to help improve local water quality, leading to longer-term behavioural changes to aid water quality and conservation [58]). Perhaps the most exciting and powerful outcome of citizen engagement projects is derived from social diffusion, whereby the participants go on to affect the attitudes and behaviours of the community at large [58,59]. Even for the minority of citizen science projects that conduct actual science, where participants are often little more than data loggers, it is still possible to promote engagement [60] and thereby leverage the power of citizen engagement.
In sum, we advocate a shift in focus from citizen science to citizen engagement, with future research concentrating on the evaluation of engagement and effects on attitudes and behaviour while also documenting the degree of any social diffusion; only then will we be able to truly appreciate the benefits of bringing citizens and science together and the full potential of this union to improve our world through behavioural change and social diffusion.

Author Contributions

Conceptualization, L.S.D., L.Z. and W.F.; methodology, L.S.D. and L.Z.; software, L.Z.; validation, L.S.D. and L.Z.; formal analysis, L.S.D. and L.Z.; investigation, L.S.D. and L.Z.; resources, L.S.D. and L.Z.; data curation, L.S.D. and L.Z.; writing—original draft preparation, L.S.D., L.Z. and W.F.; writing—review and editing, L.S.D., L.Z. and W.F.; visualization, L.S.D. and L.Z.; project administration, L.S.D.; funding acquisition, L.S.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Stuart Residence Halls Council’s endowment to L.S.D.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available as they are being used as the basis for other publications.

Acknowledgments

We thank Margot Skinner, from the Stuart Residence Halls Council, and the other council members for their continued support. Without the funding to employ L.Z., this research could not have occurred.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

1. Bonney, R. Citizen science: A lab tradition. Living Bird 1996, 15, 7–15.
2. Irwin, A. Citizen Science: A Study of People, Expertise and Sustainable Development; Routledge: London, UK, 1995.
3. Oxford English Dictionary. New Words List June 2014. 2014. Available online: https://public.oed.com/updates/new-words-list-june-2014/ (accessed on 21 November 2021).
4. Oxford English Dictionary. Citizen Science. 2021. Available online: https://www.oed.com/view/Entry/33513?redirectedFrom=citizen+science#eid316619123 (accessed on 21 November 2021).
5. Kullenberg, C.; Kasperowski, D. What Is Citizen Science?—A Scientometric Meta-Analysis. PLoS ONE 2016, 11, e0147152.
6. MacPhail, V.J.; Colla, S.R. Power of the people: A review of citizen science programs for conservation. Biol. Conserv. 2020, 249, 108739.
7. Brown, E.D.; Williams, B.K. The potential for citizen science to produce reliable and useful information in ecology. Conserv. Biol. 2019, 33, 561–569.
8. Simpson, R.; Page, K.R.; De Roure, D. Zooniverse: Observing the world’s largest citizen science platform. In Proceedings of the 23rd International Conference on World Wide Web, Seoul, Republic of Korea, 7–11 April 2014.
9. Citizenscience.gov. Federal Crowdsourcing and Citizen Science Catalog. 2021. Available online: https://www.citizenscience.gov/catalog/# (accessed on 15 September 2021).
10. Kelling, S. Using bioinformatics in citizen science. In Citizen Science: Public Participation in Environmental Research; Dickinson, J.L., Bonney, R., Eds.; Cornell University Press: Ithaca, NY, USA, 2012; pp. 58–68.
11. Dickinson, J.L.; Zuckerberg, B.; Bonter, D.N. Citizen science as an ecological research tool: Challenges and benefits. Annu. Rev. Ecol. Evol. Syst. 2010, 41, 149–172.
12. Sullivan, B.L.; Aycrigg, J.L.; Barry, J.H.; Bonney, R.E.; Bruns, N.; Cooper, C.B.; Damoulas, T.; Dhondt, A.A.; Dietterich, T.; Farnsworth, A.; et al. The eBird enterprise: An integrated approach to development and application of citizen science. Biol. Conserv. 2014, 169, 31–40.
13. Van Horn, G.; Aodha, O.M.; Song, Y.; Cui, Y.; Sun, C.; Shepard, A.; Adam, H.; Perona, P.; Belongie, S. The iNaturalist Species Classification and Detection Dataset. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018.
14. Follett, R.; Strezov, V. An analysis of citizen science based research: Usage and publication patterns. PLoS ONE 2015, 10, e0143687.
15. Connors, J.P.; Lei, S.; Kelly, M. Citizen Science in the Age of Neogeography: Utilizing volunteered geographic information for environmental monitoring. Ann. Assoc. Am. Geogr. 2012, 102, 1267–1289.
16. Darch, P.T. Managing the Public to Manage Data: Citizen science and astronomy. arXiv 2017, arXiv:1703.00037.
17. Wiggins, A.; Wilbanks, J. The rise of citizen science in health and biomedical research. Am. J. Bioeth. 2019, 19, 3–14.
18. Prestopnik, N.R.; Tang, J. Points, stories, worlds, and diegesis: Comparing player experiences in two citizen science games. Comput. Hum. Behav. 2015, 52, 492–506.
19. Bonney, R.; Phillips, T.B.; Ballard, H.L.; Enck, J.W. Can citizen science enhance public understanding of science? Public Underst. Sci. 2016, 25, 2–16.
20. Jordan, R.C.; Gray, S.A.; Howe, D.V.; Brooks, W.R.; Ehrenfeld, J.G. Knowledge gain and behavioral change in citizen-science programs. Conserv. Biol. 2011, 25, 1148–1154.
21. Wiggins, A.; Newman, G.; Stevenson, R.D.; Crowston, K. Mechanisms for data quality and validation in citizen science. Presented at the 2011 IEEE Seventh International Conference on e-Science Workshops, Stockholm, Sweden, 5–8 December 2011.
22. Theobald, E.J.; Ettinger, A.K.; Burgess, H.K.; DeBey, L.B.; Schmidt, N.R.; Froehlich, H.E.; Wagner, C.; HilleRisLambers, J.; Tewksbury, J.; Harsch, M.A.; et al. Global change and local solutions: Tapping the unrealised potential of citizen science for biodiversity research. Biol. Conserv. 2015, 181, 236–244.
23. Hecker, S.; Wicke, N.; Haklay, M.; Bonn, A. How does policy conceptualise citizen science? A qualitative content analysis of international policy documents. Citiz. Sci. Theory Pract. 2019, 4, 32.
24. Evans, C.; Abrams, E.; Reitsma, R.; Roux, K.; Salmonsen, L.; Marra, P.P. The Neighborhood Nestwatch Program: Participant outcomes of a citizen-science ecological research project. Conserv. Biol. 2005, 19, 589–594.
25. Davis, L.S. Popularizing Antarctic science: Impact factors and penguins. Aquat. Conserv. Mar. Freshw. Ecosyst. 2007, 17 (Suppl. 1), S148–S164.
26. Neylon, C.; Wu, S. Article-level metrics and the evolution of scientific impact. PLoS Biol. 2009, 7, e1000242.
27. Freitag, A.; Pfeffer, M.J. Process, not product: Investigating recommendations for improving citizen science “success”. PLoS ONE 2013, 8, e64079.
28. Franzoni, C.; Sauermann, H. Crowd science: The organisation of scientific research in open collaborative projects. Res. Policy 2014, 43, 1–20.
29. Aceves-Bueno, E.; Adeleye, A.S.; Feraud, M.; Huang, Y.; Tao, M.; Yang, Y.; Anderson, S.E. The accuracy of citizen science data: A quantitative review. Bull. Ecol. Soc. Am. 2017, 98, 278–290.
30. Balázs, B.; Mooney, P.; Nováková, E.; Bastin, L.; Arsanjani, J.J. Data quality in citizen science. In The Science of Citizen Science; Springer: Berlin/Heidelberg, Germany, 2021; p. 139.
31. Bonter, D.N.; Cooper, C.B. Data validation in citizen science: A case study from Project FeederWatch. Front. Ecol. Environ. 2012, 10, 305–307.
32. Cooper, C.B.; Shirk, J.; Zuckerberg, B. The invisible prevalence of citizen science in global research: Migratory birds and climate change. PLoS ONE 2014, 9, e106508.
33. Silvertown, J. A new dawn for citizen science. Trends Ecol. Evol. 2009, 24, 467–471.
34. Ferner, J.W. The Audubon Christmas Bird Count: A Valuable Teaching Resource. Am. Biol. Teach. 1977, 39, 533–535, 544.
35. Hames, I. Peer Review and Manuscript Management in Scientific Journals: Guidelines for Good Practice; Blackwell Publishing Ltd.: Malden, MA, USA, 2007.
36. Kelly, J.; Sadeghieh, T.; Adeli, K. Peer Review in Scientific Publications: Benefits, Critiques, & A Survival Guide. EJIFCC 2014, 25, 227–243.
37. Horbach, S.P.J.M.; Halffman, W. The changing forms and expectations of peer review. Res. Integr. Peer Rev. 2018, 3, 8.
38. Kirman, C.R.; Simon, T.W.; Hays, S.M. Science peer review for the 21st century: Assessing scientific consensus for decision-making while managing conflict of interests, reviewer and process bias. Regul. Toxicol. Pharmacol. 2019, 103, 73–85.
39. Ciasullo, M.V.; Carli, M.; Lim, W.M.; Palumbo, R. An open innovation approach to co-produce scientific knowledge: An examination of citizen science in the healthcare ecosystem. Eur. J. Innov. Manag. 2022, 25, 365–392.
40. Mazel-Cabasse, C. Modes and Existences in Citizen Science: Thoughts from Earthquake Country. Sci. Technol. Stud. 2018, 32, 34–51.
41. Skarlatidou, A.; Haklay, M. Citizen science impact pathways for a positive contribution to public participation in science. J. Sci. Commun. 2021, 20, A02.
42. Phillips, T.B.; Ballard, H.L.; Lewenstein, B.V.; Bonney, R. Engagement in science through citizen science: Moving beyond data collection. Sci. Educ. 2019, 103, 665–690.
43. Kasperowski, D.; Kullenberg, C. The many modes of citizen science. Sci. Technol. Stud. 2019, 32, 2–7.
44. Burgess, H.K.; DeBey, L.B.; Froehlich, H.E.; Schmidt, N.; Theobald, E.J.; Ettinger, A.K.; HilleRisLambers, J.; Tewksbury, J.; Parrish, J.K. The science of citizen science: Exploring barriers to use as a primary research tool. Biol. Conserv. 2017, 208, 113–120.
45. Gadermaier, G.; Dörler, D.; Heigl, F.; Mayr, S.; Rüdisser, J.; Brodschneider, R.; Marizzi, C. Peer-reviewed publishing of results from Citizen Science projects. JCOM 2018, 17, L01.
46. Golumbic, Y.N.; Baram-Tsabari, A.; Fishbain, B. Engagement styles in an environmental citizen science project. J. Sci. Commun. 2020, 19, A03.
47. Williams, C.R.; Burnell, S.M.; Rogers, M.; Flies, E.J.; Baldock, K.L. Nature-Based Citizen Science as a Mechanism to Improve Human Health in Urban Areas. Int. J. Environ. Res. Public Health 2022, 19, 68.
48. Williams, K.A.; Hall, T.E.; O’Connell, K. Classroom-based citizen science: Impacts on students’ science identity, nature connectedness, and curricular knowledge. Environ. Educ. Res. 2021, 27, 1037–1053.
49. Brandt, M.; Groom, M.A.; Misevic, D.; Narraway, C.L.; Bruckermann, T.; Beniermann, A.; Børsen, T.; González, J.; Meeus, S.; Roy, H.E.; et al. Promoting scientific literacy in evolution through citizen science. Proc. R. Soc. B 2022, 289, 20221077.
50. Kythreotis, A.P.; Mantyka-Pringle, C.; Mercer, T.G.; Whitmarsh, L.E.; Corner, A.; Paavola, J.; Chambers, C.; Miller, B.A.; Castree, N. Citizen Social Science for More Integrative and Effective Climate Action: A Science-Policy Perspective. Front. Environ. Sci. 2019, 7, 10.
51. MacLeod, C.J.; Scott, K. Mechanisms for enhancing public engagement with citizen science results. People Nat. 2021, 3, 32–50.
52. Toomey, A.H.; Domroese, M.C. Can citizen science lead to positive conservation attitudes and behaviors? Hum. Ecol. Rev. 2013, 20, 50–62.
53. Crall, A.W.; Jordan, R.; Holfelder, K.; Newman, G.J.; Graham, J.; Waller, D.M. The impacts of an invasive species citizen science training program on participant attitudes, behavior, and science literacy. Public Underst. Sci. 2012, 22, 745–764.
54. Somerwill, L.; Wehn, U. How to measure the impact of citizen science on environmental attitudes, behaviour and knowledge? A review of state-of-the-art approaches. Environ. Sci. Eur. 2022, 34, 18.
55. Van Brussel, S.; Huyse, H. Citizen science on speed? Realising the triple objective of scientific rigour, policy influence and deep citizen engagement in a large-scale citizen science project on ambient air quality in Antwerp. J. Environ. Plan. Manag. 2019, 62, 534–551.
56. Hajibayova, L.; Coladangelo, L.P.; Soyka, H.A. Exploring the invisible college of citizen science: Questions, methods and contributions. Scientometrics 2021, 126, 6989–7003.
57. Kam, W.; Haklay, M.; Lorke, J. Exploring factors associated with participation in citizen science among UK museum visitors aged 40–60: A qualitative study using the theoretical domains framework and the capability opportunity motivation-behaviour model. Public Underst. Sci. 2021, 30, 212–228.
58. Church, S.P.; Payne, L.B.; Peel, S.; Prokopy, L.S. Beyond water data: Benefits to volunteers and to local water from a citizen science program. J. Environ. Plan. Manag. 2019, 62, 306–326.
59. Asingizwe, D.; Poortvliet, P.M.; van Vliet, A.J.H.; Koenraadt, C.J.M.; Ingabire, C.M.; Mutesa, L.; Leeuwis, C. What do people benefit from a citizen science programme? Evidence from a Rwandan citizen science programme on malaria control. Malar. J. 2020, 19, 283.
60. Spiers, H.; Swanson, A.; Fortson, L.; Simmons, B.D.; Trouille, L.; Blickhan, S.; Lintott, C. Patterns of Volunteer Behaviour Across Online Citizen Science. In Proceedings of the WWW 2018 Companion Proceedings of the World Wide Web Conference, Lyon, France, 23–27 April 2018; pp. 93–94.
Figure 1. Categories of sampled citizen science projects, n = 895 (a) and a breakdown of biology, n = 589 (b).
Figure 2. The proportion of projects that had at least one published peer-reviewed article by different spatial scales (regions). n = 895.
Figure 3. The proportion of projects in different scientific disciplines that had at least one published peer-reviewed article. n = 895.
Figure 4. Time in years taken to produce the first peer-reviewed publication in relation to the year citizen science projects were launched.
Figure 5. Average time taken from launching a citizen science project to publication of its first peer-reviewed article, according to the scientific discipline of the project.
Table 1. The citizen science portal sites that aggregate the citizen science projects analysed in this study (as accessed on 15 September 2021).

| Name | URL | Region | Fields |
|---|---|---|---|
| Australian Citizen Science Association | citizenscience.org.au | Australia | All |
| British Trust for Ornithology | www.bto.org | UK | Biology |
| Citizen Science Portal | www.ic.gc.ca | Canada | All |
| CitizenScience.gov | www.citizenscience.gov | United States | All |
| iNaturalist Citizen Science | www.inaturalist.org/projects | Worldwide | Biology |
| Science Learning Hub | www.sciencelearn.org.nz | New Zealand | All |
| SciStarter | scistarter.org | Worldwide | All |
| Zooniverse | www.zooniverse.org | Worldwide | All |
Table 2. The top five citizen science projects with respect to output of peer-reviewed publications.

| Project | Region | Year Launched | Scientific Discipline | Papers |
|---|---|---|---|---|
| North American Breeding Bird Survey | North America | 1966 | Biology (Ornithology) | 265 |
| eBird | Worldwide | 2002 | Biology (Ornithology) | 222 |
| Galaxy Zoo | Worldwide | 2007 | Astronomy | 190 |
| DISCOVER-AQ | North America | 2011 | Environmental Science | 138 |
| iNaturalist | Worldwide | 2008 | Biology (Biodiversity) | 109 |

Share and Cite

MDPI and ACS Style

Davis, L.S.; Zhu, L.; Finkler, W. Citizen Science: Is It Good Science? Sustainability 2023, 15, 4577. https://doi.org/10.3390/su15054577

