
Environmental education outcomes of community and citizen science: a systematic review of empirical research

Pages 1007-1040 | Received 20 Dec 2021, Accepted 22 Apr 2024, Published online: 04 Jun 2024

Abstract

Citizen science, community science, and related participatory approaches to scientific research and monitoring are increasingly used by environmental educators and conservation practitioners to achieve environmental education (EE) goals. However, evidence of EE learning outcomes from these approaches is typically reported on a case-by-case basis, if at all. We undertook a systematic review of empirical studies in which community and citizen science (CCS) projects led to EE outcomes. We surfaced 100 studies that met our inclusion criteria, using a broad definition of CCS and requiring empirical research on EE outcomes. The studies involved people in a wide variety of aspects of environmental research that included, but also went beyond, data collection. We found that CCS approaches to EE overall resulted in positive learning outcomes for adults and youth, particularly gains in science content knowledge (56 articles), science inquiry skills (32), positive attitudes toward science and the environment (16), and self-efficacy toward science and the environment (11). We also found evidence for positive gains in environmental behavior and stewardship (29) and community connectedness and cooperation outcomes (30). These findings highlight how CCS programs may be uniquely impactful when involving people in the planning, data analysis, and reporting aspects of environmental research and monitoring, as well as in data collection, and we offer examples and suggestions for CCS design. However, we found a heavy reliance on self-reporting in the research methods of many studies, and so offer suggestions for more rigorous methods and directions for future research.


Over the past several decades, interest in community and citizen science (CCS) as a strategy for environmental education (EE) has increased dramatically amongst scholars and practitioners alike. In the context of the socio-ecological crises facing the planet, there is a growing body of multidisciplinary research as well as curricula, programs, and organizations advancing participation in CCS as a valuable way to support learning in, with, and about the environment (Dillon et al. Citation2016; National Academies, 2018; Reid et al. Citation2021). Much of this research, however, remains conceptual, often citing the potential of CCS as a tool for EE (Bela et al. Citation2016), with only a subset examining empirical evidence of EE outcomes for participants. In light of the myriad promises regarding CCS, a systematic review of empirical research in this field can provide a foundation for understanding what EE outcomes actually result from CCS and how, and can challenge the field to more deeply and broadly seek evidence for when, how, and why participation in CCS might accomplish EE goals. Similarly, systematic reviews of related fields whose outcomes align with traditional EE outcomes can help the field of EE continue to coalesce the research evidence for learning outcomes (Ardoin et al. Citation2018; Clark et al. Citation2020).

In light of this, in 2013, the North American Association for Environmental Education (NAAEE) launched an initiative called eeWorks: From Anecdotes to Evidence (https://naaee.org/programs/eeworks) inviting teams of scholars to conduct comprehensive international research reviews to investigate the evidence for achieving outcomes of EE in several priority areas, including citizen science and related participatory approaches, to demonstrate the impact and value of EE. The reviews occurred from 2013 to 2018: for example, the review examining outcomes for K-12 students ended in 2013 (Ardoin et al. Citation2018), the review of climate change education ended in 2015 (Monroe et al. Citation2019), and the review of early childhood EE ended in 2016 (Ardoin et al. Citation2020). In this context, we conducted a systematic review of the empirical, peer-reviewed research literature linking CCS and EE, with the goals of: (1) characterizing the studies that meet criteria for inclusion in this review, (2) identifying the extent to which EE learning outcomes have been empirically measured as resulting from CCS participation, (3) developing recommendations for the design and implementation of CCS for EE practitioners, and (4) suggesting directions for future research. This review examined research ending in 2018 in keeping with the other review papers in eeWorks, to establish a foundational baseline of empirical findings from which the field may build, critique, and expand as the fields of CCS and EE evolve.

Community and citizen science (CCS)

Despite having received increased attention in recent years, CCS is not a new way of doing environmental science. Members of the public have a long history of recording their observations of the natural world, generating data that have been used to advance science in ways that would otherwise be impossible (Miller-Rushing et al. Citation2012). There continue to be debates concerning definitions of citizen science, community science, and public participation in scientific research (PPSR), and concerning how these align with associated approaches such as volunteer monitoring, community-based participatory research, and other participatory approaches to research and monitoring (Cooper et al. Citation2021; Eitzel et al. Citation2017; Haklay, Citation2013; Shirk et al. Citation2012). Because we consider scientist-led (citizen science) and community-led (community science) endeavors distinct but related parts of the same spectrum of participatory approaches to research and monitoring, we use the term community and citizen science (CCS) for this review, defined as including the range of participatory ways of doing science that involve members of the public in some or all parts of scientific research or monitoring projects for which the data or results are used for monitoring, decision-making, or basic research (Ballard et al. Citation2017a). This includes projects that range from being ‘contributory’ in nature, where public participation is limited primarily to data collection and/or analysis, to those that are ‘co-created’, where public participation plays a role in all parts of the scientific research, from defining the research questions to acting on the results (Shirk et al. Citation2012; Bonney et al. Citation2009). While CCS occurs in a range of scientific fields, from astronomy to genetics to paleontology and increasingly in the humanities and social sciences (Tauginienė et al. Citation2020), we focused our review on studies of projects related to biodiversity, conservation, and the environment, including the fields of environmental health and justice.

Because learning is intimately tied to participation in different social practices, such as those within CCS projects, a key goal of this study was to examine evidence of the degree to which participation in the different activities of scientific research related to EE learning outcomes. Early on, Shirk et al. (Citation2012) focused on ‘stages of the scientific process’ in which people participate, ranging from contributory to collaborative to co-created as described above, as did Dillon et al. (Citation2016) using the terms science-driven and transition-driven citizen science. As several authors have pointed out, what influences learning is not just participation but also the control over the scientific process that participants have (Shirk et al. Citation2012; Haklay Citation2013).

In the end, most typologies of CCS and related approaches categorize based on the degree or extent to which members of the public participate in the main activities of scientific research (Haklay Citation2013; Wiggins & Crowston Citation2011; Eitzel et al. Citation2017; Shirk et al. Citation2012), and these articles were frequently cited by studies in the review when describing their participatory approaches. Consequently, we argue that it is the authentic participation in one or more of these main activities of scientific research that can be considered the ‘instructional approach’ that CCS projects have to offer the fields of environmental and science education. Therefore, for the purposes of this review, we used an aggregate of the categories offered by all these typologies, rather than delving into sociology and philosophy of science literature.

Identifying science and environmental education outcomes of CCS

As an educational approach, CCS intertwines science learning outcomes with environmental learning outcomes, both of which we had to consider in our review. Wals et al. (Citation2014) argue that citizen science may serve as a nexus between science education, which focuses on teaching knowledge and skills, and environmental education, which additionally focuses on values and behaviors with respect to the environment, in order to produce a public who can respond to pressing environmental problems. Lindgren et al. (Citation2021) further argue that the overlapping goals of science education and environmental education converge around the practices of questioning, analysis, and interpretation as key outcomes, resonant with key features of CCS activities described above. Hence, in examining the evidence for CCS in promoting EE learning outcomes, we draw on previous research from both science education and environmental education outcomes to develop our analytical frame.

Recently, interest in the potential of CCS to improve public understanding of science as well as youth science education (Bonney et al. Citation2016; Kloetzer et al. Citation2021) has skyrocketed. Researchers and educators view CCS as a strategy to facilitate meaningful engagement in authentic scientific inquiry and investigation of place-based phenomena (Harris et al. Citation2020; Calabrese Barton, Citation2012; Mueller et al. Citation2012). This is in part because effective science education reflects the ways scientists actually work (National Research Council, 2009), and participation in different science practices may spur different forms of science reasoning and sensemaking practices (Hayes et al. Citation2020). In a comprehensive literature synthesis, Phillips et al. (Citation2018) identified science learning outcomes that might result from CCS participation (science content knowledge, science inquiry skills, self-efficacy with science and the environment, knowledge of the nature of science, science interest), which we used to analyze articles included in our review, with a particular focus on environmental science as the domain rather than all natural sciences more broadly.

Beyond science-focused learning outcomes, CCS is a promising strategy for achieving the broader environmental education goals that include but are not limited to science learning. To determine these, we turned to scholarship focused on defining key EE outcomes for individuals and communities that range from content knowledge to civic engagement, nature connectedness, pro-environmental behavior, and care for the natural world (Clark et al. Citation2020; Jacobson et al. Citation2015; Krasny, Citation2020). Employing a Delphi study of EE professionals, Clark et al. (Citation2020) identified five core outcome areas for EE: environmentally related action and behavior change, connection to nature, improving health of the environment, improving social and cultural aspects of the human experience, and learning the skills and competencies necessary to engage in environmentally related decision-making and behaviors, which build on and align with the key components of environmental literacy identified by the North American Association for Environmental Education (Hollweg et al. Citation2011).

In addition to these outcome areas, we considered areas that EE researchers are increasingly focused on that reflect the capacities of individuals and communities to help resolve complex social-ecological issues (Ardoin et al. Citation2013; Stevenson et al. Citation2013). First, CCS projects can provide opportunities for people to participate and contribute to the life of their community, for example, through volunteer water quality monitoring of local streams, fostering scientific literacy that is inclusive of and extends beyond K-12 settings (Roth & Calabrese Barton, Citation2004); hence, we considered ‘community connectedness and cooperation’ as a possible EE outcome from CCS. We also see important research on ‘environmental behaviors and stewardship’ as an EE outcome, emphasizing the relationship between participation, sensemaking, and stewardship (Korfiatis & Petrou, Citation2021), and between sense of place, connection to nature, and environmental behaviors (Gosling & Williams, Citation2010). When participants see the scientific work they have done on a CCS project get taken up in conservation efforts or policy, this can influence the degree to which some participants experience stewardship learning outcomes (Ballard et al. Citation2017b). Finally, CCS researchers have examined ‘sense of place’ and associated place values, including place identity and place attachment (a positive connection or emotional bond with a place; Gosling & Williams, Citation2010), and found these types of constructs were associated with some kinds of CCS participation (Haywood et al. Citation2016). We view these constructs, which we cluster together as ‘place values’, as parallel to Clark et al.’s (Citation2020) core EE outcome of connection to nature (the extent to which an individual feels part of nature; Schultz, Citation2001).
In summary, by encompassing key outcome areas for science and environmental education, we aimed to capture the particular learning outcomes that may be afforded by participation in environmental research and monitoring.

Reviews of research in CCS

In the context of the several systematic reviews examining empirical research in EE and the increased attention to CCS as a strategy for EE, we conducted a systematic review of the peer-reviewed academic literature to shed light on what empirical evidence existed linking the two. Few reviews of this nature and scope exist. Peter et al. (Citation2019), for instance, reviewed the literature through 2017 on the participant outcomes of biodiversity citizen science projects, but used a more limited definition of CCS and reviewed only fourteen studies. Others have focused on climate change education outcomes (Groulx et al. Citation2017) or on environmental monitoring (Stepenuck & Green, Citation2015), or on aspects of participation but did not measure learning outcomes (Vasiliades et al. Citation2021). In our review, we sought to use the expansive definition of CCS described above, based on the historical development of the field, to better understand what learning outcomes have been documented across the broad spectrum of project types and environmental topics that exist in practice, focusing on outcomes relevant to EE (details in Methods).

Our research questions guiding this review were:

  1. What are the characteristics of CCS projects that reported EE learning outcomes?

  2. What evidence do the studies provide for EE learning outcomes achieved by CCS projects?

  3. How did participation in specific scientific research activities impact EE learning outcomes?

  4. What research methods and approaches have been used to study the EE learning outcomes of CCS, and what are the implications for future research?

Methods

Systematic review approach

Marcinkowski (Citation2003) describes the purposes of reviews of research: ‘(a) to identify research studies; (b) to describe or characterize research studies; (c) to critique research studies; and (d) to summarize or synthesize the results or claims of research studies’ (183). Following Grant and Booth’s (Citation2009) typology of reviews, we systematically searched the literature, applied strict inclusion and exclusion criteria, and synthesized what we know about CCS as a process of EE from the body of literature. We approached this review through a systematic qualitative analysis with the primary goals of identifying, describing, and synthesizing the claims of studies across diverse methods and research paradigms. We provide here a PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flow diagram of our process (Figure 1), to report transparently how the review was conducted and what was found (Page et al. Citation2021).

Figure 1. PRISMA flow diagram of systematic review methods based on Moher et al. (Citation2009).


Literature search

We began our review by performing pilot searches using different terms to identify articles focused on CCS and EE learning outcomes, given the variety of terms used across multiple fields as discussed above. For all searches we used EBSCOhost to search all the databases included in the University of California, Davis subscription: Academic Search Complete as an all-around database, education indexes such as ERIC, Education Source, and EdArXiv, and environment indexes such as Ecology Abstracts and BIOSIS; we confirmed that all the major journals of the environmental education field were included in these databases. Terms at this early stage included, but were not limited to: environmental education, civic engagement, youth development, participatory, and conservation. We examined the returns of each search to determine if we were gaining relevant articles; those results brought us to the final list of keywords (Figure 2) that we used to conduct an unqualified search (meaning the database searched the title, abstract, subject, keywords, and author) with no start date and ending at our search date of September 2018 (see below for Limitations).

Figure 2. Keyword Boolean code search criteria.


In Figure 2, the first nesting of terms aims to capture all the articles in the area of community and citizen science. With feedback from a diverse set of colleagues in the EE and CCS fields, we intentionally included a broad array of participatory research approaches as search terms to ensure the search captured the diverse forms of public participation in science that can be considered CCS, given our earlier definition. To limit results to articles that included research on educational outcomes, we included the second nesting of terms. The final nesting of terms aims to limit included articles to papers within the realm of environmental topics only and exclude those from other scientific fields. We also limited the search to peer-reviewed articles written in English. This produced 1639 unique articles for initial review published prior to September 2018, which was the end of the grant funding and target timeframe for this component of the eeWorks initiative. (For a further scan of the literature from 2018–2023 to preliminarily examine whether themes persisted, see the Discussion.)
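The nested Boolean logic described above — three OR-ed term groups AND-ed together — can be sketched as a simple filter. The term lists below are abbreviated, hypothetical placeholders, not the full keyword set used in the actual EBSCOhost search:

```python
# Illustrative sketch of the three-part nested Boolean search logic.
# The term lists are abbreviated placeholders, NOT the full keyword
# set used in the actual search (see Figure 2 for the real criteria).

CCS_TERMS = ["citizen science", "community science", "volunteer monitoring",
             "participatory research"]            # nesting 1: CCS approaches
OUTCOME_TERMS = ["learning", "education", "knowledge",
                 "attitude"]                      # nesting 2: educational outcomes
ENVIRONMENT_TERMS = ["environment", "conservation", "biodiversity",
                     "ecology"]                   # nesting 3: environmental scope

def matches_search(text: str) -> bool:
    """An article matches only if it contains at least one term from
    EACH of the three nested (OR-ed) groups, AND-ed together."""
    text = text.lower()
    return all(
        any(term in text for term in group)
        for group in (CCS_TERMS, OUTCOME_TERMS, ENVIRONMENT_TERMS)
    )

print(matches_search(
    "A citizen science project measuring learning about biodiversity"))  # True
print(matches_search(
    "A citizen science project on galaxy classification"))               # False
```

The second example is excluded because, although it matches a CCS term, it contains no educational-outcome or environmental term, mirroring how the nesting screens out citizen science from other scientific fields.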

Identifying articles for inclusion

We imported citations and abstracts from the keyword search into Covidence systematic review management software (Babineau, Citation2014) and used a decision tree to exclude sources that did not meet our criteria (). The decision tree was refined and calibrated by team member review using a pool of 20 articles. Team members then reviewed random subsets of abstracts until each abstract had been reviewed by two people. The decision tree continued to be refined during meetings to resolve conflicts between readers. Abstracts that included unclear wording of EE outcomes remained categorized as ‘maybes’ and were prioritized for full-text review.

Requirements included that articles be peer-reviewed, written in English, and not book reviews or dissertations. Authors of articles had to identify their approach using at least one of our keywords for CCS inclusion. Additionally, abstracts had to include at least one of the following participant outcomes in this phase of the screening process (see our final list of EE learning outcomes defined below): knowledge of environmental science, environmental science skills, environmentally responsible behavior change, attitude toward the environment, and/or participation in environmental science, which the North American Association for Environmental Education identifies as key components of environmental literacy (Hollweg et al. Citation2011). Some of the most commonly excluded articles were about public health topics that did not include an explicit environmental focus, like food access and urban planning. When we applied the decision tree criteria, we excluded 74% (n = 1221) of the initial article pool ().

Following abstract screening, the remaining 26% of the articles went to full-text review. This process involved a higher level of scrutiny and deeper reading of articles, leading to the exclusion of an additional 88 articles (5%) that failed the decision tree after full-text review. Articles were then sorted to retain only those that included empirical findings of participant outcomes, resulting in 100 articles (6%) that met decision tree criteria and empirically investigated participant outcomes related to EE. We then included these 100 articles for full coding and analysis.

Developing coding scheme and process

We developed categorical codes both deductively and inductively. We began with our original study objectives and developed categorical codes, then added additional codes as they emerged from article review, as well as sub-codes (). Further, because CCS projects aim to produce data or results used for scientific research, monitoring, or decision-making, we see participation in those scientific research activities as the most distinguishing educational approach that CCS projects employ compared to other EE initiatives, and we therefore endeavored to identify the nuances of participation in the articles, using sub-codes for the major activities of scientific research as explained below (Eitzel et al. Citation2017; Haklay, Citation2013; Shirk et al. Citation2012). We drew from Shirk et al.’s (Citation2012), Haklay’s (Citation2013), and Wiggins and Crowston’s (Citation2011) categories of ‘parts of the scientific process’ in which members of the public participated to arrive at our categories. We recognize that, in reality, there is no singular scientific method; instead, there are multiple ways in which scientists may engage in inquiry across disciplines and paradigms (Ault, Citation2023; NRC, 2012). While these categories can, on one hand, be read as overly rigid and prescriptive, on the other hand, we recognize they are also iterative, fluid, and overlapping (Ault, Citation2023), and depend very much on the discipline and/or paradigm at play. For the purposes of our analysis, the goal was to categorize the myriad activities and tasks that make up an environmental science research or monitoring project, rather than to discern scientific reasoning practices, such as those described in the U.S. Next Generation Science Standards (NRC, 2012), or to encompass the nuances of each discipline.
We, therefore, delineated what are commonly recognized in the CCS typologies as the ‘main scientific research activities’ in which people in CCS participate (codes in ) and created four clusters based on the predominant activities for each (planning, data collection, data analysis, and reporting; see for details).

Table 1. Example main codes for analysis of empirical articles.

Table 2. Environmental education learning outcomes and definitions.

For participant learning outcomes codes, we brought together key outcome areas for science education and environmental education, taking into consideration the particular context of people participating in CCS. Specifically, as we explain above, we included versions of EE outcomes that drew from Clark and colleagues’ (2020) five core EE outcomes and Hollweg and colleagues’ (2011) environmental literacy outcomes, tailored to the CCS context, such as environmental stewardship and behaviors (Phillips et al. Citation2018), community outcomes (Jordan et al. Citation2012), and place-based outcomes (Haywood et al. Citation2016; Kudryavtsev et al. Citation2012) (). We also included the science-focused learning outcomes of citizen science identified through a literature review of science education research by Phillips et al. (Citation2018) with a focus on the domain of environmental science; many of these outcomes were also used by a recent narrower review of biodiversity citizen science projects by Peter et al. (Citation2019) ().

Using the main categories and sub-codes, coders filled out a Google form to assign non-mutually exclusive sub-codes to each article, applying presence/absence to each. We established inter-rater reliability by two coders coding the same 20 articles, discussing differences in coding, and revising and refining the codes and definitions accordingly. Both coders coded another five articles separately and achieved total agreement on coding. The remainder of the articles were coded by one coder. In addition to looking at each learning outcome across the total set of articles we reviewed, we also wanted to examine how different outcomes might link to particular CCS approaches to participation, and whether participation in more than one activity of scientific research led to particular learning outcomes. We, therefore, determined the number of articles with each learning outcome for each cluster, and examined these to identify the ways in which authors described how each type of participation may have led to those learning outcomes.
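The tallying step described above can be sketched as a cross-tabulation over non-mutually exclusive presence/absence codes. The article records and code names below are hypothetical, chosen only to show the counting logic:

```python
from collections import Counter

# Hypothetical coded records: each article carries non-mutually-exclusive
# sets of participation clusters and learning-outcome codes (presence/absence).
articles = [
    {"clusters": {"data collection"},
     "outcomes": {"content knowledge"}},
    {"clusters": {"data collection", "data analysis"},
     "outcomes": {"content knowledge", "inquiry skills"}},
    {"clusters": {"planning", "data collection", "reporting"},
     "outcomes": {"stewardship", "community connectedness"}},
]

# Count the number of articles reporting each learning outcome
# within each participation cluster.
crosstab = Counter(
    (cluster, outcome)
    for article in articles
    for cluster in article["clusters"]
    for outcome in article["outcomes"]
)

print(crosstab[("data collection", "content knowledge")])  # 2
print(crosstab[("planning", "stewardship")])               # 1
```

Because codes are not mutually exclusive, a single article can contribute to several cluster-outcome cells, which is why the per-outcome article counts reported below can sum to more than the 100 articles reviewed.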

Results

RQ 1) What are characteristics of CCS programs that reported EE learning outcomes?

We report here first the age groups of participants, whether projects were primarily field-based, online, or both, and the main topics and taxa that CCS projects focused on. We then report the specific scientific activities in which people participated in order to reveal the predominant ways people participated in CCS in studies with EE outcomes.

Age groups

A slim majority of the studies targeted adults (54%), with 13% of all articles focused on college students. Twenty-nine of the articles focused on young people ages 4–18 years old, with middle-school-aged youth (11 articles) and high-school-aged youth (10 articles) the most commonly studied. Over one-third (35%) of the studies did not report the age group of the participating audience. Some of these were large app-based projects where participants did not report demographic information (e.g. iNaturalist), though many articles referred to ‘the community’ in ways that implied participants were primarily adults.

Field-based vs. online

The majority of articles (81) reported on CCS projects in which participants were involved in field-based settings (where the primary activities were outdoors and/or in-person, e.g., a BioBlitz (one-day/short-term events to inventory biodiversity in a bounded location like a park or city), bird and plant phenology projects, or water quality monitoring). Relatively few articles (5) reported involvement that was entirely online (e.g., online wildlife camera image identification platform or other image classification platforms such as Zooniverse), the low number of which likely reflects the review end date. Fourteen articles reported on projects that included both field-based and online settings, such as a participatory mapping project that involved field-based data collection as well as web-based digital mapping for community bushfire preparation in Tasmania (Haworth et al. Citation2016). In another example, university students created walking maps of health hazards with community members in South Carolina while also using GIS to analyze spatial disparities (Wilson et al. Citation2012).

Environmental topics or issues

The articles reported on CCS projects that investigated a broad range of environmental issues, from more conventional scientific topics such as particular taxa (14 articles focused on insects, and 8 articles each focused on birds, mammals, and plants) or biodiversity in general (8 articles focused on terrestrial and 4 on aquatic biodiversity), to explicitly socio-scientific topics: fifteen studies on environmental health examined human exposure to environmental pollutants, with people collecting data about symptoms and health impacts of pollution (7 articles on air quality and 8 articles on water quality monitoring). Additionally, fifteen environmental social science studies included local people gathering perspectives on natural disaster preparedness, or using participatory mapping and photovoice to document habitat and natural resource loss or people’s willingness to pay for ecosystem service restoration. There were also 23 articles focused on broader place-based conservation topics, nine on habitat loss and deforestation, and nine each on invasive species or on food and agriculture-related conservation issues.

Specific scientific research activities in which people participated

A key goal for this study was to examine the specific ways in which participants took part in CCS projects that had EE outcomes, rather than simply putting everything under the broad umbrella of ‘participation.’ Unsurprisingly, we found that the predominant way participants engaged in scientific research was through collecting data or samples (81 articles) (). However, many studies also involved participants in stages post-data collection, wherein they analyzed data (26), interpreted data/drew conclusions (21), or disseminated or translated results into action (21). We provide examples of how people participated in these activities below. Relatively fewer articles reported on projects where participants were involved in early stages of scientific research, like choosing/defining the research question (13) or designing data collection methods (13) (). Somewhat surprisingly, we also found a number of articles (28) that engaged participants through documenting/reporting their own local ecological knowledge (LEK). For this category, we define LEK as the expertise of people who have wisdom, experience, and practices associated with local ecosystems (Olsson & Folke, Citation2001). While anthropologists and ethnobotanists often document LEK as part of their research, which may or may not have been participatory, we included articles only if participants were intentionally documenting their own knowledge in a participatory way to contribute to a scientific project.

Figure 3. Number of articles reporting participants taking part in each of the main scientific research activities of CCS.


We also clustered these more detailed activities to look for patterns in the ways people were involved in the scientific research activities: planning, data collection, data analysis and interpretation, and reporting, and determined the number of articles in each cluster (). This allowed us to examine the potential relationship between the ways that people participated and the EE outcomes we coded for below.

Table 3. Number of articles with participants engaging in clustered scientific research activities of CCS.

Because participation in more or a wider range of activities of scientific research may lead to more or different learning outcomes from CCS, we also examined the number of articles that reported participants engaging in one, two, three, or more of the 11 delineated scientific research activities. The majority of articles reported participants involved in only one of the main activities of scientific research (56), primarily in data collection. However, 14 reported participants involved in two activities, nine studies reported people involved in three scientific research activities (typically adding analysis and/or interpretation to data collection), and 21 reported involvement in four or more scientific research activities. The latter took a range of forms but was very often focused on an environmental justice issue of urgent concern to the local community. For example, Hoover (Citation2016) reported that Mohawk (Akwesasne) community members, specifically a midwife and mothers, designed and implemented all aspects of an environmental health study of the impacts of PCBs in local water in collaboration with SUNY Albany scientists. In a completely different disciplinary context, Allen et al. (Citation2015) reported how local residents on the steep slopes of Bogotá, Colombia, initiated a mapping project with an anthropologist to provide evidence of dwelling practices in an area demarcated for ecological preservation.

RQ 2) What evidence do the studies provide for EE learning outcomes achieved by CCS projects?

In answering our second research question, we found that nearly every study reported positive gains in EE learning outcomes for participants. Because study methods varied widely across qualitative and quantitative approaches, we are not able to quantify the impacts within each outcome, but we can report compelling evidence for five clusters of outcomes: science content knowledge; science inquiry skills and understanding the nature of science; positive attitudes or interest toward science, the local place, and the environment; community connectedness and cooperation; and self-efficacy, identity, agency, and environmental behavior and stewardship. Only a few articles reported no change in the outcomes: understanding of the process and nature of science (two of the 17 that measured it, Table 4) and interest in science or the environment (two of the eight that measured it, Table 4); the rest reported positive changes.

Table 4. Number of articles reporting learning outcomes (all positive outcomes except for *knowledge of nature of science and interest in science or environment).

Science content knowledge

The most common learning outcome documented was science content knowledge gains (all positive for this outcome) (56 articles, Table 4). In all of these articles, the facts and concepts to be learned focused on the topic of the CCS project (insects, birds, biodiversity, water quality, etc.). For example, Hesley et al. (Citation2017) reported increased knowledge of coral reef ecology through participation in a reef monitoring and restoration project; Langsdale et al. (Citation2009) reported that participatory mapping used to inform models of a water resource system helped participants understand causal relationships across that system; and Haywood et al. (Citation2016) reported that participants in a beached seabird monitoring project displayed increased knowledge of bird biology, behavior, and ecology as well as ecosystem components, structure, and processes.

Science inquiry skills and understanding the nature of science

A number of articles reported that participants increased their science inquiry skills (32 articles, Table 4, all positive for this outcome). Importantly, these gains were observed by authors or self-reported by participants through surveys or interviews rather than measured through direct assessments of actual skills. For example, some authors reported that participants gained research skills (Nicosia et al. Citation2014; Wilson et al. Citation2012) and accuracy in data collection tasks (Becker et al. Citation2013), while others were more specific: improved camera deployment skills, map interpretation skills, data analysis skills, and use of monitoring tools. Brannon et al. (Citation2017) observed students learning to successfully distinguish small-mammal skulls; Sabai and Sisitka (Citation2013) found that fishers and mangrove restorers, when involved in developing both scientific and locally derived indicators of mangrove health, gained those skills and shared them with other local actors.

Seventeen articles investigated participants’ understanding of the process and nature of science, 15 of which reported positive outcomes and two no change (Table 4). Mitchell et al. (Citation2017) reported on students contributing to the ClimateWatch citizen science program and writing journal articles on their findings. Through this process, students were able to critique the reliability of data contributed to CCS projects and learned how to ensure they were contributing reliable data. Cronje et al. (Citation2011) found a significant increase in participants’ understanding of scientific methodology, validity, and reasoning in an invasive species program. In online settings, Jennett et al. (Citation2016) found that online CCS participants reported an increase in understanding of the process and nature of science, though the reports were not specific. However, Scheuch et al. (Citation2018) found that teachers’ knowledge of the nature of science did not change during their citizen science-focused professional development program, even though that was an explicit goal. Similarly, Brossard et al. (Citation2005) found no significant difference in adult participants’ understanding of the scientific process in a bird monitoring project, and Jordan et al. (Citation2011) found little change in this outcome in their survey of participants in an invasive species program. The lack of change or null findings for both nature of science and interest is consistent with previous studies of CCS projects, which also found some negative or null results for these constructs (Druschke & Seltzer, Citation2012; Brossard et al. Citation2005).

Positive attitudes or interest toward science, the local place, and the environment

Sixteen articles found positive changes in attitudes toward science and the environment (Table 4). While the differences between attitudes and interest as constructs may be muddy, we included articles in this category only when they specifically framed their investigations as documenting ‘changes in attitudes.’ A common change reported by many of these articles was participants’ increased positive emotions toward the focal species of the project. For example, volunteers working alongside professional scientists to evaluate hedgehog urban habitat use reported feeling ‘closer to hedgehogs’ (Hobbs & White, Citation2015), and Lynch et al. (Citation2018) reported participants gaining positive feelings toward insects through participation in an entomology citizen science project. Land-Zandstra et al. (Citation2016) reported that participants who contributed to citizen science through their smartphones but had limited involvement with science in the rest of their lives felt that ‘science can have a positive impact on our lives.’

Nine articles reported positive changes in what we defined as place values and connection to nature (Table 4) (Gosling & Williams, Citation2010; Kudryavtsev et al. Citation2012). Many were focused on participatory mapping projects; Bergós et al. (Citation2018) reported that local people participating in mapping wildlife in rural Uruguay had more positive relationships with local animals and the environment following their participation. Similarly, Allen et al. (Citation2015) reported that participants learned how to make and use maps to document and plan for dwelling spaces and were able to ‘engage with complex planning decisions and ethical questions concerning the social function of the land, the contention of further sprawl and ecological carrying capacity of the territory’ (269). Other projects reporting changes in place values addressed water quality monitoring and urban garden soil testing; Doyle and Krasny (Citation2003) reported on a project in which local gardeners took part in a participatory rural appraisal of their community garden in New York, and youth who conducted interviews of local residents learned about the relationship of previous land use to the community garden.

Eight articles investigated interest in science and the environment as a learning outcome (Table 4). Though six studies found positive gains in this outcome, two articles reported no change in interest after participation in CCS projects. Everett and Geoghegan’s (Citation2016) study of OPAL (Open Air Laboratories) citizen science biological surveys in the UK found that participants reported renewed interest in amateur naturalist and biological recording activities. However, Nicosia et al. (Citation2014) did not find significant changes in this outcome after students investigated ‘willingness to pay’ for ecosystem services as part of their high school biology course.

Community connectedness and cooperation

Thirty articles reported positive changes in community connectedness and cooperation (Table 4). We identified two sub-themes: social connectedness outcomes and stakeholder cooperation/social learning. Several articles reported impacts on social connectedness. For example, Haworth et al. (Citation2016) reported that participatory mapping activities in a bushfire risk workshop in Australia contributed to community connectedness and knowledge sharing, which is theorized to improve community resilience and disaster preparedness. Several other articles reported impacts on stakeholder cooperation and social learning. For example, Bergós et al. (Citation2018) reported that a participatory wildlife monitoring project in Uruguay involved both youth and adults in collecting and analyzing data related to local wildlife, in addition to sharing local ecological knowledge and discussing the results. The collaborative workshop process increased trust between the involved parties, which included village residents, a school, and a conservation organization. Similarly, Henly-Shepard et al. (Citation2015) reported that participatory modeling activities undertaken by a community disaster planning committee in Hawai’i contributed to social learning. They reported that both individual reflection and group deliberation as part of these activities, in which members of the committee contributed local knowledge, facilitated social learning to manage uncertainty and increase adaptive capacity.

Self-efficacy, identity, agency, and environmental behavior and stewardship

Studies found that participation in community and citizen science programs may contribute to participants’ development of identity and agency with science, their sense of self-efficacy to improve ecosystem health, and their ability to solve problems and take action. Specifically, eleven articles reported positive outcomes for participants’ self-efficacy toward science and the environment (Table 4). Grasser et al. (Citation2016) reported that children participating in an ethnobotany project showed increased confidence in their ability to produce something of value, both in general and specifically for an adult audience, and Jordan et al. (Citation2016) reported that after a collaborative planning and monitoring process, participants in the Virginia Master Naturalists program increased their confidence in their ability and strategies to address a natural resource problem. Four articles reported positive changes in either participants’ identity with science or the environment (two articles) or agency with science or the environment (two articles) (Table 4). For example, Calabrese Barton and Tan (Citation2010) found that young people who led a community-based research project in their after-school program enacted agency with the physics knowledge they had learned by investigating whether and how their neighborhood was an urban heat island and sharing their findings with community members. Merenlender et al. (Citation2016) found that adults who had been trained as naturalists and subsequently volunteered for local citizen science projects increased their identification as people who understand and do science in their daily lives, and are recognized by others that way as well.

Twenty-nine articles provided empirical evidence of positive environmental stewardship and behavior outcomes (Table 4). Because this is a substantial proportion of articles with important EE outcomes from CCS, we identified several sub-themes within this category: sharing information with others, civic action, and stewardship as site management. First, one commonly reported behavior and stewardship outcome of citizen science projects is participants sharing information with people in their networks. For example, Lewandowski and Oberhauser (Citation2017a) reported that 95% of participants in butterfly citizen science projects reported talking to others about butterflies or conservation and recruiting others, among other conservation actions. Specifically, 73% of participants reported talking informally with others about butterflies; 69% of participants who volunteered more of their time had higher odds of reporting that they involved others in monitoring or conservation. Second, many articles described CCS participants leveraging the project as part of civic action to advance political goals. For example, participants used data from FracTracker, a participatory geographic information system focused on unconventional natural gas development, to take action (Malone et al. Citation2012): 57% of individual survey respondents used FracTracker in discussions with regulatory officials, and 67% of nonprofit respondents used it to write to state representatives and regulatory officials. Third, many articles reported that participants subsequently or concurrently engaged in improving local habitats through stewardship and restoration. For instance, participants in East Bay Academy for Young Scientists (EBAYS), a community science program for high school students in California, found a lack of biodiversity and high levels of toxins in a local urban creek, prompting them to take action by removing trash and invasive species (Ballard et al. Citation2017b).

RQ 3) How did participation in specific scientific research activities impact EE learning outcomes?

In this section, we report the number of articles that included participation in each cluster of scientific research activities and the learning outcomes reported for each, to examine how different CCS approaches to participation might relate to particular learning outcomes (Table 5). While we do not attempt to make causal claims from our data, we can say that for each of these clustered activities, the studies that examined each of the learning outcomes reported in Table 5 found positive outcomes.

Table 5. EE learning outcomes by clustered scientific research activities of CCS.

Participation in planning

Of the 18 articles with volunteers participating in planning scientific research, a large portion reported positive gains in science content knowledge (12; Table 5). In the majority of these 18 studies, participants helped shape the initial research questions, mostly by acting collectively. Many were participatory modeling or mapping projects, in which participants helped design the mapping of vernal pools in Maine, USA (McGreavy et al. Citation2016) or of dwelling practices in Bogotá, Colombia (Allen et al. Citation2015), or set the agenda and selected water management models in Vermont, USA (Gaddis et al. Citation2010). These all reported increased participant understanding and awareness of the environmental or social systems being studied and attributed this to initial meetings focused on deeply educating participants about the issue of concern in order to plan appropriate mapping, modeling, and action efforts.

Participation in data collection

Of the 97 articles with volunteers participating in data collection, a large portion reported positive gains for environmental science content knowledge (55), science inquiry skills (30), community-level outcomes (2), and environmental behavior and stewardship outcomes (28) (Table 5). Many of those that measured gains in content knowledge were projects focused on particular taxa or habitats that provided explicit support to collect high-quality data, covering taxonomic or ecological disciplinary content as well as the scientific skills required. For example, volunteers who collected samples of Staghorn corals to propagate and monitor coral reefs in the Caribbean gained knowledge of coral reef ecology (Hesley et al. Citation2017), and participants who collected forensic data on seabirds washed up on beaches in the Pacific Northwest gained knowledge of bird biology and ecology (Haywood et al. Citation2016).

Participation in data analysis and interpretation

Of the 33 articles with participants involved in data analysis and interpretation, a majority reported positive gains in science content knowledge (20), but also gains in science inquiry skills (16) (Table 5). Several of these studies were of CCS projects in online settings, where participation focused on classification tasks for data analysis, described as ‘volunteer thinking’ by Jennett et al. (Citation2016), as opposed to more passive online tasks. As an example of how participation in analysis is linked to knowledge and skills gains, in their review of several online CCS projects, Jennett et al. (Citation2016) found that along with content knowledge, participants gained ‘pattern recognition skills’ and an enhanced understanding of how science works, including the use of rigorous protocols, the importance of learning from failures, and continued exploration. At the other end of the spectrum, in an intensely field-based setting, youth conducting a Participatory Rural Appraisal (PRA) of their own community garden analyzed their interview data and gained an understanding of their neighborhood’s environmental and social history and soil chemistry, developed interviewing and mapping skills, and reported a stronger connection to place.

Participation in reporting

Of the 25 articles reporting participants involved in scientific reporting, 15 reported positive gains in science content knowledge, and a bit less than half reported evidence of gains in science inquiry skills and in environmental stewardship and behavior (11 each) (Table 5). For example, Whitman et al. (Citation2015) reported on a participatory action research project on farm slurry pollution in the UK that included local stakeholders working with a River Trust not only in vegetation mapping, but also in developing and disseminating a toolkit to other communities and presenting at conferences. Zeegers et al. (Citation2012) involved teacher leaders in Australia not only in conducting bird studies with their students, but also in writing up and disseminating their results via professional development workshops for other teachers. Taylor and Hall (Citation2013) reported on young people mapping their urban neighborhood by riding bikes and collecting GPS data, then sharing their findings with the neighborhood and using their data to critique their built environment. These findings might indicate that involving people in the reporting of scientific findings can have substantially positive impacts on these outcomes, particularly related to conservation and stewardship actions, which we discuss further below.

Participation in multiple scientific research activities

We also examined whether participation in a breadth of different scientific research activities, rather than in only one kind, relates to EE learning outcomes. For the 61 articles reporting participation in just one type of scientific research activity (typically data collection), we found evidence of positive gains across all the EE learning outcomes. Further, 15 articles reported people participating in two of our clustered types of participation (typically data collection plus data analysis or planning), 14 articles included participants in three main areas, and ten articles reported people participating in all four main areas of scientific research as we defined them.

We examined these latter studies in greater detail to identify examples of ways that intensive participation in nearly all aspects of scientific research resulted in environmental learning outcomes. Half of the ten studies with participation across all aspects of scientific research found positive impacts on environmental behavior and stewardship outcomes, many using strategies that engaged people in collective planning and dissemination activities as well as individual activities around data collection. These provided multiple opportunities for sense-making with the data participants were collecting and made visible the ways their own science production was being used to answer collective questions or solve collective problems. The latter took the form of monitoring the extraction and health of forest resources in Yunnan, China (Van Rijsoort & Jinfeng, Citation2005) or studying the siting of a waste facility in Los Angeles, California, USA (Dhillon, Citation2017). These projects resulted in both collective and individual behavior changes: participants in the former changed their own harvest practices and created village-level policies (Van Rijsoort & Jinfeng, Citation2005), while participants in the latter changed their own behavior related to trash and advocated collectively as an environmental justice group (Dhillon, Citation2017). While only a small subset of the 100 articles in the review, these projects with participation across the range of research activities offer strategies of participation that resulted in important EE learning outcomes.

RQ 4) What research methods and approaches have been used to study the EE learning outcomes of CCS, and what are the implications for future research?

Areas of publication over time

The 100 articles in our sample were published in 65 different journals across diverse fields (see full list in Appendix). The most popular journal outlets were Ecology and Society (7 articles), Conservation Biology (6), Biological Conservation (5), and Journal of Science Communication (5). The remaining 77 articles were distributed across 61 journals in many fields, speaking to the multidisciplinarity of research on CCS and EE. We also found an increasing number of empirical journal articles exploring CCS with EE outcomes over time (Figure 4). The most articles were published in 2016, which can be explained in part by several of these journals publishing special issues focused on citizen science that year (e.g., Journal of Science Communication, Conservation Biology, Biological Conservation). The trend indicates a rapid increase in research publications in the area of CCS for EE.

Figure 4. Number of empirical articles on CCS for EE published 2003–2018.

Research designs

Ten different research designs were used across the studies we reviewed, with over a quarter using a case study approach (27) and over a quarter using a quasi-experimental approach with pre/post measures or a multiple-group design (26) (Table 6). Twenty-one articles used participatory approaches, which involve participants in the research activities themselves, with our search terms likely contributing to this large representation. Nearly all of the studies, aside from the exceptions noted above, reported uniformly positive findings for the outcomes they measured. This pattern may reflect the tendency for journals to publish only positive findings, a failing of academic publishing culture broadly as well as in the field of environmental education. In this case, the outcomes reported may also particularly reflect what the researchers chose to study (see Discussion below), and null or non-significant findings often go unreported outside of randomized control trials (Herrington & Maynard, Citation2019), which none of our included studies utilized.

Table 6. Research designs used in studies of CCS for EE.

Research methods

The most common data collection method was structured survey instruments (n = 54), followed by interviews (n = 45) (Table 7). Surveys included open- and closed-ended questions, validated scales, self-reported assessments, and traditional knowledge assessments. Fifty-seven studies used more than one method to collect data, typically combining surveys or interviews with other methods such as observations (n = 25) or focus groups (n = 21). Only six studies reported using quality control checks. Because environmental and science content knowledge was the most often studied and reported outcome, we examined the methods used to study it in more detail to assess the robustness of the findings. Among the studies that reported science content knowledge outcomes (n = 56), all reporting positive gains, 25 used surveys to measure this outcome, 16 of which were self-reported and nine of which assessed knowledge pre/post. Sixteen studies used interviews to measure science content knowledge gains, 12 of which relied on self-reports and four of which assessed with pre/post tests. The three studies that used focus groups to measure this outcome all relied on self-reports, as did one of the three studies using community meetings/listening sessions. Five studies used observations by researchers to determine knowledge gains. Below, we discuss the implications of, and concerns surrounding, the predominant use of self-reporting to measure this outcome.

Table 7. Data collection methods used (n = 100).

Discussion

Characteristics of CCS projects with EE learning outcomes

The 54 articles reporting positive EE learning outcomes for adult participants in CCS provide compelling evidence that CCS can support lifelong environmental learning beyond the K-12 years (Roth & Lee, Citation2004) and the positive impacts on public engagement in science suggested in more conceptual articles by Bonney et al. (Citation2016) and others. In addition, the 29 articles that found positive learning gains for young people ages 4–18 reinforce the increasing interest in K-12 school settings in CCS as an effective approach for engaging students in science and science reasoning practices in and out of school (Shah & Martinez, Citation2016). While we found a relatively small proportion of empirical studies of online-only CCS projects, this is a rapidly expanding area of CCS and has likely grown since our 2018 search cut-off, including research on youth participation in online citizen science platforms like Zooniverse (https://www.zooniverse.org/), iSpot (https://www.ispotnature.org/), and iNaturalist (https://www.inaturalist.org/) (Aristeidou & Herodotou, Citation2020; Aristeidou et al. Citation2021).

While we saw a broad range of EE-relevant issues addressed by CCS approaches, a large portion of the articles focused on environmental social science issues. While this primarily reflects what researchers chose to study, it may also reflect the growing interest in ‘citizen social science’ within the CCS field (Purdam, Citation2014; Kythreotis et al. Citation2019), particularly in Europe (Heiss & Matthes, Citation2017). This new area faces tensions around the differences between participatory approaches and standard social science survey and interview methods, but it is clearly a burgeoning area for the field (Albert et al. Citation2021).

Compelling evidence that participation in CCS results in EE learning outcomes: trends and implications for the field

Our systematic review of empirical research in CCS reveals substantial evidence that participation in environmental science can lead to important EE learning outcomes and offers a comprehensive picture of the EE learning outcomes that result across the range of CCS approaches to EE. These findings complement and support several conceptual and conjectural articles about how participation in environmental science might lead to important learning outcomes. To begin, the large number of articles that studied and found positive gains in science and environmental content knowledge (56 articles) is consistent with Phillips et al. (Citation2018), who found this was one of the learning outcomes most often reported in their review of citizen science project websites. CCS program designers can be reasonably confident that participants trained to identify and collect data (the most predominant type of participation) will likely gain knowledge of the target environmental science content of their project. However, it is concerning that even for this outcome, which can be assessed with a pre/post survey more easily than other, more nuanced constructs, researchers still relied heavily on self-reports rather than assessments. If CCS is going to be more widely adopted as an approach to EE that addresses K-12 education standards such as the Disciplinary Core Ideas in the Next Generation Science Standards (NGSS Lead States, Citation2013) in the U.S., researchers should rely less on self-reporting and measure observable gains in knowledge wherever possible.

The large number of studies (30) that documented positive community-level outcomes from CCS projects is notable, considering that other recent reviews found far fewer studies of civic engagement outcomes of EE programs (Ardoin et al. Citation2022) and that these outcomes can be difficult to measure (Thomas et al. Citation2018). Environmental education researchers and practitioners are increasingly focused on community (Aguilar, Citation2018) and the ways collective action is intertwined with EE (Ardoin et al. Citation2022). The large number of studies that chose to focus on community connectedness and cooperation, often in the context of environmental health research, may reflect the growing focus on community science and other community-driven, rather than scientist-driven, forms of participatory environmental science (Cooper et al. Citation2021; Reid et al. Citation2021). These fields may not identify their work as EE, but they are nevertheless impacting and measuring EE outcomes, as evidenced in our review.

Promoting pro-environmental behavior change is a crucial goal of environmental education, yet achieving it remains a complex task for educators (Heimlich & Ardoin, Citation2008); we found evidence that CCS approaches may be particularly effective at achieving these outcomes. While Phillips et al. (Citation2018) found that increased environmental stewardship was the third most common outcome self-reported by citizen science projects on their websites, little research evidence was available to support those self-reports. Outcomes around participant behaviors like ‘sharing project findings with others’ indicate that CCS may increase the conservation involvement of volunteers by offering a sense of social connectedness (Lewandowski & Oberhauser, Citation2017b). Many of the CCS projects in which participants took action to improve habitat and other site management intentionally integrated stewardship activities that built on the data collection and citizen science work (for example, Ballard et al. Citation2017b). This integration of participation in both scientific research and stewardship activities may be particularly powerful and is consistent with recommendations for including direct environmental action, individual and collective, in EE programs (Ardoin et al. Citation2022; Dubois et al. Citation2018).

While distinct and separate outcomes, the development of self-efficacy, identity, and agency with science and the environment are all important aspects of learning through environmental education that CCS is particularly well-positioned to address (Phillips et al. Citation2018). We found that most articles reporting these outcomes involved participation in more than three of the scientific research activities beyond data collection. Therefore, we suggest that CCS project designers consider that outcomes like identity development require practicing one’s identities over time (Nasir & Hand, Citation2008) and do not necessarily result from all CCS projects. While only 11 studies focused on these constructs, research on these outcomes of CCS approaches has increased since 2018 and continues to provide evidence of positive and nuanced impacts (e.g., He et al. Citation2019; Williams et al. Citation2021).

Impacts of participation in scientific research activities on EE learning outcomes: recommendations for design

As a distinguishing feature of CCS for EE, we wanted to discover precisely which scientific research activities, including but not limited to data collection, produced EE learning outcomes. First, for those articles in which people participated solely in data collection, we found a wide range of, and ample evidence for, EE learning outcomes. This suggests that while more participation might be better, even participating only in data collection can still have positive EE outcomes. Further, studies are emerging that directly compare the learning outcomes of data collection-only CCS projects with those in which participants help or lead throughout the scientific process, finding important outcomes for both (Williams et al. Citation2021).

Second, we found that participation beyond data collection was more prevalent than might have been expected, and we found extensive evidence of positive EE outcomes from participation in these other aspects of scientific research, i.e., planning (18 articles), data analysis (33 articles), and reporting (25 articles). Further, by closely examining the ten studies that involved people in all four main clusters of scientific research activities, we found that they reflected the more co-created (Shirk et al. 2012), community science (Dosemagen & Parker 2019), and community-based participatory research (Israel et al. 2013) approaches that may be particularly impactful for participant learning. We also found that involving people in the planning stages of CCS projects can have substantially positive impacts on a number of important science and environmental learning outcomes, particularly community-level outcomes like social connectedness and stakeholder cooperation, as well as environmental behaviors and stewardship. This suggests that CCS projects designed for EE should involve participants in the planning stages, with a focus on interaction and deliberative dialogue (Muro & Jeffrey, 2008) to ensure ownership of the project and community-driven research questions; such projects can be important catalysts for social learning in collaborative resource management and conservation settings (Conrad & Hilchey, 2011; Jadallah & Ballard, 2021).

We suggest that EE practitioners designing CCS projects intentionally work to involve participants in the planning, analysis, and/or reporting stages in addition to data collection, consistent with Phillips et al. (2019) and the numerous examples provided above. In particular:

  1. Involving participants in research planning for a CCS project, such as community meetings or providing feedback on methods, can result in positive EE outcomes, especially in community connectedness and cooperation, but also in science content knowledge and in environmental behaviors and stewardship;

  2. Involving participants in data analysis and interpretation for a CCS project, such as group discussions around data visualizations, can result in positive EE outcomes in place values;

  3. Involving participants in the reporting for a CCS project, such as sharing presentations, newsletters, blogs, or posters locally, can result in positive EE outcomes in science inquiry skills, environmental behaviors and stewardship, and place values.

Critiquing the research on CCS for EE and recommendations for future research

Our review brought to light that some of the empirical studies on EE learning outcomes of CCS were methodologically less rigorous. As empirical research on CCS as an approach to EE continues to expand, we identify gaps and blind spots in research methods for future research to address. We found a mix of research designs and data collection methods across the reviewed articles, which can contribute to a robust understanding of the field, with quantitative and qualitative methods contributing different types of evidence about a construct. However, case studies were the most frequently used design (27%), and while case studies allow for deep understanding of a phenomenon, they have severe limitations when it comes to generalizability (although we note they often do not carry the goal of generalizability). Studies that use other research designs, such as quasi-experimental research, can help establish causal relationships with more confidence. Also, very few of the articles used validated measures for their instruments, even though validated survey scales for many common learning constructs have recently been developed and are beginning to be implemented in the CCS field (Phillips et al. 2018).

Another blind spot lies in the heavy reliance on participants’ self-reports of learning and skill gains, which is also an important limitation in drawing conclusions about impacts on learning outcomes. Self-assessments of knowledge only moderately correlate with learning in adults and have been found to be inaccurate over half the time in studies of undergraduate students (Sitzmann et al. 2010). To reduce reliance on self-reporting, we suggest increasing the use of embedded assessments (Becker-Klein et al. 2016), observation protocols, and quality control checks to obtain reliable measures of knowledge, skills, and science process knowledge. This would reduce the burden on participants of time-consuming or ‘test-like’ surveys or interviews and increase our confidence in interpreting findings in these outcome areas. Self-reporting is not a concern for all outcomes, however. For outcomes such as self-efficacy, identity, agency, interest, and attitudes, interviews and other qualitative methods focused on self-narrative are valuable, even essential, for reflecting participants’ own meaning in their own words, and can supplement quantitative measures of these outcomes. We also saw examples of mixed methods yielding nuanced results, as in Lynch et al.’s (2017) findings on nature relatedness using both quantitative and qualitative methods in an entomology citizen science project. Finally, we believe that emergent methods not used in any of the studies we reviewed, such as arts-based methods or walking interviews, could assess learning outcomes in novel ways (Tuck & McKenzie, 2014).

Further, the numerous studies we found reporting environmental behavior and stewardship outcomes varied in their use of measures of intention to behave, self-reports of behaviors already performed, and researcher observations of actual behaviors. Actual pro-environmental behaviors are very difficult to measure (Ardoin et al. 2018; Heimlich & Ardoin, 2008), and Hughes et al. (2013), studying the impact of wildlife viewing on behaviors, found that positive intentions are not good indicators of long-term behavior change. These limitations do not discount the mounting evidence that CCS can successfully achieve common EE outcomes, as we saw in our review, but to be more confident in our assertions as a field, we must employ more rigorous research designs and data collection methods.

Limitations and continuing and future trends

We acknowledge several limitations to our analysis that constrain our ability to make causal claims. First, the predominance of articles reporting a particular learning outcome might only reflect the relative ease or popularity of studying that outcome (e.g., studying science content knowledge is very common), possibly reflecting the ‘streetlight effect’ whereby we see more evidence of outcomes that researchers find easier to measure (Ardoin et al. 2018; Freedman, 2010). We can only report on the studies that were conducted and, therefore, on what the researchers chose to study. None of these studies attempted to examine all of the learning outcomes we delineated, so comparing the outcomes to each other directly is not possible. Second, many disciplines, including education, have a bias toward publishing positive findings and a dearth of published studies reporting null or negative findings (Fanelli, 2012). Therefore, the absence of evidence does not reflect evidence of absence; that is, a CCS project may have produced outcomes beyond what the researchers chose to measure and report in these articles. That said, these articles reported almost entirely positive learning outcomes from their studies of CCS projects, regardless of whether the projects were originally designed as educational programs, which leads us to conclude that CCS participation has indeed led to a myriad of positive EE learning outcomes. Finally, we do not claim that these 100 articles are proportionally representative of CCS projects in general, as many projects and programs remain under-evaluated in the field (Jordan et al. 2012; Phillips et al. 2018).

We also acknowledge that our review of the literature ended in 2018, but we offer our findings as a crucial baseline on which other systematic reviews may build. Furthermore, the literature published after this systematic review (2019–2023) suggests that the same types of EE outcomes continue to be achieved by CCS programs for adults (He et al. 2019) and young people. In particular, research on learning outcomes for young people in and outside of schools has surged (Kali et al. 2023), finding many of the same strong positive outcomes, particularly in understanding environmental science content (Yan et al. 2023), positive attitudes about the environment (Aivelo, 2023), and self-efficacy and interest in environmental science (Clement et al. 2023). CCS approaches in K-12 education settings are increasingly implemented globally (Atias et al. 2023), not only to teach STEM content but also to engage young people in environmental problem-solving and action. We also found outcomes for teachers and educators that resulted from incorporating CCS into their classroom teaching, including changes in teacher practices through sharing more authority with students and enhanced pedagogical content knowledge and approaches to STEM teaching. A recent special issue of Instructional Science on citizen science in schools for learning in a networked society (Kali et al. 2023) provides continuing evidence of the trends in positive science learning outcomes for K-12 students that we found in our review.

In addition, several potential trends warrant future investigation, including an increasing focus on the ways that CCS may or may not address equity and diversity issues in science and environmental education (Parrish et al. 2019; Pateman and West, 2023) and empower participants to address impacts of climate change (Day et al. 2022). Research on learning outcomes of online-focused citizen science participation has surged recently (Aristeidou & Herodotou, 2020), and will become more complex as artificial intelligence becomes further embedded in crowdsourcing platforms such as Zooniverse and iNaturalist. Another growing area of research examines evidence that CCS programs impact broader socio-ecological systems through the combined positive outcomes for participants and more robust science (Jadallah & Wise, 2023; Jørgensen & Jørgensen, 2021; Receveur et al. 2022). Finally, several studies examined changes in CCS participation during the COVID-19 pandemic lockdowns (Coldren, 2022; Drill et al. 2022); this line of work may continue as broader education research examines the impacts of the pandemic shutdowns on learning for K-12 students, as well as uses of CCS to study aspects of the pandemic itself (Sadiković et al. 2020).

A final note: challenges in defining EE and CCS

Our review raises the perennial question of what actually constitutes ‘environmental education’ and ‘community and citizen science’, and whether we can or should have strict definitions for these areas (Ardoin et al. 2018; Eitzel et al. 2017). In particular, we debated whether to require that authors self-identify using key terms or to rely on our own assessment of the project activities, topics of study, and/or social science research. We decided to use both, but acknowledge that this raises questions. In CCS, what actually constitutes participation in science? Many projects claim to be ‘participatory’, but how this occurs is not always evident and is a sticking point in defining the field of citizen science and all its permutations (Eitzel et al. 2017; Shirk et al. 2012). In our review, when we could not tell, we honored the authors’ use of the term. Given that the term ‘participatory’ has itself been widely debated yet adopted throughout the education, sociology, and conservation literature for many decades (e.g., Arnstein, 1969), we suggest that any claims about participation, co-creation, and power-sharing in science should be transparent and critically interrogated. For EE, the field has evolved, and terminology can sometimes be limiting (Ardoin et al. 2018). Our inclusion criteria meant we included many articles focused on environmental health, environmental justice, and natural resource management that DO meet those criteria, even though their authors and/or participants may never have considered the work ‘environmental education.’ This speaks to Clark and colleagues’ (2020) Delphi study with EE experts to determine how those and other topics fit under the larger umbrella of EE.
By delving into and including articles that do not self-identify as CCS or EE, but clearly involve the activities, topics, and outcomes of CCS and EE as we have defined them, we were able to find evidence of participatory approaches to environmental science that result in some of the more intractable outcomes we care about in EE, like community connectedness and cooperation. We suggest that widely agreed-upon and static definitions for either field are neither likely nor, perhaps, desirable. Similarly, as environmental education becomes increasingly, and appropriately, intertwined with social justice, diversity, and equity-focused education and advocacy, the definition of what constitutes EE should also expand, and the premise of CCS to broaden participation in science can contribute to these goals. The expanding implementation of CCS projects aimed at the UN Sustainable Development Goals provides a rich opportunity for environmental educators and researchers to help shape these ongoing debates.

Conclusions

In summary, we found consistent and diverse evidence that community and citizen science, as an environmental education approach, results in positive science and environmental learning outcomes. This empirical research on CCS for EE highlights the ways citizen science and community science contexts can provide diverse learning settings to support EE outcomes, extending far beyond K-12 classrooms to community-based organizations; local, state, and federal government organizations; conservation organizations; and more typical EE settings. Importantly, we saw a large number of studies reporting community connectedness and environmental behavior and stewardship outcomes in settings and projects outside what may typically be considered EE, such as environmental health and community-based participatory research projects, suggesting these contexts and approaches to CCS could be important mechanisms for achieving EE outcomes in ways their designers may not have realized or intended. In that context, CCS projects can provide opportunities for people to participate in and contribute to the life of their community, fostering scientific literacy that is inclusive of, and extends beyond, K-12 settings.

Overall, we argue that participants gained positive EE learning outcomes from engaging not only in data collection, but also in the planning, analysis and interpretation, and reporting activities of scientific research. Involving participants in any of these activities is not about cursory outreach, but about engaging participants in the inquiry and scientific reasoning practices akin to those in the Next Generation Science Standards (see Note 1) from the United States (i.e., asking researchable questions, constructing explanations, model-based reasoning, developing arguments from evidence, analyzing and communicating with data, etc.). This can and does happen in K-12 classrooms, but also in broad-scale environmental monitoring projects and intensive community-based projects. It should not overshadow, however, the clear evidence we found for positive EE outcomes in more contributory CCS projects that involve participants primarily in data collection. What these CCS projects have in common is the relevance of the topics that participants are studying and the ways in which the project makes clear, through feedback, how participants’ contributions to the scientific research activities make a difference for the issue they care about.

In an era when distrust of science is high and climate change impacts are emerging daily around the globe, we need environmental education programs that tackle both of these societal problems head-on. Our findings offer research-based evidence that involving people in science to address environmental problems, through thoughtfully designed participatory approaches, is an effective way to impact learning and stewardship. Clearly, our review only scratches the surface in examining that evidence. We need more rigorous empirical research with carefully designed methods across projects to better understand how, and the extent to which, different participatory approaches may result in environmental education learning outcomes. Our review shows that research in this area is burgeoning, and we look forward to learning from what is to come.

Acknowledgements

We thank several people who have provided insightful feedback and suggestions at various stages of this study: Martha Monroe, Joanna Nelson, Drew Burnett, Judy Braus, Nicole Ardoin, Ryan Meyer, Erin Bird, and the anonymous reviewers who truly helped improve the manuscript.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This project was funded by the North American Association for Environmental Education.

Notes

1 The Next Generation Science Standards emerged from the context of the United States National Research Council’s Framework for K-12 Science Education, and focus largely on promoting science learning through processes of inquiry (NGSS Lead States, 2013).

References

  • *Adams, C., P. Brown, R. Morello-Frosch, J. G. Brody, R. Rudel, A. Zota, S. Dunagan, J. Tovar, and S. Patton. 2011. “Disentangling the Exposure Experience: The Roles of Community Context and Report-Back of Environmental Exposure Data.” Journal of Health and Social Behavior 52 (2): 180–196. https://doi.org/10.1177/0022146510395593.
  • Aguilar, O. 2018. “Toward a Theoretical Framework for Community EE.” The Journal of Environmental Education 49 (3): 207–227. https://doi.org/10.1080/00958964.2017.1397593.
  • *Akom, A., A. Shah, A. Nakai, and T. Cruz. 2016. “Youth Participatory Action Research 2.0: How Technological Innovation and Digital Organizing Sparked a Food Revolution in East Oakland.” International Journal of Qualitative Studies in Education: QSE 29 (10): 1287–1307. https://doi.org/10.1080/09518398.2016.1201609.
  • *Alender, B. 2016. “Understanding Volunteer Motivations to Participate in Citizen Science Projects: A Deeper Look at Water Quality Monitoring.” Journal of Science Communication 15 (03): A04. https://doi.org/10.22323/2.15030204.
  • Albert, A., B. Balázs, E. Butkevičienė, K. Mayer, J. Perelló, et al. 2021. “Citizen Social Science: New and Established Approaches to Participation in Social Research.” In The Science of Citizen Science, edited by K. Vohland et al., 119–138. Cham: Springer Nature Switzerland AG. https://doi.org/10.1007/978-3-030-58278-4.
  • *Allen, A., R. Lambert, A. Apsan Frediani, and T. Ome. 2015. “Can Participatory Mapping Activate Spatial and Political Practices? Mapping Popular Resistance and Dwelling Practices in Bogotá Eastern Hills.” Area 47 (3): 261–271. https://doi.org/10.1111/area.12187.
  • *Andow, D., E. Borgida, T. Hurley, and A. Williams. 2016. “Recruitment and Retention of Volunteers in a Citizen Science Network to Detect Invasive Species on Private Lands.” Environmental Management 58 (4): 606–618. https://doi.org/10.1007/s00267-016-0746-7.
  • *Apgar, J. M., P. J. Cohen, B. D. Ratner, S. de Silva, M.-C. Buisson, C. Longley, R. C. Bastakoti, and E. Mapedza. 2017. “Identifying Opportunities to Improve Governance of Aquatic Agricultural Systems through Participatory Action Research.” Ecology & Society 22 (1): 626–638.
  • Arnstein, S. R. 1969. “A Ladder of Citizen Participation.” Journal of the American Institute of Planners 35 (4): 216–224. https://doi.org/10.1080/01944366908977225.
  • Ardoin, N. M., A. W. Bowers, and E. Gaillard. 2020. “Environmental Education Outcomes for Conservation: A Systematic Review.” Biological Conservation 241: 108224. https://doi.org/10.1016/j.biocon.2019.108224.
  • Ardoin, N. M., A. W. Bowers, N. W. Roth, and N. Holthuis. 2018. “Environmental Education and K-12 Student Outcomes: A Review and Analysis of Research.” The Journal of Environmental Education 49 (1): 1–17. https://doi.org/10.1080/00958964.2017.1366155.
  • Ardoin, N. M., A. W. Bowers, and M. Wheaton. 2022. “Leveraging Collective Action and Environmental Literacy to Address Complex Sustainability Challenges.” Ambio 52 (1): 30–44. https://doi.org/10.1007/s13280-022-01764-6.
  • Ardoin, N. M., C. Clark, and E. Kelsey. 2013. “An Exploration of Future Trends in Environmental Education Research.” Environmental Education Research 19 (4): 499–520. https://doi.org/10.1080/13504622.2012.709823.
  • Aristeidou, M., and C. Herodotou. 2020. “Online Citizen Science: A Systematic Review of Effects on Learning and Scientific Literacy.” Citizen Science: Theory and Practice 5 (1): 1–12. https://doi.org/10.5334/cstp.224.
  • Aristeidou, M., C. Herodotou, H. L. Ballard, A. N. Young, A. E. Miller, L. Higgins, and R. F. Johnson. 2021. “Exploring the Participation of Young Citizen Scientists in Scientific Research: The Case of iNaturalist.” PloS One 16 (1): e0245682. https://doi.org/10.1371/journal.pone.0245682.
  • *Atchison, J., L. Gibbs, and E. Taylor. 2017. “Killing Carp (Cyprinus Carpio) as a Volunteer Practice: Implications for Community Involvement in Invasive Species Management and Policy.” Australian Geographer 48 (3): 333–348. https://doi.org/10.1080/00049182.2016.1265229.
  • Atias, O., A. Baram-Tsabari, Y. Kali, and A. Shavit. 2023. “In Pursuit of Mutual Benefits in School-Based Citizen Science: Who Wins What in a Win-Win Situation?” Instructional Science 51 (5): 695–728. https://doi.org/10.1007/s11251-022-09608-2.
  • Ault, C. R. 2023. “Trusting the Social Value of Diverse Scientific Enterprises: Matching a Patchwork of Challenges to a Mosaic of Responses.” Science Education 107 (1): 28–41. https://doi.org/10.1002/sce.21761.
  • Aivelo, T. 2023. “School Students’ Attitudes towards Unloved Biodiversity: Insights from a Citizen Science Project about Urban Rats.” Environmental Education Research 29 (1): 81–98. https://doi.org/10.1080/13504622.2022.2140125.
  • *Avriel-Avni, N. 2017. “Education for Sustainability: Teachers Conceptualize Their New Role by Participatory Action Research.” ALAR Journal 23 (1): 7–33.
  • Babineau, J. 2014. “Product Review: Covidence (Systematic Review Software).” Journal of the Canadian Health Libraries Association / Journal de l’Association des bibliothèques de la santé du Canada 35 (2): 68–71. https://doi.org/10.5596/c14-016.
  • Ballard, H. L., L. D. Robinson, A. N. Young, G. B. Pauly, L. M. Higgins, R. F. Johnson, and J. C. Tweddle. 2017. “Contributions to Conservation Outcomes of Natural History Museum-Led Citizen Science: Examining Evidence and Next Steps.” Biological Conservation 208: 87–97. https://doi.org/10.1016/j.biocon.2016.08.040.
  • Ballard, H. L., C. G. Dixon, and E. M. Harris. 2017. “Youth-Focused Citizen Science: Examining the Role of Environmental Science Learning and Agency for Conservation.” Biological Conservation 208: 65–75. https://doi.org/10.1016/j.biocon.2016.05.024.
  • Bandura, A. 1997. Self-Efficacy: The Exercise of Control. New York, NY: W. H. Freeman.
  • *Becker, M., S. Caminiti, D. Fiorella, L. Francis, P. Gravino, M. (Muki) Haklay, A. Hotho, et al. 2013. “Awareness and Learning in Participatory Noise Sensing.” PloS One 8 (12): e81638. https://doi.org/10.1371/journal.pone.0081638.
  • Becker-Klein, R., K. Peterman, and C. Stylinski. 2016. “Embedded Assessment as an Essential Method for Understanding Public Engagement in Citizen Science.” Citizen Science: Theory and Practice 1 (1): 8. https://doi.org/10.5334/cstp.15.
  • Bela, G., T. Peltola, J. C. Young, B. Balázs, I. Arpin, G. Pataki, J. Hauck, et al. 2016. “Learning and the Transformative Potential of Citizen Science.” Conservation Biology: The Journal of the Society for Conservation Biology 30 (5): 990–999. https://doi.org/10.1111/cobi.12762.
  • *Bergós, L., F. Grattarola, J. M. Barreneche, D. Hernández, and S. González. 2018. “Fogones de Fauna: An Experience of Participatory Monitoring of Wildlife in Rural Uruguay.” Society & Animals 26 (2): 171–185. https://doi.org/10.1163/15685306-12341497.
  • *Boissière, M., F. Bastide, I. Basuki, J. Pfund, and A. Boucard. 2014. “Can we Make Participatory NTFP Monitoring Work? Lessons Learnt from the Development of a Multi-Stakeholder System in Northern Laos.” Biodiversity and Conservation 23 (1): 149–170. https://doi.org/10.1007/s10531-013-0589-y.
  • Bonney, R., H. Ballard, R. Jordan, E. McCallie, T. Phillips, J. Shirk, and C. C. Wilderman. 2009. Public Participation in Scientific Research: Defining the Field and Assessing its Potential for Informal Science Education. Washington, D.C.: Center for the Advancement of Informal Science Education (CAISE).
  • Bonney, R., T. B. Phillips, H. L. Ballard, and J. W. Enck. 2016. “Can Citizen Science Enhance Public Understanding of Science?” Public Understanding of Science (Bristol, England) 25 (1): 2–16. https://doi.org/10.1177/0963662515607406.
  • *Branchini, S., M. Meschini, C. Covi, C. Piccinetti, F. Zaccanti, and S. Goffredo. 2015. “Participating in a Citizen Science Monitoring Program: Implications for Environmental Education.” PloS One 10 (7): e0131812. https://doi.org/10.1371/journal.pone.0131812.
  • *Brannon, M. P., J. K. H. Brannon, and R. E. Baird. 2017. “Educational Applications of Small-Mammal Skeletal Remains Found in Discarded Bottles.” Southeastern Naturalist 16 (sp10): 4–10. https://doi.org/10.1656/058.016.0sp1005.
  • *Britton, S. A., and D. J. Tippins. 2015. “Practice or Theory: Situating Science Teacher Preparation within a Context of Ecojustice Philosophy.” Research in Science Education 45 (3): 425–443. https://doi.org/10.1007/s11165-014-9430-1.
  • *Brossard, D., B. Lewenstein, and R. Bonney. 2005. “Scientific Knowledge and Attitude Change: The Impact of a Citizen Science Project.” International Journal of Science Education 27 (9): 1099–1121. https://doi.org/10.1080/09500690500069483.
  • Calabrese Barton, A. M. 2012. “Citizen(s’) Science: A Response to ‘The Future of Citizen Science.’” Democracy and Education 20 (2): 12.
  • *Calabrese Barton, A., and E. Tan. 2010. “We Be Burnin’! Agency, Identity, and Science Learning.” Journal of the Learning Sciences 19 (2): 187–229. https://doi.org/10.1080/10508400903530044.
  • *Campos, I. S., F. M. Alves, J. Dinis, M. Truninger, A. Vizinho, and G. Penha-Lopes. 2017. “Climate Adaptation, Transitions, and Socially Innovative Action-Research Approaches.” Ecology & Society 22 (2): 26–36.
  • *Canevari-Luzardo, L., J. Bastide, I. Choutet, and D. Liverman. 2017. “Using Partial Participatory GIS in Vulnerability and Disaster Risk Reduction in Grenada.” Climate and Development 9 (2): 95–109. https://doi.org/10.1080/17565529.2015.1067593.
  • *Carton, L., and P. Ache. 2017. “Citizen-Sensor-Networks to Confront Government Decision-Makers: Two Lessons from The Netherlands.” Journal of Environmental Management 196: 234–251. https://doi.org/10.1016/j.jenvman.2017.02.044.
  • *Cherry, E. 2018. “Birding, Citizen Science, and Wildlife Conservation in Sociological Perspective.” Society & Animals 26 (2): 130–147. https://doi.org/10.1163/15685306-12341500.
  • Clark, C. R., J. E. Heimlich, N. M. Ardoin, and J. Braus. 2020. “Using a Delphi Study to Clarify the Landscape and Core Outcomes in Environmental Education.” Environmental Education Research 26 (3): 381–399. https://doi.org/10.1080/13504622.2020.1727859.
  • Clement, S., K. Spellman, L. Oxtoby, K. Kealy, K. Bodony, E. Sparrow, and C. Arp. 2023. “Redistributing Power in Community and Citizen Science: Effects on Youth Science Self-Efficacy and Interest.” Sustainability 15 (11): 8876. https://doi.org/10.3390/su15118876.
  • Coldren, C. 2022. “Citizen Science and the Pandemic: A Case Study of the Christmas Bird Count.” Citizen Science: Theory & Practice 7 (1): 1–9.
  • Conrad, C. C., and K. G. Hilchey. 2011. “A Review of Citizen Science and Community-Based Environmental Monitoring: Issues and Opportunities.” Environmental Monitoring and Assessment 176 (1-4): 273–291. https://doi.org/10.1007/s10661-010-1582-5.
  • Cooper, C. B., C. L. Hawn, L. R. Larson, J. K. Parrish, G. Bowser, D. Cavalier, R. R. Dunn, et al. 2021. “Inclusion in Citizen Science: The Conundrum of Rebranding.” Science 372 (6549): 1386–1388. https://doi.org/10.1126/science.abi6487.
  • *Cosquer, A., R. Raymond, and A.-C. Prevot-Julliard. 2012. “Observations of Everyday Biodiversity: A New Perspective for Conservation?” Ecology & Society 17 (4): 82–96.
  • *Crall, A. W., R. Jordan, K. Holfelder, G. J. Newman, J. Graham, and D. M. Waller. 2013. “The Impacts of an Invasive Species Citizen Science Training Program on Participant Attitudes, Behavior, and Science Literacy.” Public Understanding of Science (Bristol, England) 22 (6): 745–764. https://doi.org/10.1177/0963662511434894.
  • *Cronin, D. P., and J. E. Messemer. 2013. “Elevating Adult Civic Science Literacy through a Renewed Citizen Science Paradigm.” Adult Learning 24 (4): 143–150. https://doi.org/10.1177/1045159513499550.
  • *Cronje, R., S. Rohlinger, A. Crall, and G. Newman. 2011. “Does Participation in Citizen Science Improve Scientific Literacy? A Study to Compare Assessment Methods.” Applied Environmental Education & Communication 10 (3): 135–145. https://doi.org/10.1080/1533015X.2011.603611.
  • *Dantas Brites, A., and C. Morsello. 2017. “Beliefs about the Potential Impacts of Exploiting Non-Timber Forest Products Predict Voluntary Participation in Monitoring.” Environmental Management 59 (6): 898–911. https://doi.org/10.1007/s00267-017-0845-0.
  • *Daw, T. M., S. Coulthard, W. W. L. Cheung, K. Brown, C. Abunge, D. Galafassi, G. D. Peterson, T. R. McClanahan, J. O. Omukoto, and L. Munyi. 2015. “Evaluating Taboo Trade-Offs in Ecosystems Services and Human Well-Being.” Proceedings of the National Academy of Sciences of the United States of America 112 (22): 6949–6954. https://doi.org/10.1073/pnas.1414900112.
  • Day, G., R. A. Fuller, C. Nichols, and A. J. Dean. 2022. “Characteristics of Immersive Citizen Science Experiences That Drive Conservation Engagement.” People and Nature 4 (4): 983–995. https://doi.org/10.1002/pan3.10332.
  • *de Toledo, R. F., and L. L. Giatti. 2015. “Challenges to Participation in Action Research.” Health Promotion International 30 (1): 162–173. https://doi.org/10.1093/heapro/dau079.
  • *Dhillon, C. M. 2017. “Using Citizen Science in Environmental Justice: Participation and Decision-Making in a Southern California Waste Facility Siting Conflict.” Local Environment 22 (12): 1479–1496. https://doi.org/10.1080/13549839.2017.1360263.
  • Dillon, J., R. B. Stevenson, and A. E. Wals. 2016. “Introduction to the Special Section Moving from Citizen to Civic Science to Address Wicked Conservation Problems. Corrected by Erratum 12844.” Conservation Biology 30 (3): 450–455. https://doi.org/10.1111/cobi.12689.
  • *Dolrenry, S., L. Hazzah, and L. G. Frank. 2016. “Conservation and Monitoring of a Persecuted African Lion Population by Maasai Warriors.” Conservation Biology: The Journal of the Society for Conservation Biology 30 (3): 467–475. https://doi.org/10.1111/cobi.12703.
  • *Domroese, M. C., and E. A. Johnson. 2017. “Why Watch Bees? Motivations of Citizen Science Volunteers in the Great Pollinator Project.” Biological Conservation 208: 40–47. https://doi.org/10.1016/j.biocon.2016.08.020.
  • Dosemagen, S., and A. Parker. 2019. “Citizen Science Across a Spectrum: Building Partnerships to Broaden the Impact of Citizen Science.” Science & Technology Studies 32 (2): 24–33. https://doi.org/10.23987/sts.60419.
  • *Downs, T. J., L. Ross, D. Mucciarone, M.-C. Calvache, O. Taylor, and R. Goble. 2010. “Participatory Testing and Reporting in an Environmental-Justice Community of Worcester, Massachusetts: A Pilot Project.” Environmental Health: A Global Access Science Source 9 (1): 34–48. https://doi.org/10.1186/1476-069X-9-34.
  • *Doyle, R., and M. Krasny. 2003. “Participatory Rural Appraisal as an Approach to Environmental Education in Urban Community Gardens.” Environmental Education Research 9 (1): 91–115. https://doi.org/10.1080/13504620303464.
  • Drill, S., C. Rosenblatt, C. Cooper, D. Cavalier, and H. Ballard. 2022. “The Effect of the COVID-19 Pandemic and Associated Restrictions on Participation in Community and Citizen Science.” Citizen Science: Theory and Practice 7 (1): 1–4. https://doi.org/10.5334/cstp.463.
  • *Druschke, C. G., and C. E. Seltzer. 2012. “Failures of Engagement: Lessons Learned from a Citizen Science Pilot Study.” Applied Environmental Education & Communication 11 (3–4): 178–188. https://doi.org/10.1080/1533015X.2012.777224.
  • DuBois, B., M. E. Krasny, and J. G. Smith. 2018. “Connecting Brawn, Brains, and People: An Exploration of Non-Traditional Outcomes of Youth Stewardship Programs.” Environmental Education Research 24 (7): 937–954. https://doi.org/10.1080/13504622.2017.1373069.
  • Eitzel, M. V., J. L. Cappadonna, C. Santos-Lang, R. E. Duerr, A. Virapongse, S. E. West, C. C. M. Kyba, et al. 2017. “Citizen Science Terminology Matters: Exploring Key Terms.” Citizen Science: Theory and Practice 2 (1): 2. https://doi.org/10.5334/cstp.113.
  • *Everett, G., and H. Geoghegan. 2016. “Initiating and Continuing Participation in Citizen Science for Natural History.” BMC Ecology 16 (Suppl 1): 13. https://doi.org/10.1186/s12898-016-0062-3.
  • Fanelli, D. 2012. “Negative Results Are Disappearing from Most Disciplines and Countries.” Scientometrics 90 (3): 891–904. https://doi.org/10.1007/s11192-011-0494-7.
  • *Ferreira, M., L. Soares, and F. Andrade. 2012. “Educating Citizens about Their Coastal Environments: Beach Profiling in the Coastwatch Project.” Journal of Coastal Conservation 16 (4): 567–574. https://doi.org/10.1007/s11852-012-0203-6.
  • *Forrester, Tavis D., Megan Baker, Robert Costello, Roland Kays, Arielle W. Parsons, and William J. McShea. 2017. “Creating Advocates for Mammal Conservation through Citizen Science.” Biological Conservation 208: 98–105. https://doi.org/10.1016/j.biocon.2016.06.025.
  • Freedman, D. H. 2010. “Why Scientific Studies Are So Often Wrong: The Streetlight Effect.” Discover Magazine, July–August. http://discovermagazine.com/2010/jul-aug/29-whyscientific-studies-often-wrong-streetlight-effect
  • Friedman, A. J., S. Allen, P. B. Campbell, L. D. Dierking, B. N. Flagg, C. Garibay, R. Korn, G. Silverstein, and D. A. Ucko. 2008. Framework for Evaluating Impacts of Informal Science Education Projects. Report from a National Science Foundation Workshop, Washington, D.C. (Available at: http://insci.org/resources/Eval_Framework.pdf)
  • *Gaddis, E. J. B., H. H. Falk, C. Ginger, and A. Voinov. 2010. “Effectiveness of a Participatory Modeling Effort to Identify and Advance Community Water Resource Goals in St. Albans, Vermont.” Environmental Modelling & Software 25 (11): 1428–1438. https://doi.org/10.1016/j.envsoft.2009.06.004.
  • *Galloway, A. W. E., R. J. Hickey, and G. M. Koehler. 2011. “A Survey of Ungulates by Students along Rural School Bus Routes.” Society & Natural Resources 24 (2): 201–204. https://doi.org/10.1080/08941920903222572.
  • Gosling, E., and K. J. Williams. 2010. “Connectedness to Nature, Place Attachment and Conservation Behaviour: Testing Connectedness Theory among Farmers.” Journal of Environmental Psychology 30 (3): 298–304. https://doi.org/10.1016/j.jenvp.2010.01.005.
  • Grant, M. J., and A. Booth. 2009. “A Typology of Reviews: An Analysis of 14 Review Types and Associated Methodologies.” Health Information and Libraries Journal 26 (2): 91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x.
  • *Grasser, S., C. Schunko, and C. R. Vogl. 2016. “Children as Ethnobotanists: Methods and Local Impact of a Participatory Research Project with Children on Wild Plant Gathering in the Grosses Walsertal Biosphere Reserve, Austria.” Journal of Ethnobiology & Ethnomedicine 12: 1–16.
  • *Gray, Steven, Rebecca Jordan, Alycia Crall, Greg Newman, Cindy Hmelo-Silver, Joey Huang, Whitney Novak, et al. 2017. “Combining Participatory Modelling and Citizen Science to Support Volunteer Conservation Action.” Biological Conservation 208: 76–86. https://doi.org/10.1016/j.biocon.2016.07.037.
  • Groulx, M., M. C. Brisbois, C. J. Lemieux, A. Winegardner, and L. Fishback. 2017. “A Role for Nature-Based Citizen Science in Promoting Individual and Collective Climate Change Action? A Systematic Review of Learning Outcomes.” Science Communication 39 (1): 45–76. https://doi.org/10.1177/1075547016688324.
  • Haklay, M. 2013. “Citizen Science and Volunteered Geographic Information: Overview and Typology of Participation.” In Crowdsourcing Geographic Knowledge, edited by D. Sui, S. Elwood, and M. Goodchild. Dordrecht: Springer. https://doi.org/10.1007/978-94-007-4587-2_7.
  • *Harper, S. 2016. “Keystone Characteristics That Support Cultural Resilience in Karen Refugee Parents.” Cultural Studies of Science Education 11 (4): 1029–1060. https://doi.org/10.1007/s11422-015-9681-9.
  • Harris, E. M., C. G. Dixon, E. B. Bird, and H. L. Ballard. 2020. “For Science and Self: Youth Interactions with Data in Community and Citizen Science.” Journal of the Learning Sciences 29 (2): 224–263. https://doi.org/10.1080/10508406.2019.1693379.
  • *Haworth, B., J. Whittaker, and E. Bruce. 2016. “Assessing the Application and Value of Participatory Mapping for Community Bushfire Preparation.” Applied Geography 76: 115–127. https://doi.org/10.1016/j.apgeog.2016.09.019.
  • Hayes, M., P. S. Smith, and W. R. Midden. 2020. “Students as Citizen Scientists: It’s Elementary.” Science and Children 57 (9): 60–64. https://doi.org/10.1080/00368148.2020.12318579.
  • *Haywood, B. K., J. K. Parrish, and J. Dolliver. 2016. “Place-Based and Data-Rich Citizen Science as a Precursor for Conservation Action.” Conservation Biology: The Journal of the Society for Conservation Biology 30 (3): 476–486. https://doi.org/10.1111/cobi.12702.
  • He, Y., J. K. Parrish, S. Rowe, and T. Jones. 2019. “Evolving Interest and Sense of Self in an Environmental Citizen Science Program.” Ecology and Society 24 (2): 1–24. https://doi.org/10.5751/ES-10956-240233.
  • Heimlich, J. E., and N. M. Ardoin. 2008. “Understanding Behavior to Understand Behavior Change: A Literature Review.” Environmental Education Research 14 (3): 215–237. https://doi.org/10.1080/13504620802148881.
  • Heiss, R., and J. Matthes. 2017. “Citizen Science in the Social Sciences: A Call for More Evidence.” GAIA - Ecological Perspectives for Science and Society 26 (1): 22–26. https://doi.org/10.14512/gaia.26.1.7.
  • *Henly-Shepard, S., S. A. Gray, and L. J. Cox. 2015. “The Use of Participatory Modeling to Promote Social Learning and Facilitate Community Disaster Planning.” Environmental Science & Policy 45: 109–122. https://doi.org/10.1016/j.envsci.2014.10.004.
  • *Hermans, F., W. Haarmann, and J. Dagevos. 2011. “Evaluation of Stakeholder Participation in Monitoring Regional Sustainable Development.” Regional Environmental Change 11 (4): 805–815. https://doi.org/10.1007/s10113-011-0216-y.
  • Herrington, C. D., and R. Maynard. 2019. “Editors’ Introduction: Randomized Controlled Trials Meet the Real World: The Nature and Consequences of Null Findings.” Educational Researcher 48 (9): 577–579. https://doi.org/10.3102/0013189X19891441.
  • *Hesley, D., D. Burdeno, C. Drury, S. Schopmeyer, and D. Lirman. 2017. “Citizen Science Benefits Coral Reef Restoration Activities.” Journal for Nature Conservation 40: 94–99. https://doi.org/10.1016/j.jnc.2017.09.001.
  • *Hobbs, S. J., and P. C. L. White. 2012. “Motivations and Barriers in Relation to Community Participation in Biodiversity Recording.” Journal for Nature Conservation 20 (6): 364–373. https://doi.org/10.1016/j.jnc.2012.08.002.
  • *Hobbs, S. J., and P. C. L. White. 2015. “Achieving Positive Social Outcomes through Participatory Urban Wildlife Conservation Projects.” Wildlife Research 42 (7): 607–617. https://doi.org/10.1071/WR14184.
  • Holland, D., W. Lachicotte, D. Skinner, and C. Cain. 1998. Identity and Agency in Cultural Worlds. Cambridge, MA: Harvard University Press.
  • *Hollow, B., P. E. J. Roetman, M. Walter, and C. B. Daniels. 2015. “Citizen Science for Policy Development: The Case of Koala Management in South Australia.” Environmental Science & Policy 47: 126–136. https://doi.org/10.1016/j.envsci.2014.10.007.
  • Hollweg, K. S., J. R. Taylor, R. W. Bybee, T. J. Marcinkowski, W. C. McBeth, and P. Zoido. 2011. Developing a Framework for Assessing Environmental Literacy. Washington, DC: North American Association for Environmental Education.
  • *Hoover, E. 2016. “‘We’re Not Going to Be Guinea Pigs;’ Citizen Science and Environmental Health in a Native American Community.” Journal of Science Communication 15 (01): A05. https://doi.org/10.22323/2.15010205.
  • *Huang, J., C. E. Hmelo-Silver, R. Jordan, S. Gray, T. Frensley, G. Newman, and M. J. Stern. 2018. “Scientific Discourse of Citizen Scientists: Models as a Boundary Object for Collaborative Problem Solving.” Computers in Human Behavior 87: 480–492. https://doi.org/10.1016/j.chb.2018.04.004.
  • Hughes, K. 2013. “Measuring the Impact of Viewing Wildlife: Do Positive Intentions Equate to Long-Term Changes in Conservation Behaviour?” Journal of Sustainable Tourism 21 (1): 42–59.
  • Israel, B. A., E. Eng, A. J. Schulz, and E. A. Parker. 2013. Introduction to Methods for CBPR for Health. San Francisco, CA: Jossey-Bass.
  • Jacobson, S. K., M. D. McDuff, and M. C. Monroe. 2015. Conservation Education and Outreach Techniques. Oxford, UK: Oxford University Press.
  • Jadallah, C., and H. L. Ballard. 2021. “Social Learning in Participatory Approaches to Conservation and Natural Resource Management: Taking a Sociocultural Perspective.” Ecology and Society 26 (4): 34. https://doi.org/10.5751/ES-12654-260437.
  • Jadallah, C. C., and A. L. Wise. 2023. “Enduring Tensions between Scientific Outputs and Science Learning in Citizen Science.” Biological Conservation 284: 110141. https://doi.org/10.1016/j.biocon.2023.110141.
  • *Jansujwicz, J. S., A. J. K. Calhoun, and R. J. Lilieholm. 2013. “The Maine Vernal Pool Mapping and Assessment Program: Engaging Municipal Officials and Private Landowners in Community-Based Citizen Science.” Environmental Management 52 (6): 1369–1385. https://doi.org/10.1007/s00267-013-0168-8.
  • *Jennett, C., L. Kloetzer, D. Schneider, I. Iacovides, A. L. Cox, M. Gold, B. Fuchs, et al. 2016. “Motivations, Learning and Creativity in Online Citizen Science.” Journal of Science Communication 15 (03): A05. https://doi.org/10.22323/2.15030205.
  • *Joffre, O. M., R. H. Bosma, A. Ligtenberg, V. P. D. Tri, T. T. P. Ha, and A. K. Bregt. 2015. “Combining Participatory Approaches and an Agent-Based Model for Better Planning Shrimp Aquaculture.” Agricultural Systems 141: 149–159. https://doi.org/10.1016/j.agsy.2015.10.006.
  • *Johnson, M. F., C. Hannah, L. Acton, R. Popovici, K. K. Karanth, and E. Weinthal. 2014. “Network Environmentalism: Citizen Scientists as Agents for Environmental Advocacy.” Global Environmental Change 29: 235–245. https://doi.org/10.1016/j.gloenvcha.2014.10.006.
  • Jordan, R. C., H. L. Ballard, and T. B. Phillips. 2012. “Key Issues and New Approaches for Evaluating Citizen-Science Learning Outcomes.” Frontiers in Ecology and the Environment 10 (6): 307–309. https://doi.org/10.1890/110280.
  • *Jordan, R. C., S. A. Gray, D. V. Howe, W. R. Brooks, and J. G. Ehrenfeld. 2011. “Knowledge Gain and Behavioral Change in Citizen-Science Programs.” Conservation Biology: The Journal of the Society for Conservation Biology 25 (6): 1148–1154. https://doi.org/10.1111/j.1523-1739.2011.01745.x.
  • *Jordan, R., W. Brooks, D. Howe, and J. Ehrenfeld. 2012. “Evaluating the Performance of Volunteers in Mapping Invasive Plants in Public Conservation Lands.” Environmental Management 49 (2): 425–434. https://doi.org/10.1007/s00267-011-9789-y.
  • *Jordan, R., S. Gray, A. Sorensen, G. Newman, D. Mellor, C. Hmelo-Silver, S. LaDeau, D. Biehler, and A. Crall. 2016. “Studying Citizen Science through Adaptive Management and Learning Feedbacks as Mechanisms for Improving Conservation.” Conservation Biology: The Journal of the Society for Conservation Biology 30 (3): 487–495. https://doi.org/10.1111/cobi.12659.
  • Jørgensen, F. A., and D. Jørgensen. 2021. “Citizen Science for Environmental Citizenship.” Conservation Biology: The Journal of the Society for Conservation Biology 35 (4): 1344–1347. https://doi.org/10.1111/cobi.13649.
  • Kali, Y., O. Sagy, C. Matuk, and R. Magnussen. 2023. “School Participation in Citizen Science (SPICES): Substantiating a Field of Research and Practice.” Instructional Science 51 (5): 687–694. https://doi.org/10.1007/s11251-023-09638-4.
  • Kloetzer, L., J. Lorke, J. Roche, Y. Golumbic, S. Winter, and A. Jõgeva. 2021. “Learning in Citizen Science.” In The Science of Citizen Science, edited by Vohland, K., et al. Berlin: Springer.
  • *Kopainsky, B., G. Hager, H. Herrera, and P. H. Nyanga. 2017. “Transforming Food Systems at Local Levels: Using Participatory System Dynamics in an Interactive Manner to Refine Small-Scale Farmers’ Mental Models.” Ecological Modelling 362: 101–110. https://doi.org/10.1016/j.ecolmodel.2017.08.010.
  • Korfiatis, K., and S. Petrou. 2021. “Participation and Why It Matters: Children’s Perspectives and Expressions of Ownership, Motivation, Collective Efficacy and Self-Efficacy and Locus of Control.” Environmental Education Research 27 (12): 1700–1722. https://doi.org/10.1080/13504622.2021.1959900.
  • *Koss, R. S., and J. ‘Yotti’ Kingsley. 2010. “Volunteer Health and Emotional Wellbeing in Marine Protected Areas.” Ocean & Coastal Management 53 (8): 447–453. https://doi.org/10.1016/j.ocecoaman.2010.06.002.
  • *Kountoupes, D. L., and K. S. Oberhauser. 2008. “Citizen Science and Youth Audiences: Educational Outcomes of the Monarch Larva Monitoring Project.” Journal of Community Engagement & Scholarship 1 (1): 10–20.
  • Krasny, M. E. 2020. Advancing Environmental Education Practice, 312. Ithaca, N.Y.: Cornell University Press.
  • Kudryavtsev, A., R. C. Stedman, and M. E. Krasny. 2012. “Sense of Place in Environmental Education.” Environmental Education Research 18 (2): 229–250. https://doi.org/10.1080/13504622.2011.609615.
  • Kythreotis, Andrew P., Chrystal Mantyka-Pringle, Theresa G. Mercer, Lorraine E. Whitmarsh, Adam Corner, Jouni Paavola, Chris Chambers, Byron A. Miller, and Noel Castree. 2019. “Citizen Social Science for More Integrative and Effective Climate Action: A Science-Policy Perspective.” Frontiers in Environmental Science 7: 10. https://doi.org/10.3389/fenvs.2019.00010.
  • *Land-Zandstra, A. M., J. L. A. Devilee, F. Snik, F. Buurmeijer, and J. M. van den Broek. 2016. “Citizen Science on a Smartphone: Participants’ Motivations and Learning.” Public Understanding of Science (Bristol, England) 25 (1): 45–60. https://doi.org/10.1177/0963662515602406.
  • *Langsdale, S. M., A. Beall, J. Carmichael, S. J. Cohen, C. B. Forster, and T. Neale. 2009. “Exploring the Implications of Climate Change on Water Resources through Participatory Modeling: Case Study of the Okanagan Basin, British Columbia.” Journal of Water Resources Planning and Management 135 (5): 373–381. https://doi.org/10.1061/(ASCE)0733-9496(2009)135:5(373).
  • *Lewandowski, E. J., and K. S. Oberhauser. 2017a. “Butterfly Citizen Scientists in the United States Increase Their Engagement in Conservation.” Biological Conservation 208: 106–112. https://doi.org/10.1016/j.biocon.2015.07.029.
  • *Lewandowski, E. J., and K. S. Oberhauser. 2017b. “Contributions of Citizen Scientists and Habitat Volunteers to Monarch Butterfly Conservation.” Human Dimensions of Wildlife 22 (1): 55–70. https://doi.org/10.1080/10871209.2017.1250293.
  • Lindgren, S., K. Morris, and A. Price. 2021. “Designing Environmental Storylines to Achieve the Complementary Aims of Environmental and Science Education through Science and Engineering Practices.” The Journal of Environmental Education 52 (4): 239–255. https://doi.org/10.1080/00958964.2021.1949569.
  • *López-Marrero, T., and P. Tschakert. 2011. “From Theory to Practice: Building More Resilient Communities in Flood-Prone Areas.” Environment and Urbanization 23 (1): 229–249. https://doi.org/10.1177/0956247810396055.
  • *Lynch, L. I., J. M. Dauer, W. A. Babchuk, T. Heng-Moss, and D. Golick. 2018. “In Their Own Words: The Significance of Participant Perceptions in Assessing Entomology Citizen Science Learning Outcomes Using a Mixed Methods Approach.” Insects 9 (1): 16. https://doi.org/10.3390/insects9010016.
  • *Malone, Samantha, Matthew Kelso, Drew Michanowicz, Kyle Ferrar, Kyra Naumoff Shields, and Jill Kriesky. 2012. “FracTracker Survey and Case Studies: Application for Participatory GIS in Unconventional Natural Gas Development.” Environmental Practice 14 (4): 342–351. https://doi.org/10.1017/S1466046612000324.
  • Marcinkowski, T. 2003. “Commentary on Rickinson’s ‘Learners and Learning in Environmental Education: A Critical Review of the Evidence’ (EER 7 (3)).” Environmental Education Research 9 (2): 181–214. https://doi.org/10.1080/13504620303474.
  • *McCall, M. K., and P. A. Minang. 2005. “Assessing Participatory GIS for Community-Based Natural Resource Management: Claiming Community Forests in Cameroon.” The Geographical Journal 171 (4): 340–356. https://doi.org/10.1111/j.1475-4959.2005.00173.x.
  • *McEwen, L. 2011. “Approaches to Community Flood Science Engagement: The River Severn Catchment, UK as Case-Study.” The International Journal of Science in Society 2 (4): 159–180. https://doi.org/10.18848/1836-6236/CGP/v02i04/51288.
  • *McGreavy, B., A. J. K. Calhoun, J. Jansujwicz, and V. Levesque. 2016. “Citizen Science and Natural Resource Governance: Program Design for Vernal Pool Policy Innovation.” Ecology and Society 21 (2): 649–659.
  • *Merenlender, A. M., A. W. Crall, S. Drill, M. Prysby, and H. Ballard. 2016. “Evaluating Environmental Education, Citizen Science, and Stewardship through Naturalist Programs.” Conservation Biology: The Journal of the Society for Conservation Biology 30 (6): 1255–1265. https://doi.org/10.1111/cobi.12737.
  • *Metcalf, S. S., E. Wheeler, T. K. BenDor, K. S. Lubinski, and B. M. Hannon. 2010. “Sharing the Floodplain: Mediated Modeling for Environmental Management.” Environmental Modelling & Software 25 (11): 1282–1290. https://doi.org/10.1016/j.envsoft.2008.11.009.
  • Miller-Rushing, A., R. Primack, and R. Bonney. 2012. “The History of Public Participation in Ecological Research.” Frontiers in Ecology and the Environment 10 (6): 285–290. https://doi.org/10.1890/110278.
  • *Mitchell, N., M. Triska, A. Liberatore, L. Ashcroft, R. Weatherill, and N. Longnecker. 2017. “Benefits and Challenges of Incorporating Citizen Science into University Education.” PloS One 12 (11): e0186285. https://doi.org/10.1371/journal.pone.0186285.
  • Moher, D., A. Liberati, J. Tetzlaff, and D. G. Altman, PRISMA Group. 2009. “Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement.” PLoS Medicine 6 (7): e1000097. https://doi.org/10.1371/journal.pmed.1000097.
  • Monroe, M. C., R. R. Plate, A. Oxarart, A. Bowers, and W. A. Chaves. 2019. “Identifying Effective Climate Change Education Strategies: A Systematic Review of the Research.” Environmental Education Research 25 (6): 791–812. https://doi.org/10.1080/13504622.2017.1360842.
  • Mueller, M., D. Tippins, and L. Bryan. 2012. “The Future of Citizen Science.” Democracy & Education 20 (1): Article 2.
  • Muro, M., and P. Jeffrey. 2008. “A Critical Review of the Theory and Application of Social Learning in Participatory Natural Resource Management Processes.” Journal of Environmental Planning and Management 51 (3): 325–344. https://doi.org/10.1080/09640560801977190.
  • *Musesengwa, R., M. J. Chimbari, and S. Mukaratirwa. 2017. “Initiating Community Engagement in an Ecohealth Research Project in Southern Africa.” Infectious Diseases of Poverty 6 (1): 22. https://doi.org/10.1186/s40249-016-0231-9.
  • Nasir, N. I. S., and V. Hand. 2008. “From the Court to the Classroom: Opportunities for Engagement, Learning, and Identity in Basketball and Classroom Mathematics.” Journal of the Learning Sciences 17 (2): 143–179. https://doi.org/10.1080/10508400801986108.
  • National Research Council. 2009. Learning Science in Informal Environments: People, Places, and Pursuits. Committee on Learning Science in Informal Environments, Board on Science Education, Division of Behavioral and Social Sciences and Education, edited by M. A. Feder, A. W. Shouse, B. Lewenstein, and P. Bell. Washington, D.C.: National Academies Press.
  • National Academies of Sciences, Engineering, and Medicine. 2018. Learning through Citizen Science: Enhancing Opportunities by Design. Washington, D.C.: National Academies Press.
  • NGSS Lead States. 2013. Next Generation Science Standards: For States, by States. Washington, D.C.: National Academies Press. https://doi.org/10.17226/18290.
  • *Newman, G., A. Crall, M. Laituri, J. Graham, T. Stohlgren, J. C. Moore, K. Kodrich, and K. A. Holfelder. 2010. “Teaching Citizen Science Skills Online: Implications for Invasive Species Training Programs.” Applied Environmental Education & Communication 9 (4): 276–286. https://doi.org/10.1080/1533015X.2010.530896.
  • *Nicosia, K., S. Daaram, B. Edelman, L. Gedrich, E. He, S. McNeilly, V. Shenoy, et al. 2014. “Determining the Willingness to Pay for Ecosystem Service Restoration in a Degraded Coastal Watershed: A Ninth Grade Investigation.” Ecological Economics 104: 145–151. https://doi.org/10.1016/j.ecolecon.2014.02.010.
  • Olsson, P., and C. Folke. 2001. “Local Ecological Knowledge and Institutional Dynamics for Ecosystem Management: A Study of Lake Racken Watershed, Sweden.” Ecosystems 4 (2): 85–104. https://doi.org/10.1007/s100210000061.
  • Page, Matthew J., Joanne E. McKenzie, Patrick M. Bossuyt, Isabelle Boutron, Tammy C. Hoffmann, Cynthia D. Mulrow, Larissa Shamseer, et al. 2021. “The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews.” Systematic Reviews 10 (1): 89. https://doi.org/10.1186/s13643-021-01626-4.
  • *Parker, E., L. Chung, B. Israel, A. Reyes, and D. Wilkins. 2010. “Community Organizing Network for Environmental Health: Using a Community Health Development Approach to Increase Community Capacity around Reduction of Environmental Triggers.” The Journal of Primary Prevention 31 (1–2): 41–58. https://doi.org/10.1007/s10935-010-0207-7.
  • Parrish, J. K., T. Jones, H. K. Burgess, Y. He, L. Fortson, and D. Cavalier. 2019. “Hoping for Optimality or Designing for Inclusion: Persistence, Learning, and the Social Network of Citizen Science.” Proceedings of the National Academy of Sciences of the United States of America 116 (6): 1894–1901. https://doi.org/10.1073/pnas.1807186115.
  • Pateman, R. M., and S. E. West. 2023. “Citizen Science: Pathways to Impact and Why Participant Diversity Matters.” Citizen Science: Theory and Practice 8 (1): 50. https://doi.org/10.5334/cstp.569.
  • Peter, M., T. Diekötter, and K. Kremer. 2019. “Participant Outcomes of Biodiversity Citizen Science Projects: A Systematic Literature Review.” Sustainability 11 (10): 2780. https://doi.org/10.3390/su11102780.
  • Phillips, T. B., H. L. Ballard, B. V. Lewenstein, and R. Bonney. 2019. “Engagement in Science through Citizen Science: Moving beyond Data Collection.” Science Education 103 (3): 665–690. https://doi.org/10.1002/sce.21501.
  • Phillips, T., N. Porticella, M. Constas, and R. Bonney. 2018. “A Framework for Articulating and Measuring Individual Learning Outcomes from Participation in Citizen Science.” Citizen Science: Theory and Practice 3 (2): 3. https://doi.org/10.5334/cstp.126.
  • *Postma, J., J. Peterson, M. J. Ybarra Vega, C. Ramon, and G. Cortes. 2014. “Latina Youths’ Perceptions of Children’s Environmental Health Risks in an Agricultural Community.” Public Health Nursing (Boston, Mass.) 31 (6): 508–516. https://doi.org/10.1111/phn.12112.
  • Purdam, K. 2014. “Citizen Social Science and Citizen Data? Methodological and Ethical Challenges for Social Research.” Current Sociology 62 (3): 374–392. https://doi.org/10.1177/0011392114527997.
  • *Quandt, S. A., A. M. Doran, P. Rao, J. A. Hoppin, B. M. Snively, and T. A. Arcury. 2004. “Reporting Pesticide Assessment Results to Farmworker Families: Development, Implementation, and Evaluation of a Risk Communication Strategy.” Environmental Health Perspectives 112 (5): 636–642. https://doi.org/10.1289/ehp.6754.
  • Receveur, A., L. Poulet, B. Dalmas, B. Gonçalves, and A. Vernay. 2022. “Citizen Science: How to Extend Reciprocal Benefits from the Project Community to the Broader Socio-Ecological System.” Quantitative Plant Biology 3: e20. https://doi.org/10.1017/qpb.2022.16.
  • Reid, A., J. Dillon, N. Ardoin, and J. A. Ferreira. 2021. “Scientists’ Warnings and the Need to Reimagine, Recreate, and Restore Environmental Education.” Environmental Education Research 27 (6): 783–795. https://doi.org/10.1080/13504622.2021.1937577.
  • Roth, W. M., and A. Calabrese Barton. 2004. Rethinking Scientific Literacy. New York: Routledge.
  • Roth, W. M., and S. Lee. 2004. “Science Education as/for Participation in the Community.” Science Education 88 (2): 263–291. https://doi.org/10.1002/sce.10113.
  • *Sabai, Daniel, and Heila Lotz-Sisitka. 2013. “Analyzing Learning at the Interface of Scientific and Traditional Ecological Knowledge in a Mangrove Ecosystem Restoration Scenario in the Eastern Coast of Tanzania.” Transylvanian Review of Systematical and Ecological Research 15 (2): 185–210. https://doi.org/10.2478/trser-2013-0027.
  • Sadiković, S., B. Branovački, M. Oljača, D. Mitrović, D. Pajić, and S. Smederevac. 2020. “Daily Monitoring of Emotional Responses to the Coronavirus Pandemic in Serbia: A Citizen Science Approach.” Frontiers in Psychology 11: 2133. https://doi.org/10.3389/fpsyg.2020.02133.
  • *Sandker, M., B. M. Campbell, M. Ruiz-Pérez, J. A. Sayer, R. Cowling, H. Kassa, and A. T. Knight. 2010. “The Role of Participatory Modeling in Landscape Approaches to Reconcile Conservation and Development.” Ecology and Society 15 (2): 1–16.
  • *Scheuch, M., T. Panhuber, S. Winter, J. Kelemen-Finan, M. Bardy-Durchhalter, and S. Kapelari. 2018. “Butterflies & Wild Bees: Biology Teachers’ PCK Development through Citizen Science.” Journal of Biological Education 52 (1): 79–88. https://doi.org/10.1080/00219266.2017.1405530.
  • Schultz, P. W. 2001. “The Structure of Environmental Concern: Concern for Self, Other People, and the Biosphere.” Journal of Environmental Psychology 21 (4): 327–339.
  • *Scott, C. M. 2016. “Using Citizen Science to Engage Preservice Elementary Educators in Scientific Fieldwork.” Journal of College Science Teaching 46 (2): 37–41. https://doi.org/10.2505/4/jcst16_046_02_37.
  • *Scyphers, S. B., S. P. Powers, J. L. Akins, J. M. Drymon, C. W. Martin, Z. H. Schobernd, P. J. Schofield, R. L. Shipp, and T. S. Switzer. 2015. “The Role of Citizens in Detecting and Responding to a Rapid Marine Invasion.” Conservation Letters 8 (4): 242–250. https://doi.org/10.1111/conl.12127.
  • Shah, H. R., and L. R. Martinez. 2016. “Current Approaches in Implementing Citizen Science in the Classroom.” Journal of Microbiology & Biology Education 17 (1): 17–22. https://doi.org/10.1128/jmbe.v17i1.1032.
  • Shirk, Jennifer L., Heidi L. Ballard, Candie C. Wilderman, Tina Phillips, Andrea Wiggins, Rebecca Jordan, Ellen McCallie, et al. 2012. “Public Participation in Scientific Research: A Framework for Deliberate Design.” Ecology and Society 17 (2): 29. https://doi.org/10.5751/ES-04705-170229.
  • *Sickler, J., T. M. Cherry, L. Allee, R. R. Smyth, and J. Losey. 2014. “Scientific Value and Educational Goals: Balancing Priorities and Increasing Adult Engagement in a Citizen Science Project.” Applied Environmental Education & Communication 13 (2): 109–119. https://doi.org/10.1080/1533015X.2014.947051.
  • *Sîrbu, A., M. Becker, S. Caminiti, B. De Baets, B. Elen, L. Francis, P. Gravino, et al. 2015. “Participatory Patterns in an International Air Quality Monitoring Initiative.” PloS One 10 (8): e0136763. https://doi.org/10.1371/journal.pone.0136763.
  • Sitzmann, T., K. Ely, K. G. Brown, and K. N. Bauer. 2010. “Self-Assessment of Knowledge: A Cognitive Learning or Affective Measure?” Academy of Management Learning & Education 9 (2): 169–191. https://doi.org/10.5465/AMLE.2010.51428542.
  • *Staddon, S. C., A. Nightingale, and S. K. Shrestha. 2015. “Exploring Participation in Ecological Monitoring in Nepal’s Community Forests.” Environmental Conservation 42 (3): 268–277. https://doi.org/10.1017/S037689291500003X.
  • Stepenuck, K. F., and L. T. Green. 2015. “Individual-and Community-Level Impacts of Volunteer Environmental Monitoring: A Synthesis of Peer-Reviewed Literature.” Ecology and Society 20 (3): 19. https://doi.org/10.5751/ES-07329-200319.
  • Stevenson, K. T., M. N. Peterson, H. D. Bondell, A. G. Mertig, and S. E. Moore. 2013. “Environmental, Institutional, and Demographic Predictors of Environmental Literacy among Middle School Children.” PloS One 8 (3): e59519. https://doi.org/10.1371/journal.pone.0059519.
  • *Storey, R. G., A. Wright-Stow, E. Kin, R. J. Davies-Colley, and R. Stott. 2016. “Volunteer Stream Monitoring: Do the Data Quality and Monitoring Experience Support Increased Community Involvement in Freshwater Decision Making?” Ecology and Society 21 (4): 32. https://doi.org/10.5751/ES-08934-210432.
  • *Sullivan, J., S. Petronella, E. Brooks, L. Rudkin, J. Ward, and M. Murillo. 2007. “Using Participatory Action Research Methods to Troubleshoot Communications Issues in an NIEHS Environmental Justice Partnerships Project.” Texas Public Health Association Journal 59 (2): 45–48.
  • Tauginienė, Loreta, Eglė Butkevičienė, Katrin Vohland, Barbara Heinisch, Maria Daskolia, Monika Suškevičs, Manuel Portela, Bálint Balázs, and Baiba Prūse. 2020. “Citizen Science in the Social Sciences and Humanities: The Power of Interdisciplinarity.” Palgrave Communications 6 (1): 1–11. https://doi.org/10.1057/s41599-020-0471-y.
  • Taylor, K. H., and R. Hall. 2013. “Counter-Mapping the Neighborhood on Bicycles: Mobilizing Youth to Reimagine the City.” Technology, Knowledge and Learning 18 (1–2): 65–93. https://doi.org/10.1007/s10758-013-9201-5.
  • Thomas, R., T. Teel, B. Bruyere, and S. Laurence. 2018. “Metrics and Outcomes of Conservation Education: A Quarter Century of Lessons Learned.” Environmental Education Research 25 (2): 172–192. https://doi.org/10.1080/13504622.2018.1450849.
  • Tuck, E., and M. McKenzie. 2014. Place in Research: Theory, Methodology, and Methods. New York: Routledge.
  • *van Rijsoort, J., and Z. Jinfeng. 2005. “Participatory Resource Monitoring as a Means for Promoting Social Change in Yunnan, China.” Biodiversity and Conservation 14 (11): 2543–2573. https://doi.org/10.1007/s10531-005-8377-y.
  • Vasiliades, M. A., A. C. Hadjichambis, D. Paraskeva-Hadjichambi, A. Adamou, and Y. Georgiou. 2021. “A Systematic Literature Review on the Participation Aspects of Environmental and Nature-Based Citizen Science Initiatives.” Sustainability 13 (13): 7457. https://doi.org/10.3390/su13137457.
  • *Vitone, T., K. A. Stofer, M. S. Steininger, J. Hulcr, R. Dunn, and A. Lucky. 2016. “School of Ants Goes to College: Integrating Citizen Science into the General Education Classroom Increases Engagement with Science.” Journal of Science Communication 15 (01): A03. https://doi.org/10.22323/2.15010203.
  • *van der Wal, R., N. Sharma, C. Mellish, A. Robinson, and A. Siddharthan. 2016. “The Role of Automated Feedback in Training and Retaining Biological Recorders for Citizen Science.” Conservation Biology 30 (3): 550–561. https://doi.org/10.1111/cobi.12705.
  • Wals, A. E., M. Brody, J. Dillon, and R. B. Stevenson. 2014. “Convergence between Science and Environmental Education.” Science 344 (6184): 583–584. https://doi.org/10.1126/science.1250515.
  • *West, S. E., and R. M. Pateman. 2016. “Recruiting and Retaining Participants in Citizen Science: What Can Be Learned from the Volunteering Literature?” Citizen Science: Theory and Practice 1 (2): 15. https://doi.org/10.5334/cstp.8.
  • *Whitman, G. P., R. Pain, and D. G. Milledge. 2015. “Going with the Flow? Using Participatory Action Research in Physical Geography.” Progress in Physical Geography: Earth and Environment 39 (5): 622–639. https://doi.org/10.1177/0309133315589707.
  • Wiggins, A., and K. Crowston. 2011. “From Conservation to Crowdsourcing: A Typology of Citizen Science.” In 2011 44th Hawaii International Conference on System Sciences, 1–10.
  • Williams, K. A., T. E. Hall, and K. O’Connell. 2021. “Classroom-Based Citizen Science: Impacts on Students’ Science Identity, Nature Connectedness, and Curricular Knowledge.” Environmental Education Research 27 (7): 1037–1053. https://doi.org/10.1080/13504622.2021.1927990.
  • *Wilson, S. M., D. Campbell, K. Burwell, L. Rice, and E. M. Williams. 2012. “Assessment and Impact of a Summer Environmental Justice and Health Enrichment Program: A Model for Pipeline Development.” Environmental Justice 5 (6): 279–286. https://doi.org/10.1089/env.2012.0014.
  • *Wing, S., R. A. Horton, N. Muhammad, G. R. Grant, M. Tajik, and K. Thu. 2008. “Integrating Epidemiology, Education, and Organizing for Environmental Justice: Community Health Effects of Industrial Hog Operations.” American Journal of Public Health 98 (8): 1390–1397. https://doi.org/10.2105/AJPH.2007.110486.
  • *Wright, D. R., L. G. Underhill, M. Keene, and A. T. Knight. 2015. “Understanding the Motivations and Satisfactions of Volunteers to Improve the Effectiveness of Citizen Science Programs.” Society & Natural Resources 28 (9): 1013–1029. https://doi.org/10.1080/08941920.2015.1054976.
  • Yan, S., A. I. Race, H. L. Ballard, E. Bird, S. Henson, E. F. Portier, A. Lindell, M. Ghadiri Khanaposhtani, J. M. Miller, and E. R. Schectman. 2023. “How Can Participating in a Forest Community and Citizen Science Program Support Elementary School Students’ Understanding of Socio-Ecological Systems?” Sustainability 15 (24): 16832. https://doi.org/10.3390/su152416832.
  • *Zeegers, Y., K. Paige, D. Lloyd, and P. Roetman. 2012. “‘Operation Magpie’: Inspiring Teachers’ Professional Learning through Environmental Science.” Australian Journal of Environmental Education 28 (1): 27–41. https://doi.org/10.1017/aee.2012.4.