Review article

Meeting volunteer expectations — a review of volunteer motivations in citizen science and best practices for their retention through implementation of functional features in CS tools

Pages 2089-2113 | Received 18 Jul 2019, Accepted 30 Oct 2020, Published online: 02 Feb 2021

Abstract

Citizen Science (CS) projects vary greatly. The aims and goals of a CS project determine the type of citizen involvement and the tools to be used, which in most cases also entail information and communication technology (ICT) that facilitates public participation in scientific research. Resource limitations in CS projects often require adopting suboptimal tools, which may come with hidden costs stemming from poor usability and underwhelming functionality, thus reducing volunteers’ motivation. Meeting the volunteers’ expectations by designing or using existing tools with functional features that fulfill and nurture their motivations will foster long-term participation and contribute to project sustainability. This paper reviews the types of CS projects, volunteer motivations and retention strategies from the literature and classifies them thematically. This is distilled into guidance that can help CS practitioners design and implement CS tools and plan and manage CS projects that better serve their scientific and volunteer-related goals.

1. Introduction

Citizen Science (CS) refers to the voluntary public engagement in scientific research activities, such as collecting and processing environmental data and helping to answer real-world scientific questions with their intellectual effort, knowledge of their surroundings or tools and other resources (Bonney, Cooper, et al. Citation2009; Sanz et al. Citation2014; Silvertown Citation2009). Methods to practice CS have undergone a revolution over the last two decades, mainly due to advances in Information and Communication Technologies (ICT), sensor technologies and social networking, which have resulted in new tools that foster open, efficient and agile systems (Sanz et al. Citation2014; Silvertown Citation2009; Wald, Longo, and Dobell Citation2016). Many CS projects rely on surveys completed by volunteers who report objective or subjective observations about their environment, a task made ever easier by modern technologies such as smartphone apps that facilitate the interaction between scientists and volunteers. Many projects require volunteers to submit data on, e.g. biodiversity and environmental phenomena or conditions using image acquisition, web- and smartphone-based applications and surveys (Couvet and Prevot Citation2015; Havlik and Schimak Citation2014; Wallace, Snedigar, and Cameron Citation2015). More and more CS projects also explore the use of low-cost devices, including handheld or portable sensor systems (Aspuru et al. Citation2016; Thompson Citation2016; Uhrner et al. Citation2014). Data collected are increasingly visualized in web-based portals or via smartphone apps in near real time.
Moreover, there is a gradual shift toward participatory, bottom-up and co-design processes, where citizens are not merely data collectors receiving one-way communication, as in traditional contributory crowdsourcing projects, but are involved in advanced participatory communication models addressing community needs (Mominó, Piera, and Jurado Citation2016; Sanz et al. Citation2014; Shirk et al. Citation2012). In these more reciprocal and inclusive projects, such as Citizen Observatory (CO) projects, the aim is to directly benefit the citizens and society generally, rather than science alone (Grainger Citation2017). These projects strengthen social capital, collective intelligence, scientific capacity, and inclusiveness in local decision making (Aspuru, Herranz-Pascual, and Santander Citation2016; Dickinson et al. Citation2012; Wehn et al. Citation2018). They also provide the public with scientific information that is accessible in forms they can understand and utilize (Golumbic, Fishbain, and Baram-Tsabari Citation2019).

To make sure that the tools used in CS are of scientific value and provide meaningful information to the volunteers, many authors call for more user involvement in the design of the services (Golumbic, Fishbain, and Baram-Tsabari Citation2019; Preece Citation2016; Robinson et al. Citation2018; Sanz et al. Citation2014; Skarlatidou et al. Citation2019). It is frequently reported that the lack of user involvement can cause problems both for the volunteers and for the scientists, affecting motivation and data quality; to this end, it is suggested that Human-Computer Interaction (HCI) research should be incorporated in Citizen Science (Preece Citation2016). If data collection is too complicated or too time-consuming, volunteers often lose their initial desire and drive to participate (Robinson et al. Citation2018; See, Fritz et al. Citation2016).

The advantages of well-designed ICT products are numerous; for example, they enable CS projects to engage volunteers more efficiently, increase scientific productivity and keep volunteers engaged for a longer time (Wald, Longo, and Dobell Citation2016). Further, involving users in the product design process increases their satisfaction with the final product (Mahmood et al. Citation2000). Moreover, understanding volunteers’ motivation and fulfilling their expectations increase their satisfaction, which in turn determines their level of participation (Wright et al. Citation2015). Consequently, involving the citizens at various design stages and incorporating their feedback increases their expectations and capacity for participation, while conversely, failing to do so inhibits satisfaction and participation (Mahmood et al. Citation2000).

Despite the apparent benefits of involving citizens in the design process of CS projects and tools, their involvement is not yet mainstream practice. Resource limitations in projects usually constrain inclusive involvement of citizens, leading to the use of tools which might lack appropriate functionality (Wiggins Citation2013). Furthermore, limiting crowdsourcing to a proof-of-concept level, e.g. to test and develop a new CS tool, might be demotivating for volunteers who are otherwise altruistic and ready to contribute substantially (Havlik and Schimak Citation2014). Knowing which tool features are known to work can help bridge the gap for those who acknowledge the benefits of involving volunteers in the design process but lack the resources to do so. Few authors have attempted to create guidelines to facilitate the overall design process. For example, Jennett and Cox (Citation2014) provided guidelines for virtual CS projects focusing on website interfaces. Similarly, Skarlatidou et al. (Citation2019) summarized best practices for CS applications, with a focus on web interface features, based on a systematic literature review. Our paper extends the scope of these technical guidelines to cover the broader topic of volunteer motivation, the driving force that determines the types of technical features CS tools need in order to meet volunteer expectations and retain their interest long-term. The aim of the paper is to synthesize current knowledge from the CS literature with respect to i) understanding types of CS-related projects and the characteristics of the volunteers involved, including participation patterns, ii) grouping common volunteer motivations according to functional categories, and iii) summarizing strategies for retaining volunteers, with examples of functional features for CS tools.

This literature review draws best practices from inclusive CS projects including Citizen Observatories, since the trend is toward collaboration and co-design in line with the level of understanding and involvement of the volunteers. It aims at helping to manage volunteer expectations better and ultimately supporting the planning and management of successful and sustainable CS projects.

The work was conducted within the H2020 SMURBS/ERA-PLANET project under a wider literature review on Citizen Observatories, which covered 352 articles found in SCOPUS, Web of Science and PubMed in July 2018. The detailed methodology can be found in Robinson (Citation2019). For the purpose of this manuscript, with its narrower focus on citizen involvement, only articles which address inclusive citizen science were considered. Additional articles were reviewed for specific sections to improve readability and to cover a joint discussion of motivations, volunteer experiences and suggestions for design.

2. Characteristics of CS projects and volunteers

What is common to all CS projects is that they involve two competent communities: the scientists, and the volunteers who participate as citizen scientists. The experts and CS practitioners involved in the projects include, but are not limited to, environmental scientists, social scientists, hardware and software specialists, and data analysts. Depending on the type of CS project, the target group of volunteers and other stakeholders might involve locals living nearby who are affected by the issue at hand (Grossberndt and Liu Citation2016; Uhrner et al. Citation2014), schools (Hunt et al. Citation2015; Kobernus et al. Citation2013; Mominó, Piera, and Jurado Citation2016), representatives of local officials (Mietlicki, Gaudibert, and Vincent Citation2012) and do-it-yourself (DIY) communities (Busch et al. Citation2016). In addition, external stakeholders such as scientists from other domains, NGOs, and government agencies may want to harness the data for various purposes, such as complementing their own data sets (Cooper et al. Citation2017; Ferster and Coops Citation2013; Hunt et al. Citation2015; Wehn and Evers Citation2015). It is observed that people who take part in CS projects are generally highly educated and belong to middle and upper socioeconomic classes (Soleri et al. Citation2016), are retired (Wright et al. Citation2015) or have an already established interest in the topic (Mazumdar et al. Citation2017).

The specific aims and goals of a Citizen Science project determine what kind of volunteers it seeks to engage and what kind of tools are best suited for its purpose (Wiggins and Crowston Citation2011). To meet the expectations of volunteers and CS practitioners, who typically consist of experts with various backgrounds (some without previous experience of CS), communicating the objectives early on is crucial (Grossberndt and Liu Citation2016). These are commonly written using technical terms and need to be translated into a common language to be easily understood by the layperson (Sanz et al. Citation2014). Understanding and using the correct terminology can help define the aims already outlined in the project proposal at the design stage, while reducing misunderstanding later on by clarifying participants’ roles, responsibilities and limitations (Grossberndt and Liu Citation2016; Wiggins and Crowston Citation2011). There is currently no single definition of CS, but there is a plethora of different types of CS projects that reveal the dynamics of this evolving research field (Sanz et al. Citation2014). The terminology used in the literature is summarized in Table 1. The definitions provide additional information about the level and type of participation.

Table 1. Citizen Science terminology.

In addition to the terminologies, CS projects can be classified according to different levels of engagement, e.g. contributory, collaborative and co-designed CS projects (Bonney, Ballard, et al. Citation2009; Shirk et al. Citation2012). The involvement of the public and the amount of control over different steps of the project increase along this classification. In contributory projects the volunteers are merely involved in data collection, whereas in collaborative projects, in addition to data collection, they can also analyze and disseminate data and participate in the project design. Co-designed projects present the highest degree of engagement, including the co-design phase as well as active involvement in most or all aspects of the research itself. While some projects aim at long-term volunteer commitment (e.g. birding communities [Sullivan et al. Citation2009]), others might rely on event-driven volunteer contributions, e.g. monitoring ash fall (Wallace, Snedigar, and Cameron Citation2015), earthquakes (Dell'Acqua and De Vecchi Citation2017), and invasive species (Crall et al. Citation2011).

The level and pattern of volunteer participation, as well as satisfaction, are known to vary and evolve (Ferster and Coops Citation2013; Grainger Citation2017; Houghton et al. Citation2019; Jennett et al. Citation2016; Wright et al. Citation2015). Some contribute consistently, others in bursts of activity, while others contribute more sporadically (Ferster and Coops Citation2013). Moreover, Grainger (Citation2017) points out that some observing activities are regular and thus advocates that the volunteers’ activities become institutionalized in the sense of having recurring patterns. In reality, however, this might be disrupted by unforeseen incidents such as an illness or other causes which hinder routine. Volunteers’ contribution levels might indeed plummet due to loss of interest or time, or because of changing priorities (Jennett et al. Citation2016). The decision of volunteers to contribute is influenced by whether the particular activity fits with the volunteers’ needs and goals (Clary and Snyder Citation1999; Wright et al. Citation2015). That is why understanding volunteer psychology, e.g. their motivations, will promote the efficient gathering of quality data while optimizing the benefits for the volunteers (Wright et al. Citation2015). Familiarity with the theories that explain participation patterns, e.g. the Technology Acceptance Model (TAM), the Diffusion of Innovations Theory (DIT), the Theory of Planned Behavior (TPB) and the 90-9-1 rule of participation inequality, will help CS practitioners understand the different roles and characteristics as well as the information needs of citizen scientists (Haklay Citation2016; Lai Citation2017). The TAM, introduced by Davis in 1986, focuses on the acceptance of information technologies to explain user behavior. It comprises the components Perceived Usefulness (PU) and Perceived Ease of Use (PEU).
The DIT, developed by Rogers in 1995, helps researchers understand the acceptance and adoption of innovations among individuals and organizations over time. Rogers’ famous S-shaped curve distinguishes the innovators, early adopters, early majority, late majority and laggards. The TPB, developed by Ajzen in 1991, models a person’s behavioral intention toward a particular behavior as a function of attitude, subjective norms and perceived behavioral control. Participation inequality, or the 90-9-1 rule described by Nielsen in 2006, is the phenomenon whereby a small percentage of volunteers contribute a significant proportion of the total output, while the majority are less involved. For example, in the well-known Zooniverse project, only 4–7% of volunteers actively contribute over long periods (Sauermann and Franzoni Citation2015), which implies that even the most successful projects need robust mechanisms to increase and sustain participation. The upside, however, is that these “super volunteers” are most likely able to contribute consistently and submit good quality data (Jennett et al. Citation2016).
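Participation inequality is straightforward to monitor from a project's own contribution logs. The following minimal sketch (in Python; the counts and function name are purely hypothetical, not drawn from any reviewed project) computes the share of total contributions made by the most active fraction of volunteers:

```python
def contribution_shares(counts, top_fraction=0.01):
    """Return the share of all contributions made by the most active
    `top_fraction` of volunteers.

    counts: per-volunteer contribution totals (hypothetical data).
    """
    ranked = sorted(counts, reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:top_n]) / sum(ranked)

# Hypothetical activity log: one "super volunteer", nine occasional
# contributors and ninety one-off participants (100 volunteers in total)
counts = [500] + [30] * 9 + [1] * 90
share_top_1_percent = contribution_shares(counts, 0.01)  # ≈ 0.58
```

Tracking such a figure over time would show a practitioner whether retention measures are broadening the contributor base or merely intensifying the activity of the existing core.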

Similarly, several authors have identified different volunteer roles and participation characteristics. For example, Cooper et al. (Citation2017) divide volunteers into two categories, data collectors and data consumers, whereas Mominó, Piera, and Jurado (Citation2016) introduce the so-called “makers”, “observers” and “analyzers”, with different levels and roles of participation. More specifically, the makers are people with abilities and interests in the technologies and the capacity to build DIY observation devices, the observers collect observations, and the analyzers interpret the data. Following a different classification, Kobernus et al. (Citation2013) identify six citizen roles: the Observer makes the initial observation, the Publisher makes the observation discoverable by others, the Discoverer finds an existing data resource, the Service Provider makes the information accessible to others to use and download, the Service Orchestrator combines existing services for a distinct purpose, e.g. a smartphone application, and the Decision Maker exploits the data to make an informed decision.

3. Motivation of CS volunteers

Motivation drives people to take action (Kaufmann, Schulze, and Veit Citation2011). The higher the motivation, the more the volunteers are satisfied with the project outcomes and willing to continue to contribute (Clary and Snyder Citation1999; Wright et al. Citation2015). Volunteerism can be multi-motivational, a volunteer having both altruistic and egoistic reasons to participate, e.g. wanting both to help selflessly and to benefit oneself (Clary and Snyder Citation1999). Motivation can be classified into intrinsic and extrinsic motivation. Intrinsically motivated individuals might, for example, participate because the project is fun, while extrinsically motivated individuals might be motivated by short- or long-term payoffs, such as direct compensation or improved skills (Kaufmann, Schulze, and Veit Citation2011). Intrinsically motivated people are more likely to stay committed for longer (See, Fritz et al. Citation2016; Wright et al. Citation2015), while incentives such as rewards might have the opposite effect on volunteerism, due to the perception of external rather than personal control (Clary and Snyder Citation1999; Gharesifard and Wehn Citation2016). Clary and Snyder (Citation1999) indeed observed that volunteerism can be induced in the short term with incentives, i.e. external control, yet a stronger perception of personal control over volunteering, i.e. participating for altruistic reasons, is the driver of contribution in the long run. The level of participation, as well as the goals and attitudes driving it, can vary over time (Kaufmann, Schulze, and Veit Citation2011; Wright et al. Citation2015). Initial interest in participating might be triggered by, e.g., curiosity, interest in science and the desire to contribute to research, factors that may not necessarily sustain ongoing engagement (Jennett et al. Citation2016).

The motivational factors reported in the literature are thematically grouped and summarized in Table 2. Clary and Snyder’s (Citation1999) and Wright et al.’s (Citation2015) Volunteer Functions Inventory (VFI) was used to classify the different factors into five categories: (i) Values, (ii) Personal development, (iii) Career and recognition, (iv) Social, and (v) Recreation. Each is provided with an example that binds it to the concept of CS. Volunteer motivation can be studied through stakeholder and volunteer surveys as well as focus groups, and analyzed with qualitative content analyses, e.g. with coding, or with quantitative research methods using statistical tests (Clary and Snyder Citation1999; Kaufmann, Schulze, and Veit Citation2011; Robinson et al. Citation2018; Wright et al. Citation2015). Surveys should be designed together with scientists experienced in the social sciences and, if necessary, approved by the researchers’ ethics committee (Druschke and Seltzer Citation2012).

Table 2. Summary of volunteer motivations.

Intrinsic motivation drives many people to volunteer. Protecting and improving the environment and making it a better place to live are amongst the best-documented motivations in environmental CS projects (Bruyere and Rappe Citation2007; Havlik and Schimak Citation2014; See, Fritz et al. Citation2016). Many feel a civic responsibility to address certain issues affecting local communities (Hunt et al. Citation2015; Montargil and Santos Citation2017b; See, Fritz et al. Citation2016) and like to see the impact of their contribution on policy making, which gives them a feeling of empowerment (See, Fritz et al. Citation2016). Contributing to science is also a strong motivation (Dickinson et al. Citation2012; Grainger Citation2017; Robinson et al. Citation2018; See, Fritz et al. Citation2016; Wright et al. Citation2015).

Citizen science projects can offer informal learning (Hunt et al. Citation2015) and indirect community learning (Jennett et al. Citation2016) opportunities and contribute to robust learning outcomes (Dickinson et al. Citation2012). Some volunteers want to learn more about familiar places, i.e. their neighborhood (Devictor, Whittaker, and Beltrame Citation2010; Grainger Citation2017; Robinson et al. Citation2018) and the natural environment (Bruyere and Rappe Citation2007), while at the same time gaining valuable and relevant information about the topics addressed in the project (Grainger Citation2017; Montargil and Santos Citation2017b). Others are interested in the science behind the project (See, Fritz et al. Citation2016) and the technicalities of data collection and analysis, which also enables them to make new “discoveries” while exploiting the collected data (Grainger Citation2017; Robinson et al. Citation2018). Curiosity may also drive many volunteers to participate (Devictor, Whittaker, and Beltrame Citation2010; Havlik and Schimak Citation2014).

In addition to informal learning opportunities, CS projects can help people gain valuable knowledge useful for their current or future career, especially in Science, Technology, Engineering and Mathematics (STEM) (Hunt et al. Citation2015). Volunteers also expect to gain some recognition for their input, e.g. through feedback and interaction with the CS community and scientists (Grainger Citation2017; Havlik and Schimak Citation2014; Kotovirta et al. Citation2015; See, Fritz et al. Citation2016). Some common incentives are monetary rewards (Kotovirta et al. Citation2015) and prizes (See, Fritz et al. Citation2016). However, Clary and Snyder (Citation1999) and Gharesifard and Wehn (Citation2016) warn about their demotivating effect because of their perceived external control, and counter that a greater perceived personal control will lead to stronger motivations and greater intentions to volunteer. Recognition of volunteers’ contribution can also be accomplished by providing them with an opportunity to perform more complex tasks, which in turn requires more responsibility, allowing them to achieve “expert status” in the project (See, Fritz et al. Citation2016). An example of this approach is giving experienced participants the role of moderating discussion forums (Jennett et al. Citation2016). For others, it is important to feel like an active participant and co-owner of the project (See, Fritz et al. Citation2016). Bruyere and Rappe (Citation2007) and Wright et al. (Citation2015) emphasize the need to recognize the significance of volunteers’ contributions, which will satisfy and enhance the volunteers’ intrinsic motivation.

Participating in activities in a community of likeminded people is an important social motivation (Bruyere and Rappe Citation2007; Dickinson et al. Citation2012; Grainger Citation2017; Montargil and Santos Citation2017b; See, Fritz et al. Citation2016; Wright et al. Citation2015). The social interaction with people with similar interests can be perceived as an important means of socializing and making new friends (Bruyere and Rappe Citation2007; See, Fritz et al. Citation2016).

Being able to connect the CS activity with a volunteer’s existing recreational activities will nurture the volunteer’s motivation to keep it fun (Grainger Citation2017; Havlik and Schimak Citation2014; See, Fritz et al. Citation2016). These activities can take place outdoors while performing observations in the natural environment, e.g. in biodiversity-related CS projects, or by performing tasks using a computer at home (online CS projects) (Grainger Citation2017; Tserstou et al. Citation2017; Wright et al. Citation2015).

4. Retention of CS volunteers

Retaining motivated volunteers is crucial for the sustainability of a CS project (See, Fritz et al. Citation2016). If citizens do not see any value and usefulness in their participation, loss of interest and abandonment of effort are likely (Ferster and Coops Citation2013; Wright et al. Citation2015), as is a lack of initial engagement (Gharesifard and Wehn Citation2016; Wright et al. Citation2015). In addition to the common constraint of lack of time, other barriers to sustained involvement include the complexity or mundaneness of the tasks, or even apprehension about submitting incorrect data (Jennett et al. Citation2016). See, Fritz et al. (Citation2016) and Dickinson et al. (Citation2012) suggest communication strategies which take into account both volunteer recruitment and retention. In order to get the word out when recruiting volunteers, Dickinson et al. (Citation2012) suggest using well-timed press releases that are picked up by national and local media. Newsletters, blogs, and social-networking groups help to create a sense of community, in addition to active forms of communication, such as contests, badging systems and methods to recognize the effort of the volunteers (Dickinson et al. Citation2012). Providing rapid feedback to volunteers and regular communication on their contributions have also been shown to work in terms of retaining a community (See, Fritz et al. Citation2016). Clary and Snyder (Citation1999) emphasize that recruitment strategies which address the specific motivational functions underlying behavior and attitudes are the most successful. Montargil and Santos (Citation2017a) underline the critical balance between the participation effort and return benefits. Many projects lack motivation mechanisms and rely on volunteers’ goodwill and interest to assist in research (Kotovirta et al. Citation2015). However, several approaches have been established to motivate volunteers.
These approaches are summarized in Table 3, with examples of functional features.

Table 3. Summary of approaches to retain volunteers.

The above-listed features match the volunteers’ motivational functions, which CS practitioners should nurture. For example, Havlik and Schimak (Citation2014) argue that motivation determines the fate of the built infrastructure, and that an application designed to be used by volunteers should therefore be fun, nourish curiosity and enable the user to make a difference, in order for them to stay motivated. Table 3 can function as a checklist for CS practitioners: first, during the design phase of the project, to include as many features as possible, and secondly, during evaluation, to check whether these points were met, allowing for corrective actions or a post-hoc assessment of the user-centricity of the CS project. The analysis, however, should take into account certain tradeoffs between the features of Table 3 and other requirements of the project, such as robustness and scientific integrity of measurements as well as quality assurance (QA) and quality control (QC) procedures.

Resource limitations in CS projects often require adopting suboptimal ICT tools which come with hidden costs from poor usability and lack of appropriate functionality (Wiggins Citation2013). In order to adapt the tools to meet the volunteers’ various motivations and needs, they should be useful, easy to use and versatile (Grainger Citation2017; Havlik and Schimak Citation2014; Kobernus et al. Citation2013; Kotovirta et al. Citation2015; Mazumdar et al. Citation2017; Prakash et al. Citation2004; Wallace, Snedigar, and Cameron Citation2015). The first step to achieve this is to understand and adapt to the citizens’ needs, as it is not enough to provide them with arbitrary tools (Lawton et al. Citation2011; Liu and Kobernus Citation2017). Secondly, it is important to provide data in an easy-to-comprehend format. As Grainger (Citation2017) points out, data become information only after someone understands the method of data generation, knows how to interpret the data and knows how to use the data meaningfully. If data are presented only as lists of figures or spreadsheets that only experts can interpret, the project will lose part of its audience (Kobernus et al. Citation2013). Golumbic, Fishbain, and Baram-Tsabari (Citation2019) suggest including Human-Computer Interaction (HCI) and User-Centered Design (UCD) in the design process, since there is no universal user and hence no universal interface. In this way, the design process becomes more inclusive and will increase the overall attractiveness of the interface. Nevertheless, in practical terms, many CS projects simply lack the resources to involve users in the design process, or it might not be their aim or priority, in which case a good tradeoff would be to use preexisting tools which are already deemed fit for purpose.
If the project explicitly aims at developing a new tool, the volunteers should be involved in the design process, whereas if the aim is to collect scientific data, it is more cost-efficient to use existing CS tools proven to be user-friendly.

Druschke and Seltzer (Citation2012) emphasize that it is critical to provide access to the data the volunteers collect, even preliminary data, as soon as possible, in order to allow them to provide feedback and inquire about issues that might interest them, while at the same time giving them a feeling of connectedness to the study and demonstrating the value of their involvement. Providing rapid feedback might even involve utilizing suboptimal tools, and even then this expectation can sometimes be difficult to meet, depending on analytical demands combined with time constraints.

See, Fritz et al. (Citation2016) emphasize that user needs can change and that there is a need to understand the volunteers’ skills, expectations and interests in order to adjust the developed tools accordingly, which might mean that data analysis, modeling and presentation of results must be adjusted for distinct user groups (Gharesifard, Wehn, and van der Zaag Citation2017; Grainger Citation2017; Kobernus et al. Citation2013). Golumbic, Fishbain, and Baram-Tsabari (Citation2019) suggest a multilayer information display to address different user needs. For example, scientists might be interested in the raw data, while the general public might prefer off-the-shelf data visualization products (Prakash et al. Citation2004). An example of adaptation to different user needs in data visualization platforms is given in Table 4, based on the findings of Prakash et al. (Citation2004). However, many features depend on the level of desired participation, e.g. in virtual projects where participants analyze data vs. investigative projects where volunteers submit data, as well as on the project goals (e.g. conservation vs. investigation) and physical location (e.g. local, global or virtual) (Wiggins and Crowston Citation2011).

Table 4. Adapting to user needs after Prakash et al. (Citation2004).
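In implementation terms, a multilayer information display amounts to routing each user group to a presentation layer suited to its needs. The minimal Python sketch below is illustrative only; the role names and layer labels are our own assumptions, not features of any reviewed platform:

```python
# Hypothetical user groups mapped to presentation "layers"; a real platform
# would attach renderers or export endpoints to each layer
LAYERS = {
    "scientist": "raw_csv",       # full data set for reanalysis
    "decision_maker": "summary",  # aggregated indicators and trends
    "public": "live_map",         # easy-to-read near-real-time visualization
}

def layer_for(role):
    """Pick a presentation layer, defaulting to the most accessible one."""
    return LAYERS.get(role, "live_map")
```

Defaulting unknown roles to the most accessible layer reflects the point above that there is no universal user: a layperson should never be confronted with raw spreadsheets by accident.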

Citizen science campaigns should allow volunteers to match their interests with those environmental issues that directly affect their environment and provide relevance and meaning to their daily lives (Grossberndt and Liu Citation2016; Kotovirta et al. Citation2015). They can do this by involving communities, such as schools and other local populations directly affected by the issue (Ferster and Coops Citation2013; Grossberndt and Liu Citation2016). While the CS project will provide opportunities for informal training of the volunteers, the scientists will also gain, as these locals hold valuable knowledge of the study area and related issues and can be actively involved in solving them (Ferster and Coops Citation2013; Guillaume et al. Citation2016; Hunt et al. Citation2015; Mominó, Piera, and Jurado Citation2016; Montargil and Santos Citation2017b; Williamson Citation2009). Mobile applications should be used both for collecting and for disseminating information. The data should be visible to the end user, since this can affect their willingness to participate and is a critical success factor (Montargil and Santos Citation2017a). The volunteers should receive instant qualitative feedback, for example by displaying data on a live map (See, Fritz et al. Citation2016).

The desire by the volunteers to contribute to science and society means CS practitioners should frequently communicate the importance of their contribution and how their data is being utilized (Kotovirta et al. Citation2015; Mackay et al. Citation2015; See, Fritz et al. Citation2016; Wright et al. Citation2015). According to Grossberndt and Liu (Citation2016), to foster meaningful participation it is essential to reassure the volunteers that the results are being used to solve real issues. Similarly, Aspuru, Herranz-Pascual, and Santander (Citation2016) noted the importance of reassuring participants that their observations can affect the decision-making process. It is also important to acknowledge their input, e.g. by rewarding them with certificates of recognition (Dickinson et al. Citation2012; See, Fritz et al. Citation2016).

Providing access to individual contributions is rewarding for volunteers (Sullivan et al. Citation2009) and will boost their curiosity and personal developmental goals when they are given the opportunity to analyze their data (de Assis et al. Citation2018; Gharesifard, Wehn, and van der Zaag Citation2017; Hunt et al. Citation2015; Marantos et al. Citation2017). Sharing the data is also an integral part of establishing trust, fairness and value for those who contribute, while it increases comprehension of environmental issues by the general public (Ferster and Coops Citation2013; Mietlicki, Gaudibert, and Vincent Citation2012; Miorandi et al. Citation2013).

Rewarding volunteers is an effective way to encourage and support participation (See, Fritz et al. Citation2016). On the one hand, Clary and Snyder (Citation1999) acknowledge the efficiency of external rewards; on the other, they caution that once the external pressures to perform observations are removed, the motivational force to participate might shift, and thus advocate for using methods which foster intrinsic motivation. In addition to providing access to individual data, which were classified separately, providing different levels of progression or reputation ranking and introducing elements of competition between volunteers, including contests, incentives and badging systems, might be rewarding to some of the volunteers (Dickinson et al. Citation2012; Kotovirta et al. Citation2015; See, Fritz et al. Citation2016). Increasingly, CS projects are incorporating elements of gamification, for example by asking users to conduct micro-tasks, e.g. classifying images, which is a type of volunteer thinking (See, Fritz et al. Citation2016). Gamification can help to solicit and maintain contributions from wider, already existing communities, e.g. online communities (Simperl et al. Citation2018). It transforms the crowdsourcing procedure and makes it more engaging and fun (Gutiérrez et al. Citation2016). Gamification can also incorporate educational functions which enhance both participation and learning (Mominó, Piera, and Jurado Citation2016). Some volunteers can indeed find playing a game rewarding in itself (Jennett et al. Citation2016).
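As a rough illustration of the progression and badging ideas above, the following minimal Python sketch awards badge tiers and builds a small leaderboard; all tier names and thresholds are hypothetical and not taken from any of the cited projects.

```python
# Hypothetical badge tiers (contribution threshold, badge name),
# ordered from highest to lowest threshold.
BADGE_TIERS = [(500, "gold"), (100, "silver"), (20, "bronze")]


def badge_for(contribution_count):
    """Return the highest badge earned for a given number of contributions."""
    for threshold, name in BADGE_TIERS:
        if contribution_count >= threshold:
            return name
    return None  # not enough contributions for a badge yet


def leaderboard(counts_by_user, top_n=3):
    """Rank volunteers by contribution count to add a light element of competition."""
    ranked = sorted(counts_by_user.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_n]
```

Such a scheme leans on extrinsic rewards, so, following Clary and Snyder's caution above, it would normally be combined with features that nurture intrinsic motivation as well.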

To support a volunteer’s desire to feel part of a community of likeminded people, adding social networking options to the crowdtasking might boost their motivation (Dickinson et al. Citation2012; See, Fritz et al. Citation2016). Creating a sense of community through discussion forums and social networking groups will enable social interactions with people of similar interests, e.g. the ability to connect and share experiences with peer contributors (Kotovirta et al. Citation2015). Enabling volunteers to interact might also lead to scientific discoveries that would otherwise have been overlooked by scientists, the project forum being a key space for such discussion between fellow citizen scientists (Jennett et al. Citation2016). Connecting people to frame and solve issues together will also create a sense of being part of a community (Williamson Citation2009). In some cases, merely viewing data submitted online by other contributors can be enough to make volunteers feel part of a group (See, Fritz et al. Citation2016).

Stakeholders need to achieve sufficient levels of trust, ownership and continuity to achieve the desired outcomes of social learning and engagement (Wehn et al. Citation2018). Volunteers should be offered ways to feel inspired, involved, and active in order to affect educational, attitudinal, behavioral, and scientific outcomes (Druschke and Seltzer Citation2012). Volunteers need to be actively and iteratively involved in the different aspects of the project, e.g. in the design, implementation and dissemination phases to foster the feeling of co-ownership in the project (Breen et al. Citation2015; Gharesifard, Wehn, and van der Zaag Citation2017; Grossberndt and Liu Citation2016; Mazumdar et al. Citation2017). Being involved will also give the volunteers a feeling of control over the scientific process (Grossberndt and Liu Citation2016). Furthermore, the more aspects of the project the volunteers are involved in, the more likely they are to retain interest; this includes especially the possibility to engage in various activities beyond the main task, such as discussion forums in online CS projects (Jennett et al. Citation2016). Yet, Wright et al. (Citation2015) caution that there is a fine line between overextending and supporting participation, and a balance needs to be struck and managed for equitable participation.

With regard to trust, Barcellos et al. (Citation2016), Gharesifard and Wehn (Citation2016) and Golumbic, Fishbain, and Baram-Tsabari (Citation2019) suggest providing information about the origin of the data and metadata in the platforms and using disclaimers. The metadata should include, for example, information about the observations, i.e. under what conditions the data were acquired, the type and accuracy of the equipment used and how often the equipment is calibrated (Gharesifard and Wehn Citation2016). The disclaimer should inform the user about the reliability of the data (Gharesifard and Wehn Citation2016).
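The metadata fields suggested above could be captured in a simple per-observation record. The Python sketch below uses illustrative field names; none are prescribed by the cited works.

```python
from dataclasses import dataclass, asdict
from datetime import date


@dataclass
class ObservationMetadata:
    """Metadata accompanying a volunteer observation (field names illustrative)."""
    instrument_model: str           # type of equipment used
    stated_accuracy: str            # manufacturer-stated accuracy
    last_calibrated: date           # when the equipment was last calibrated
    calibration_interval_days: int  # how often it is calibrated
    conditions: str                 # conditions under which data were acquired
    disclaimer: str = ("Crowdsourced observation: reliability depends on the "
                       "instrument and acquisition conditions recorded above.")
```

Publishing such a record alongside each data point makes the origin and reliability of the data transparent to platform users, in line with the suggestions above.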

As discussed in section 2, user patterns and levels of participation are known to vary. To keep users active, or to revitalize their activity, users can be reminded to contribute, for example with the help of a geofencing approach (Kotovirta et al. Citation2015; Mazumdar et al. Citation2018). Geofencing triggers a notification on the volunteer’s smartphone when they approach a pre-defined location, and the volunteer can then decide to act on or disregard the message.
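A minimal sketch of such a geofence check in Python, assuming a circular fence around the pre-defined location and using the standard haversine great-circle distance; the function names, message text and fence radius are illustrative, not taken from the cited works.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in metres."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def inside_geofence(position, fence_centre, radius_m):
    """True if the volunteer's position falls within the circular fence."""
    return haversine_m(*position, *fence_centre) <= radius_m


def check_and_notify(position, fence_centre, radius_m, already_notified):
    """Return a reminder message on fence entry; None otherwise.

    The already_notified flag avoids repeatedly prompting a volunteer
    who lingers inside the fence.
    """
    if inside_geofence(position, fence_centre, radius_m) and not already_notified:
        return "Nearby monitoring site: would you like to submit an observation?"
    return None
```

In practice, mobile platforms offer built-in geofencing services that handle location updates and battery use, but the decision logic is essentially the above: compare the current position against pre-defined fences and prompt only on entry.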

Volunteers should also receive training and sufficient information before data collection campaigns (Mietlicki, Gaudibert, and Vincent Citation2012; See, Fritz et al. Citation2016). They need to be provided with adequate information on the data collection procedure and to receive help on demand (Aspuru, Herranz-Pascual, and Santander Citation2016; See, Fritz et al. Citation2016). The information can be passed on remotely by providing volunteers with user-guides, instruction sheets, guidelines and protocols on the use of the measuring device and observation recording, or face-to-face guidance during a workshop or training session (Grainger Citation2017; Hunt et al. Citation2015; Liu and Kobernus Citation2017; See, Fritz et al. Citation2016; Wehn et al. Citation2015). Trained volunteers can then use their gained knowledge to train more users (Kotovirta et al. Citation2015). Training of volunteers will create win-win situations; the volunteers will increase their skills, knowledge and experience while the scientists will receive higher quality data (Aspuru, Herranz-Pascual, and Santander Citation2016; Jones et al. Citation2010; Liu and Kobernus Citation2017; Mietlicki, Gaudibert, and Vincent Citation2012; See, Fritz et al. Citation2016).

Investing time and effort to involve citizens does not guarantee success. Reporting on a project that appeared to be a success, at least regarding its scientific outcome, Druschke and Seltzer (Citation2012) warned other CS practitioners that, while superficially it may seem that all aspects of CS engagement have been taken into account, it is not enough to provide user-friendly information and tools and to set up a community communication medium. The volunteers need to be actively involved from the design phase onwards. Otherwise, apparent efforts towards engaging volunteers in scientific data collection do not guarantee dual benefits, e.g. positive learning outcomes for the participants.

5. Sustainability

Projects face many challenges concerning their long-term sustainability. In CS projects, sustainability is guaranteed only when both the volunteers and the infrastructure (e.g. the data platform) are maintained over the long term, and therefore retaining motivated volunteers is not the only aspect of sustainability (Ferster and Coops Citation2013; See, Fritz et al. Citation2016; Tapia et al. Citation2014; Wright et al. Citation2015). Several previously discussed points affect the sustainability of the project, such as creating tools and services which are usable, versatile, appealing, engaging, easy to use and intuitive (Aspuru, Herranz-Pascual, and Santander Citation2016; Barcellos et al. Citation2016; Botteldooren et al. Citation2013; Busch et al. Citation2016; Castell et al. Citation2015; Havlik and Schimak Citation2014; Kotovirta et al. Citation2015; Simpson, Page, and De Roure Citation2014). However, in order to sustain the infrastructure and other elements of the project, actions should be taken to foster a broader and longer-lived influence beyond the involved volunteers. In addition to providing the volunteers with access to the data, providing access to the general public (Ferster and Coops Citation2013; Montargil and Santos Citation2017a) and other interested parties, such as third-party developers (Ferster and Coops Citation2013; Miorandi et al. Citation2013), in the form of open APIs that extend the project’s reach to other platforms and applications, can foster its continuation and long-term success. For this, it is important to use open standards for interoperability to complement and re-use existing datasets (Botteldooren et al. Citation2013; Busch et al. Citation2016; Liu and Kobernus Citation2017). By using standardized parameters and methods, data can be accepted more widely into other data repositories and reach a wider user base (Busch et al. Citation2016), while interoperability becomes imperative when fusing data from different platforms and developing “universally” operated applications.
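As one concrete example of an open, interoperable format, volunteer observations could be exported as GeoJSON (RFC 7946), a widely supported standard for geographic data that third-party platforms and applications can consume directly. The Python sketch below assumes simple point observations with illustrative field names.

```python
import json


def observations_to_geojson(observations):
    """Package point observations as a GeoJSON FeatureCollection (RFC 7946).

    Each observation is a dict with 'lat' and 'lon' keys; all remaining
    keys (species, timestamp, etc.) become feature properties.
    """
    features = []
    for obs in observations:
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON mandates longitude first, then latitude.
                "coordinates": [obs["lon"], obs["lat"]],
            },
            "properties": {k: v for k, v in obs.items() if k not in ("lon", "lat")},
        })
    return {"type": "FeatureCollection", "features": features}


# Serving json.dumps(observations_to_geojson(...)) from an open API endpoint
# lets web maps and other repositories re-use the data without custom parsers.
```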

Creating a business model, e.g. selling a software license or finding organizations interested in the data or its user groups (Havlik and Schimak Citation2014), while making sure the data is preserved, curated and documented in a data management plan (Mietlicki, Gaudibert, and Vincent Citation2012; See, Fritz et al. Citation2016), can sustain the project features beyond its lifetime. Commercially driven projects and cross-financing can help to sustain a project financially (Gharesifard, Wehn, and van der Zaag Citation2017; Vincent et al. Citation2011). Close collaboration with well-known existing organizations and integrating the interactive features of CS using their infrastructure, such as enabling citizens to contribute observations through their webpage, can further promote the longevity of the project (Cooper et al. Citation2017; Grainger Citation2017; Sanz et al. Citation2014; Wehn and Evers Citation2015). This is both an efficient way to reach out to the target audience and at the same time, assure a communication network that can be used for information and support.

Reaching wider audiences via mass participation and scaling up the project geographically will increase both the number of observations and their geographical coverage (Gutiérrez et al. Citation2016). Other means to achieve broader uptake are, for example, the promotion of tools and services (Liu and Kobernus Citation2017), as well as ensuring they provide relevance and meaning to various stakeholders, such as enabling them to influence governance (Montargil and Santos Citation2017b; Vincent et al. Citation2011). Finally, Grossberndt and Liu (Citation2016) call for thorough documentation of both successes and failures as a means for other projects to learn from, while Sanz et al. (Citation2014) emphasize the need for evaluation of the projects.

6. Conclusions

This paper provides a typology of CS projects and introduces a joint discussion of the motivation and expectations of volunteers, together with functional features of CS tools to retain them, including approaches which help to sustain a CS project over the long term. The broad application areas and types of CS projects make it difficult to describe a one-size-fits-all solution, which is why the knowledge and suggestions summarized here should be used as a map and reference point to inspire and guide CS practitioners in adopting specific aspects tailored to their projects. This is further supported, as emphasized previously by Freitag and Pfeffer (Citation2013), by the fact that different CS projects perceive success differently, which also depends on the particularities of the local setting. As the field of inclusive CS continues to grow, the joint discussion presented here on volunteer motivation, retention strategies and design aspects is of high value for the CS community, a value that cross-cuts scales from the local to the global. Both scholarly and practice-based communities can adopt the good practices presented here and, in conjunction with continuous technological innovation, make a difference in incorporating CS practices into local use cases, while, given the increasing role of local views in addressing global issues, enhancing the international relevance of CS, e.g. in support of the Sustainable Development Goals (SDGs) (Fraisl et al. Citation2020; Fritz et al. Citation2019).

In summary, this paper makes the following recommendations:

  1. Clarity of project aims as well as managing expectations early on in the process are vital elements of success.

  2. Recruitment of volunteers and stakeholders should be based on relevance criteria and follow a thorough collection and comprehension of their motivations and needs.

  3. Volunteers should be actively involved in all stages of the project and its products.

  4. Functional features should match the volunteers’ expectations; to this end, knowledge gained during their implementation should be fed back into the design.

  5. The sustainability of projects is better served when relevant factors are addressed early in the design phase and potential exploitation scenarios are foreseen.

Importantly, citizens should have a central role in CS projects. Due to the transdisciplinary nature of CS projects, the focus should not only be on scientific outcomes, but also on the volunteers themselves and the participatory process needed for their efficient involvement. Examining and taking into account volunteers’ motivation and needs during the project design stage will result in longer-term retention of participants, as long as their motives are fulfilled and a win-win situation is accomplished. Practitioners can achieve this by providing features within the tools used in CS projects which enable volunteers’ expectations and motivation to be matched to their benefit. A dynamic balance between the co-benefit of volunteers and the scientific objectives should be established to achieve the best fit for purpose. Overall, for the optimization of CS applications, volunteers and infrastructure should be regarded as one integral ecosystem, whose good health significantly drives long-term sustainability and success.

Acknowledgements

The authors would like to thank all SMURBS project partners who contributed to reviewing the CS and CO literature as part of the preparation of SMURBS/ERA-PLANET project deliverable 3.5 “Citizen Observatories (COs) implementation”, work which inspired the synthesis of this manuscript. We also thank the anonymous reviewers for providing feedback that greatly improved this manuscript.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was funded by the Slovenian Research Agency/Javna Agencija za Raziskovalno Dejavnost Republike Slovenije (ARRS) programme P1-0143 “Cycling of substances in the environment, mass balances, modelling of environmental processes and risk assessment” and the ARRS Young Researchers programme. This work was partly funded by ERA-PLANET (www.era-planet.eu), trans-national project SMURBS (www.smurbs.eu) (Grant Agreement No. 689443) under the EU Horizon 2020 Framework Programme.

Notes

1 Smurbs.eu

References

  • Aspuru, Itziar, Igone García, Karmele Herranz, and Alvaro Santander. 2016. “CITI-SENSE: Methods and Tools for Empowering Citizens to Observe Acoustic Comfort in Outdoor Public Spaces.” Noise Mapping 3 (1): 37–48. doi:10.1515/noise-2016-0003.
  • Aspuru, Itziar, Karmele Herranz-Pascual, and Alvaro Santander. 2016. “Empowering People on the Assessment of the Acoustic Comfort of Urban Places: CITI-SENSE Project.” In Proceedings of the INTER-NOISE 2016 - 45th International Congress and Exposition on Noise Control Engineering: Towards a Quieter Future, 4534–4543. Hamburg, Germany: Institute of Noise Control Engineering. https://www.ingentaconnect.com/contentone/ince/incecp/2016/00000253/00000004/art00075.
  • Barcellos, Christovam, Emmanuel Roux, Pietro Ceccato, Pierre Gosselin, Antonio Miguel Monteiro, Vanderlei Pascoal de Matos, and Diego Ricardo Xavier. 2016. “An Observatory to Gather and Disseminate Information on the Health-Related Effects of Environmental and Climate Change.” Revista Panamericana de Salud Publica = Pan American Journal of Public Health 40 (3): 167–173.
  • Bonney, Rick. 1996. “Citizen Science: A Lab Tradition.” Living Bird: For the Study and Conservation of Birds 15 (4): 7–15.
  • Bonney, Rick, Heidi Ballard, Rebecca Jordan, Ellen McCallie, Tina Phillips, Jennifer Shirk, and Candie C. Widelman. 2009. Public Participation in Scientific Research: Defining the Field and Assessing Its Potential for Informal Science Education. A CAISE Inquiry Group Report. Washington, DC, USA: Center for Advancement of Informal Science Education (CAISE). http://www.birds.cornell.edu/citscitoolkit/publications/CAISE-PPSR-report-2009.pdf/view.
  • Bonney, Rick, Caren B. Cooper, Janis Dickinson, Steve Kelling, Tina Phillips, Kenneth V. Rosenberg, and Jennifer Shirk. 2009. “Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy.” BioScience 59 (11): 977–984. doi:10.1525/bio.2009.59.11.9.
  • Botteldooren, Dick, Timothy Van Renterghem, Damiano Oldoni, Dauwe Samuel, Luc Dekoninck, Pieter Thomas, Weigang Wei, et al. 2013. “The Internet of Sound Observatories.” Proceedings of Meetings on Acoustics 19 (1): 040140–040140. doi:10.1121/1.4799869.
  • Breen, Jessica, Shanon Dosemagen, Jeffrey Warren, and Mathew Lippincott. 2015. “Mapping Grassroots: Geodata and the Structure of Community-Led Open Environmental Science.” ACME 14: 849–873.
  • Bruyere, Brett, and Silas Rappe. 2007. “Identifying the Motivations of Environmental Volunteers.” Journal of Environmental Planning and Management 50 (4): 503–516. doi:10.1080/09640560701402034.
  • Burke, J.A., D. Estrin, M. Hansen, A. Parker, N. Ramanathan, S. Reddy, and M. B. Srivastava. 2006. Participatory Sensing. Boulder, Colorado, USA: AMC SensSys ’06.
  • Busch, Julia, Raul Bardaji, Luigi Ceccaroni, Anna Friedrichs, Jaume Piera, Carine Simon, Peter Thijsse., et al. 2016. “Citizen Bio-Optical Observations from Coast- and Ocean and Their Compatibility with Ocean Colour Satellite Measurements.” Remote Sensing 8 (11): 879. doi:10.3390/rs8110879.
  • Castell, Nuria, Mike Kobernus, Hai-Ying Liu, Philipp Schneider, William Lahoz, Arne J. Berre, and Josef Noll. 2015. “Mobile Technologies and Services for Environmental Monitoring: The Citi-Sense-MOB Approach.” Urban Climate 14 (3): 370–382. doi:10.1016/j.uclim.2014.08.002.
  • Charvat, Karel, Bente Lilja Bye, Tomas Mildorf, Arne J. Berre, and Karel Jedlicka. 2018. “Open Data, VGI and Citizen Observatories INSPIRE Hackathon.” International Journal of Spatial Data Infrastructures Research 13 (0): 109–130. doi:10.2902/ijsdir.v13i0.481.
  • Clary, E. Gil, and Mark Snyder. 1999. “The Motivations to Volunteer: Theoretical and Practical Considerations.” Current Directions in Psychological Science 8 (5): 156–159. doi:10.1111/1467-8721.00037.
  • Cohn, Jeffrey P. 2008. “Citizen Science: Can Volunteers Do Real Research?” BioScience 58 (3): 192–197. doi:10.1641/B580303.
  • Cooper, Caren, Lincoln Larson, Kathleen Krafte Holland, Rebecca Gibson, David Farnham, Diana Hsueh, Patricia Culligan, and Wade McGillis. 2017. “Contrasting the Views and Actions of Data Collectors and Data Consumers in a Volunteer Water Quality Monitoring Project: Implications for Project Design and Management.” Citizen Science: Theory and Practice 2 (1): 8. doi:10.5334/cstp.82.
  • Couvet, Denis, and Anne-Caroline Prevot. 2015. “Citizen-Science Programs: Towards Transformative Biodiversity Governance.” Environmental Development 13 (January): 39–45. doi:10.1016/j.envdev.2014.11.003.
  • Craig, William J., Trevor M. Harris, and Daniel Weiner. 2002. Community Participation and Geographical Information Systems. London: Taylor & Francis.
  • Crall, Alycia W., Gregory J. Newman, Thomas J. Stohlgren, Kirstin A. Holfelder, Jim Graham, and Donald M. Waller. 2011. “Assessing Citizen Science Data Quality: An Invasive Species Case Study.” Conservation Letters 4 (6): 433–442. doi:10.1111/j.1755-263X.2011.00196.x.
  • de Assis, Luiz F.G., E.A. Flávio, Edison de Freitas Horita, Jó Ueyama, João de Albuquerque, Luiz Fernando F. G. de Assis, Flávio E. A. Horita, Edison P. de Freitas, Jó Ueyama, and João Porto de Albuquerque. 2018. “A Service-Oriented Middleware for Integrated Management of Crowdsourced and Sensor Data Streams in Disaster Management.” Sensors 18 (6): 1689. doi:10.3390/s18061689.
  • Degrossi, Lívia Castro, João Porto de Albuquerque, Maria Clara Fava, and Eduardo Mario Mendiondo. 2014. “Flood Citizen Observatory: A Crowdsourcing-Based Approach for Flood Risk Management in Brazil.” In Proceedings of the International Conference on Software Engineering and Knowledge Engineering,570–576. Vancouver, Canada: SEKE. https://ksiresearchorg.ipage.com/seke/Proceedings/seke/SEKE2014_Proceedings.pdf.
  • Dell'Acqua, Fabio, and Daniele De Vecchi. 2017. “Potentials of Active and Passive Geospatial Crowdsourcing in Complementing Sentinel Data and Supporting Copernicus Service Portfolio.” Proceedings of the IEEE 105 (10): 1913–1925. doi:10.1109/JPROC.2017.2727284.
  • Devictor, Vincent, Robert J. Whittaker, and Coralie Beltrame. 2010. “Beyond Scarcity: Citizen Science Programmes as Useful Tools for Conservation Biogeography.” Diversity and Distributions 16 (3): 354–362. doi:10.1111/j.1472-4642.2009.00615.x.
  • Dickinson, Janis L., Jennifer Shirk, David Bonter, Rick Bonney, Rhiannon L. Crain, Jason Martin, Tina Phillips, and Karen Purcell. 2012. “The Current State of Citizen Science as a Tool for Ecological Research and Public Engagement.” Frontiers in Ecology and the Environment 10 (6): 291–297. doi:10.1890/110236.
  • Druschke, Caroline Gottschalk, and Carrie E. Seltzer. 2012. “Failures of Engagement: Lessons Learned from a Citizen Science Pilot Study.” Applied Environmental Education & Communication 11 (3-4): 178–188. Routledge: doi:10.1080/1533015X.2012.777224.
  • Ferster, Colin J., and Nicholas C. Coops. 2013. “A Review of Earth Observation Using Mobile Personal Communication Devices.” Computers & Geosciences 51 (February): 339–349. doi:10.1016/j.cageo.2012.09.009.
  • Fraisl, Dilek, Jillian Campbell, Linda See, Uta Wehn, Jessica Wardlaw, Margaret Gold, Inian Moorthy., et al. 2020. “Mapping Citizen Science Contributions to the UN Sustainable Development Goals.” Sustainability Science 15 (6): 1735–1751. July. doi:10.1007/s11625-020-00833-7.
  • Freitag, Amy, and Max J. Pfeffer. 2013. “Process, Not Product: Investigating Recommendations for Improving Citizen Science ‘Success’.” PLoS ONE 8 (5): e64079 doi:10.1371/journal.pone.0064079.
  • Fritz, Steffen, Linda See, Tyler Carlson, Mordechai (Muki) Haklay, Jessie L. Oliver, Dilek Fraisl, Rosy Mondardini., et al. 2019. “Citizen Science and the United Nations Sustainable Development Goals.” Nature Sustainability 2 (10): 922–930. doi:10.1038/s41893-019-0390-3.
  • Ganti, R. K., F. Ye, and H. Lei. 2011. “Mobile Crowdsensing: Current State and Future Challenges.” IEEE Communications Magazine 49 (11): 32–39. doi:10.1109/MCOM.2011.6069707.
  • Gharesifard, Mohammad, and Uta Wehn. 2016. “To Share or Not to Share: Drivers and Barriers for Sharing Data via Online Amateur Weather Networks.” Journal of Hydrology 535 (April): 181–190. doi:10.1016/j.jhydrol.2016.01.036.
  • Gharesifard, Mohammad, Uta Wehn, and Pieter van der Zaag. 2017. “Towards Benchmarking Citizen Observatories: Features and Functioning of Online Amateur Weather Networks.” Journal of Environmental Management 193 (May): 381–393. doi:10.1016/j.jenvman.2017.02.003.
  • Golumbic, Yaela N., Barak Fishbain, and Ayelet Baram-Tsabari. 2019. “User Centered Design of a Citizen Science Air-Quality Monitoring Project.” International Journal of Science Education, Part B 9 (3): 195–119. doi:10.1080/21548455.2019.1597314.
  • Gómez-Barrón, José-Pablo, Miguel-Ángel Manso-Callejo, Ramón Alcarria, and Teresa Iturrioz. 2016. “Volunteered Geographic Information System Design: Project and Participation Guidelines.” ISPRS International Journal of Geo-Information 5 (7): 108. doi:10.3390/ijgi5070108.
  • Goodchild, Michael F. 2007a. “Citizens as Voluntary Sensors: Spatial Data Infrastructure in the World of Web 2.0.” International Journal of Spatial Data Infrastructures Research 2 (2): 24–32.
  • Goodchild, Michael F. 2007b. “Citizens as Sensors: The World of Volunteered Geography.” GeoJournal 69 (4): 211–221. doi:10.1007/s10708-007-9111-y.
  • Grainger, Alan. 2017. “Citizen Observatories and the New Earth Observation Science.” Remote Sensing 9 (2): 153. doi:10.3390/rs9020153.
  • Grossberndt, Sonja, and Hai-Ying Liu. 2016. “Citizen Participation Approaches in Environmental Health.” In Environmental Determinants of Human Health, edited by Jozef M. Pacyna and Elisabeth G. Pacyna, 225–248. Molecular and Integrative Toxicology. Cham: Springer International Publishing. doi:10.1007/978-3-319-43142-0_11.
  • Guillaume, Gwenaël, Arnaud Can, Gwendall Petit, Nicolas Fortin, Sylvain Palominos, Benoit Gauvreau, Erwan Bocher, and Judicaël Picaut. 2016. “Noise Mapping Based on Participative Measurements.” Noise Mapping 3 (1): 140–156. doi:10.1515/noise-2016-0011.
  • Gutiérrez, Verónica, Evangelos Theodoridis, Georgios Mylonas, Fengrui Shi, Usman Adeel, Luis Diez, Dimitrios Amaxilatis, et al. 2016. “Co-Creating the Cities of the Future.” Sensors 16 (11): 1971. doi:10.3390/s16111971.
  • Haklay, Muki. 2015. Citizen Science and Policy: A European Perspective. Washington, DC, USA: Woodrow Wilson International Center for Scholars. https://www.wilsoncenter.org/publication/citizen-science-and-policy-european-perspective.
  • Haklay, Muki. 2016. “Why is Participation Inequality Important?” In European Handbook of Crowdsourced Geographic Information, edited by Cristina Capineri, Muki Haklay, Haosheng Huang, Vyron Antoniou, Juhani Kettunen, Frank Ostermann, and Ross Purves, 35–44. London, UK: Ubiquity Press. https://www.ubiquitypress.com/site/chapters/10.5334/bax.c/.
  • Hand, Eric. 2010. “Citizen Science: People Power.” Nature 466 (7307): 685–687. doi:10.1038/466685a.
  • Havlik, Denis, and Gerald Schimak. 2014. “State and Trends in Mobile Observation Applications.” In Proceedings of the 7th International Congress on Environmental Modelling and Software, edited by D.P. Ames, N.W.T. Quinn, and A.E. Rizzoli, 294–303. San Diego, CA, USA: International Environmental Modelling and Software Society (iEMSs). https://scholarsarchive.byu.edu/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=1042&context=iemssconference.
  • Houghton, Robert, James Sprinks, Jessica Wardlaw, Steven Bamford, and Stuart Marsh. 2019. “A Sociotechnical System Approach to Virtual Citizen Science: An Application of BS ISO 27500:2016.” Journal of Science Communication 18 (01): A01. doi:10.22323/2.18010201.
  • Howe, Jeff. 2006. “The Rise of Crowdsourcing.” Wired. https://www.wired.com/2006/06/crowds/.
  • Howe, Jeff. 2008. Crowdsourcing: How the Power of the Crowd is Driving the Future of Business. London: Random House Business.
  • Hunt, Nuala, Michael O’Grady, Conor Muldoon, Barnard Kroon, Jie Wan, and Gregory O’Hare. 2015. “Citizen Science: A Learning Paradigm for the Smart City?” Interaction Design and Architecture(s) 27: 28–43. http://www.mifav.uniroma2.it/inevent/events/idea2010/doc/27_3.pdf.
  • “IGARSS.” 2010. Community Remote Sensing. https://www.igarss2010.org/CommunityRemoteSensing.html.
  • Jennett, Charlene, and Anna L. Cox. 2014. “Eight Guidelines for Designing Virtual Citizen Science Projects.” In Second AAAI Conference on Human Computation and Crowdsourcing, 16–17. Pittsburgh, PA: AAAI. https://www.aaai.org/ocs/index.php/HCOMP/HCOMP14/paper/view/9261.
  • Jennett, Charlene, Laure Kloetzer, Daniel Schneider, Ioanna Iacovides, Anna Cox, Margaret Gold, Brian Fuchs., et al. 2016. “Motivations, Learning and Creativity in Online Citizen Science.” Journal of Science Communication 15 (3): 1–23. doi:10.22323/2.15030205.
  • Jirka, Dr Simon, Dr Albert Remke, and Arne Bröring. 2013. “EnviroCar: Crowd Sourced Traffic and Environment Data for Sustainable Mobility.” In Proceedings of the Workshop “Environmental Information Systems and Services - Infrastructures and Platforms 2013 - with Citizens Observatories, Linked Open Data and SEIS/SDI Best Practices,” 1322:7. Neusiedl am See, Austria: CEUR-Workshop Proceedings. http://ceur-ws.org/Vol-1322/.
  • Jones, K.B., H. Bogena, H. Vereecken, and J. F. Weltzin. 2010 “Design and Importance of Multi-Tiered Ecological Monitoring Networks.” In Long-Term Ecological Research: Between Theory and Application, edited by Felix Müller, Cornelia Baessler, Hendrik Schubert, and Stefan Klotz, 355–374. Dordrecht: Springer. www.springer.com/gp/book/9789048187812.
  • Kaufmann, Nicolas, Thimo Schulze, and Daniel Veit. 2011. “More than Fun and Money. Worker Motivation in Crowdsourcing: A Study on Mechanical Turk.” In A Renaissance of Information Technology for Sustainability and Global Competitivenes, 11. Detroit, Michigan, USA: AMCIS. https://aisel.aisnet.org/amcis2011_submissions/
  • Kobernus, Mike, Arne J. Berre, M. Gonzalez, Hai-Ying Liu, Mirjam Fredriksen, Richard Rombouts, and Alena Bartonova. 2013. “A Practical Approach to an Integrated Citizens’ Observatory: The CITI-SENSE Framework.” In Proceedings of the Workshop "Environmental Information Systems and Services - Infrastructures and Platforms 2013, 1322:15. Neusiedl am See, Austria: CEUR-Workshop Proceedings. http://ceur-ws.org/Vol-1322/paper_1.pdf.
  • Kotovirta, V., T. Toivanen, R. Tergujeff, T. Häme, and M. Molinier. 2015. “Citizen Science for Earth Observation: Applications in Environmental Monitoring and Disaster Response.” In Proceedings of the 36th International Symposium on Remote Sensing of Environment, ISRSE-36, XL-7-W3:1221–1226. Berlin, Germany: International Society for Photogrammetry and Remote Sensing isprs. doi:10.5194/isprsarchives-XL-7-W3-1221-2015.
  • Lai, P. C. 2017. “The Literature Review of Technology Adoption Models and Theories for the Novelty Technology.” Journal of Information Systems and Technology Management 14 (1): 21–38. doi:10.4301/S1807-17752017000100002.
  • Lane, Nicholas, Emiliano Miluzzo, Hong Lu, Daniel Peebles, Tanzeem Choudhury, and Andrew Campbell. 2010. “A Survey of Mobile Phone Sensing.” IEEE Communications Magazine 48 (9): 140–150. doi:10.1109/MCOM.2010.5560598.
  • Lanfranchi, Vitaveska, Neil Ireson, Uta Wehn, Stuart N. Wrigley, and Fabio Ciravegna. 2014. “Citizens’ Observatories for Situation Awareness in Flooding.” In Proceedings of the 11th International Conference on Information Systems for Crisis Response and Management, edited by S.R. Hiltz, M.S. Pfaff, L. Plotnick, and P.C. Shih, 145–154. Pennsylvania: The Pennsylvania State University. http://www.iscram.org/legacy/ISCRAM2014/ISCRAM2014_proceedings.pdf.
  • Lawton, Brandon, Bonnie Eisenhamer, Barbara J. Mattson, and Jordan M. Raddick. 2011. “Bringing the Virtual Astronomical Observatory to the Education Community.” In Proceedings of a National Conference on Science Education and Public Outreach: Connecting People to Science, edited by Joseph B. Jensen, James G. Manning, Michael G. Gibbs, and Doris Daou, 457:283–287. Baltimore, MD: Astronomical Society of the Pacific. http://www.aspbooks.org/a/volumes/article_details/?paper_id=34202.
  • Liu, Hai-Ying, and Mike Kobernus. 2017. “Citizen Science and Its Role in Sustainable Development: Status, Trends, Issues, and Opportunities.” In Analyzing the Role of Citizen Science in Modern Research, edited by Luigi Ceccaroni and Jaume Piera, 147–167. Hershey, PA: IGI Global. https://www.igi-global.com/chapter/citizen-science-and-its-role-in-sustainable-development/170188.
  • Liu, Hai-Ying, Mike Kobernus, David Broday, and Alena Bartonova. 2014. “A Conceptual Approach to a Citizens' Observatory-Supporting Community-Based Environmental Governance.” Environmental Health: A Global Access Science Source 13 (1): 107. doi:10.1186/1476-069X-13-107.
  • Loukis, Euripidis, and Yannis Charalabidis. 2015. “Active and Passive Crowdsourcing in Government.” In Policy Practice and Digital Science, edited by Marijn Janssen, Maria A. Wimmer, and Ameneh Deljoo, 261–289. Cham, Switzerland: Springer International Publishing. doi:10.1007/978-3-319-12784-2_12.
  • Mackay, E. B., M. E. Wilkinson, C. J. A. Macleod, K. Beven, B. J. Percy, M. G. Macklin, P. F. Quinn, M. Stutter, and P. M. Haygarth. 2015. “Digital Catchment Observatories: A Platform for Engagement and Knowledge Exchange between Catchment Scientists, Policy Makers, and Local Communities.” Water Resources Research 51 (6): 4815–4822. doi:10.1002/2014WR016824.
  • Mahmood, Mo A., Janice M. Burn, Leopoldo A. Gemoets, and Carmen Jacquez. 2000. “Variables Affecting Information Technology End-User Satisfaction: A Meta-Analysis of the Empirical Literature.” International Journal of Human-Computer Studies 52 (4): 751–771. doi:10.1006/ijhc.1999.0353.
  • Marantos, C., I. S. Paraskevas, K. Siozios, J. Mothe, C. Menou, and D. Soudris. 2017. “FabSpace 2.0: A Platform for Application and Service Development Based on Earth Observation Data.” In 2017 6th International Conference on Modern Circuits and Systems Technologies (MOCAST), 1–4. doi:10.1109/MOCAST.2017.7937657.
  • Mazumdar, Suvodeep, Stuart Wrigley, and Fabio Ciravegna. 2017. “Citizen Science and Crowdsourcing for Earth Observations: An Analysis of Stakeholder Opinions on the Present and Future.” Remote Sensing 9 (1): 87. doi:10.3390/rs9010087.
  • Mazumdar, Suvodeep, Stuart N. Wrigley, Neil Ireson, and Fabio Ciravegna. 2018. “Harnessing Location-Based Services for Effective Citizen Observatories.” International Journal of Spatial Data Infrastructures Research 13 (0): 101–108. doi:10.2902/ijsdir.v13i0.479.
  • Mietlicki, Fanny, Piotr Gaudibert, and Bruno Vincent. 2012. “HARMONICA Project (HARMOnised Noise Information for Citizens and Authorities).” In Proceedings of the 41st International Congress and Exposition on Noise Control Engineering 2012 (INTER-NOISE 2012), edited by Courtney Burroughs and Steve Conlon, 6160–6172. New York City, New York, USA: Curran. http://toc.proceedings.com/18441webtoc.pdf.
  • Miorandi, D., I. Carreras, E. Gregori, I. Graham, and J. Stewart. 2013. “Measuring Net Neutrality in Mobile Internet: Towards a Crowdsensing-Based Citizen Observatory.” In 2013 IEEE International Conference on Communications Workshops (ICC), 199–203. doi:10.1109/ICCW.2013.6649228.
  • Mominó, Josep M., Jaume Piera, and Elena Jurado. 2016. “Citizen Observatories as Advanced Learning Environments.” In Analyzing the Role of Citizen Science in Modern Research, edited by Luigi Ceccaroni and Jaume Piera, 192–212. Hershey, PA: IGI Global. https://www.igi-global.com/chapter/citizen-observatories-as-advanced-learning-environments/170190.
  • Montargil, Filipe, and Vitor Santos. 2017a. “Citizen Observatories: Concept, Opportunities and Communication with Citizens in the First EU Experiences.” In Beyond Bureaucracy: Towards Sustainable Governance Informatisation, edited by Alois A. Paulin, Leonidas G. Anthopoulos, and Christopher G. Reddick, 25:167–184. Public Administration and Information Technology. Cham, Switzerland: Springer. doi:10.1007/978-3-319-54142-6_11.
  • Montargil, Filipe, and Vítor Santos. 2017b. “Communication with Citizens in the First EU Citizen Observatories Experiences.” In The Proceedings of 17th European Conference on Digital Government - ECDG 2017, 96–105. https://repositorio.ipl.pt/handle/10400.21/7830.
  • Newman, Greg, Don Zimmerman, Alycia Crall, Melinda Laituri, Jim Graham, and Linda Stapel. 2010. “User-Friendly Web Mapping: Lessons from a Citizen Science Website.” International Journal of Geographical Information Science 24 (12): 1851–1869. doi:10.1080/13658816.2010.490532.
  • Prakash, A., R. Gens, J. Kelley, V. Alexander, L. Johnson, and G. Yanow. 2004. “Space-Based Observations in the International Polar Year: Educational Opportunities to Strengthen the STEM Pipeline.” In Proceedings of the IGARSS 2004. 2004 IEEE International Geoscience and Remote Sensing Symposium, 7:1972–1975. Anchorage, AK: IEEE. doi:10.1109/IGARSS.2004.1370733.
  • Preece, Jennifer. 2016. “Citizen Science: New Research Challenges for Human–Computer Interaction.” International Journal of Human-Computer Interaction 32 (8): 585–612. doi:10.1080/10447318.2016.1194153.
  • Robinson, Johanna. 2019. “Deliverable D3.5 Citizen Observatories (COs) Implementation.” SMURBS. https://drive.google.com/file/d/1kaCt4t10V6HXAD_U093q7PxPxRv_yXzi/view.
  • Robinson, Johanna, David Kocman, Milena Horvat, and Alena Bartonova. 2018. “End-User Feedback on a Low-Cost Portable Air Quality Sensor System: Are We There yet?” Sensors 18 (11): 3768. doi:10.3390/s18113768.
  • Serrano Sanz, Teresa Holocher-Ertl, Barbara Kieslinger, Francisco Sanz García, and Cândida Silva. 2014. White Paper on Citizen Science for Europe. Socientize Consortium. https://ec.europa.eu/futurium/en/content/white-paper-citizen-science.
  • Sauermann, Henry, and Chiara Franzoni. 2015. “Crowd Science User Contribution Patterns and Their Implications.” Proceedings of the National Academy of Sciences of the United States of America 112 (3): 679–684. doi:10.1073/pnas.1408907112.
  • See, Linda, Steffen Fritz, Eduardo Dias, Elise Hendriks, Bas Mijling, Frans Snik, Piet Stammes, et al. 2016. “Supporting Earth-Observation Calibration and Validation: A New Generation of Tools for Crowdsourcing and Citizen Science.” IEEE Geoscience and Remote Sensing Magazine 4 (3): 38–50. doi:10.1109/MGRS.2015.2498840.
  • See, L., Peter Mooney, Giles Foody, Lucy Bastin, Alexis Comber, Jacinto Estima, Steffen Fritz, et al. 2016. “Crowdsourcing, Citizen Science or Volunteered Geographic Information? The Current State of Crowdsourced Geographic Information.” ISPRS International Journal of Geo-Information 5 (5): 55. doi:10.3390/ijgi5050055.
  • Shirk, Jennifer, Heidi Ballard, Candie Wilderman, Tina Phillips, Andrea Wiggins, Rebecca Jordan, Ellen McCallie, et al. 2012. “Public Participation in Scientific Research: A Framework for Deliberate Design.” Ecology and Society 17 (2): 29–48. doi:10.5751/ES-04705-170229.
  • Sieber, Renee. 2006. “Public Participation Geographic Information Systems: A Literature Review and Framework.” Annals of the Association of American Geographers 96 (3): 491–507. doi:10.1111/j.1467-8306.2006.00702.x.
  • Silvertown, Jonathan. 2009. “A New Dawn for Citizen Science.” Trends in Ecology & Evolution 24 (9): 467–471. doi:10.1016/j.tree.2009.03.017.
  • Simperl, Elena, Neal Reeves, Chris Phethean, Todd Lynes, and Ramine Tinati. 2018. “Is Virtual Citizen Science a Game?” ACM Transactions on Social Computing 1 (2): 1–39. doi:10.1145/3209960.
  • Simpson, Robert, Kevin R. Page, and David De Roure. 2014. “Zooniverse: Observing the World’s Largest Citizen Science Platform.” In Proceedings of the 23rd International Conference on World Wide Web, 1049–1054. WWW ’14 Companion. Seoul, South Korea: ACM. doi:10.1145/2567948.2579215.
  • Skarlatidou, Artemis, Alexandra Hamilton, Michalis Vitos, and Muki Haklay. 2019. “What Do Volunteers Want from Citizen Science Technologies? A Systematic Literature Review and Best Practice Guidelines.” Journal of Science Communication 18 (01): A02. doi:10.22323/2.18010202.
  • Soleri, Daniela, Jonathan Long, Mónica Ramirez-Andreotta, Rose Eitemiller, and Rajul Pandya. 2016. “Finding Pathways to More Equitable and Meaningful Public-Scientist Partnerships.” Citizen Science: Theory and Practice 1 (1): 9. doi:10.5334/cstp.46.
  • Stefanidis, Anthony, Andrew Crooks, and Jacek Radzikowski. 2013. “Harvesting Ambient Geospatial Information from Social Media Feeds.” GeoJournal 78 (2): 319–338. doi:10.1007/s10708-011-9438-2.
  • Sullivan, Brian L., Christopher L. Wood, Marshall J. Iliff, Rick E. Bonney, Daniel Fink, and Steve Kelling. 2009. “EBird: A Citizen-Based Bird Observation Network in the Biological Sciences.” Biological Conservation 142 (10): 2282–2292. doi:10.1016/j.biocon.2009.05.006.
  • Tapia, Andrea H., Nicolas J. LaLone, Elizabeth MacDonald, Reid Priedhorsky, and Michelle Hall. 2014. “Crowdsourcing Rare Events: Using Curiosity to Draw Participants into Science and Early Warning Systems.” In Proceedings of the 11th International Conference on Information Systems for Crisis Response and Management: ISCRAM 2014. ISCRAM. https://www.semanticscholar.org/paper/Crowdsourcing-rare-events%3A-Using-curiosity-to-draw-Tapia-LaLone/9b906befd780460244e9642b9f45ef6e9a86fc55.
  • Thompson, Jonathan E. 2016. “Crowd-Sourced Air Quality Studies: A Review of the Literature and Portable Sensors.” Trends in Environmental Analytical Chemistry 11 (July): 23–34. doi:10.1016/j.teac.2016.06.001.
  • Tserstou, A., A. Jonoski, I. Popescu, T. H. Asumpcao, G. Athanasiou, A. Kallioras, and I. Nichersu. 2017. “SCENT: Citizen Sourced Data in Support of Environmental Monitoring.” In Proceedings of the 21st International Conference on Control Systems and Computer Science (CSCS), 612–616. Bucharest, Romania: IEEE. doi:10.1109/CSCS.2017.93.
  • Uhrner, Ulrich, Giovanna Grosso, Anne-Claude Romain, Virginie Hutsemekers, Julien Delva, Arnaud De Groof, Philippe Valoggia, L. Johannsen, Bernard Stevenot, and Philippe Ledent. 2014. “Development of an Environmental Information System for Odour Using Citizen and Technology Innovative Sensors and Advanced Modelling.” In Proceedings of the Workshop “Environmental Information Systems and Services - Infrastructures and Platforms 2013 - with Citizens Observatories, Linked Open Data and SEIS/SDI Best Practices,” 1322:10. Neusiedl am See, Austria: CEUR-Workshop Proceedings. http://ceur-ws.org/Vol-1322/paper_4.pdf.
  • Vincent, Bruno, Arnaud Cristini, Julie Vallet, Céline Sales, Hélène Poimboeuf, Claire Sorrentini, and Céline Anselme. 2011. “An Urban Noise Observatory: Scientific, Technical, Strategic and Political Challenges: A Systemic Complementary Approach to the New Requirements of the European Directives.” In Proceedings of the 40th International Congress and Exposition on Noise Control Engineering 2011, INTER-NOISE 2011, 5. Osaka, Japan: INTER-NOISE. https://www.acoucite.org/IMG/pdf/ARTICLE-internoise2011_ENG.pdf.
  • Wald, Dara M., Justin Longo, and A. R. Dobell. 2016. “Design Principles for Engaging and Retaining Virtual Citizen Scientists.” Conservation Biology: The Journal of the Society for Conservation Biology 30 (3): 562–570. doi:10.1111/cobi.12627.
  • Wallace, Kristi, Seth Snedigar, and Cheryl Cameron. 2015. “Is Ash Falling?’ An Online Ashfall Reporting Tool in Support of Improved Ashfall Warnings and Investigations of Ashfall Processes.” Journal of Applied Volcanology 4 (1): 8. doi:10.1186/s13617-014-0022-6.
  • Wehn, Uta, Kevin Collins, Kim Anema, Laura Basco-Carrera, and Alix Lerebours. 2018. “Stakeholder Engagement in Water Governance as Social Learning: Lessons from Practice.” Water International 43 (1): 34–59. doi:10.1080/02508060.2018.1403083.
  • Wehn, Uta, and Jaap Evers. 2015. “The Social Innovation Potential of ICT-Enabled Citizen Observatories to Increase EParticipation in Local Flood Risk Management.” Technology in Society 42 (August): 187–198. doi:10.1016/j.techsoc.2015.05.002.
  • Wehn, Uta, Simon McCarthy, Vita Lanfranchi, and Sue Tapsell. 2015. “Citizen Observatories as Facilitators of Change in Water Governance? Experiences from Three European Cases.” Environmental Engineering and Management Journal 14 (9): 2073–2086. http://www.eemj.icpm.tuiasi.ro/issues/vol14/vol14no9.htm. doi:10.30638/eemj.2015.222.
  • Whitelaw, Graham, Hague Vaughan, Brian Craig, and David Atkinson. 2003. “Establishing the Canadian Community Monitoring Network.” Environmental Monitoring and Assessment 88 (1/3): 409–418. doi:10.1023/A:1025545813057.
  • Wiggins, Andrea. 2013. “Free as in Puppies: Compensating for ICT Constraints in Citizen Science.” In Proceedings of the 2013 Conference on Computer Supported Cooperative Work, 1469–1480. CSCW ’13. New York, NY, USA: ACM. doi:10.1145/2441776.2441942.
  • Wiggins, Andrea, and Kevin Crowston. 2011. “From Conservation to Crowdsourcing: A Typology of Citizen Science.” In Proceedings of the 2011 44th Hawaii International Conference on System Sciences, 1–10. HICSS ’11. Washington, DC: IEEE Computer Society. doi:10.1109/HICSS.2011.207.
  • Williamson, Ray A. 2009. “Earth Observations and Human and Environmental Security: Opportunities and Challenges.” In Proceedings of the 60th International Astronautical Congress 2009, 2445–2451. Daejeon, Republic of Korea: International Astronautical Federation. https://iafastro.directory/iac/archive/browse/IAC-09/B1/5/4125/.
  • Wright, Dale R., Les G. Underhill, Matt Keene, and Andrew T. Knight. 2015. “Understanding the Motivations and Satisfactions of Volunteers to Improve the Effectiveness of Citizen Science Programs.” Society & Natural Resources 28 (9): 1013–1029. doi:10.1080/08941920.2015.1054976.
  • Zaman, J., and W. De Meuter. 2015. “DisCoPar: Distributed Components for Participatory Campaigning.” In Proceedings of 2015 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), 160–165. St. Louis, MO: IEEE. doi:10.1109/PERCOMW.2015.7134012.
  • Zaman, J., and W. De Meuter. 2016. “Crowd Sensing Applications: A Distributed Flow-Based Programming Approach.” In Proceedings of the 2016 IEEE International Conference on Mobile Services (MS), 79–86. San Francisco, CA: IEEE. doi:10.1109/MobServ.2016.22.