Research to Practice, Practice to Research

Citizen Science Framing and Delivery Models: Impacts on Young People’s Environmental Science Learning

Connected Science Learning September–October 2022 (Volume 4, Issue 5)

By Jessica Wardlaw, Ana Benavides-Lahnstein, Lucy Robinson, Julia Lorke, Sasha Pratt-Taweh, Maryam Ghadiri Khanaposhtani, Heidi Ballard, and Victoria Burton


Citizen science and other participatory approaches to science provide opportunities for the public to collaborate with professional scientists and make tangible contributions to authentic scientific research. Alongside their research goals, citizen science programs may also contribute to wildlife conservation (e.g., Ballard et al. 2017; Chandler et al. 2017), policy processes (e.g., Göbel et al. 2019), and science learning (e.g., Pitt and Schultz 2018). Guidance on measuring learning outcomes in citizen science exists (Phillips et al. 2018), and some studies have examined learning outcomes for adults (Kloetzer et al. 2021; Peter et al. 2021a and b). However, there is an emerging but still limited understanding of what and how young people learn through citizen science and how project design can influence these outcomes (Druschke and Seltzer 2012; Roche et al. 2020). A recent examination of young people's participation in citizen science identified a drop-off between participation in data collection and participation in data submission to the research project (Lorke et al. 2021); however, this data sharing/submission step has been shown to support young people's development of Environmental Science Agency (ESA) (Ballard, Dixon, and Harris 2017). The ESA framework (Table 1) encompasses a holistic learning process that citizen science may foster, as observed in formal education settings (Bird et al. 2020; Harris and Ballard 2020).

Table 1. The three aspects of the Environmental Science Agency learning framework (Ballard, Dixon, and Harris 2017), and examples of what this might look like in the context of the Big Seaweed Search.

This study investigates the environmental science learning outcomes of young participants (10–15 years old) in the Big Seaweed Search (BSS) citizen science program during a period of program redesign. Using a case study design, we examine the impacts on learning outcomes for young people and identify which design features afford or constrain learning to address the following research questions:

  1. To what extent did the redesign support youth’s understanding of the program’s purpose and their contribution to science/research?
  2. Which design features appear to be most influential in supporting youth’s understanding of the program’s purpose and their contribution to science/research?
  3. What impact did a shift to virtual delivery have on science learning outcomes?

The Big Seaweed Search

The Big Seaweed Search (BSS) is a UK-based citizen science project run in partnership between the Natural History Museum London (the Museum) and the Marine Conservation Society (MCS), a UK-wide charity. It invites members of the public to search a 5 m transect of the shore, photographing, identifying, and recording the presence/absence and abundance of 14 seaweeds that are indicators of environmental change. Participants enter their data using an online form. Seaweed researchers at the Museum verify the data and use them for climate change research. The program also supports participants’ science learning and engagement and the development of marine stewardship activities. There are three typical instructional strategies to support participation: (1) self-directed individual or family, (2) self-directed community group, and (3) MCS-led community group. We examine this third model in this article.
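
For readers who build or adapt data pipelines for similar projects, the sketch below illustrates the kind of record a single BSS survey produces before online submission: an abundance category for each of the 14 indicator seaweeds on a 5 m transect, plus the photographs that Museum researchers later verify. This is a hypothetical Python illustration only; the field names, abundance categories, and validation rules of the actual BSS online form are defined by the Museum and MCS and may differ.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List

# Hypothetical abundance categories for illustration; the real BSS form may use different ones.
ABUNDANCE_CATEGORIES = ("absent", "present", "abundant")


@dataclass
class SeaweedSearchRecord:
    """One 5 m transect survey, as it might be captured before submission to the online form."""

    group_name: str
    survey_date: date
    beach_name: str
    # Abundance category recorded for each of the 14 indicator seaweeds.
    seaweeds: Dict[str, str] = field(default_factory=dict)
    # Filenames of photographs submitted for verification by Museum researchers.
    photos: List[str] = field(default_factory=list)

    def record_seaweed(self, species: str, abundance: str) -> None:
        """Record one species observation, rejecting unknown abundance categories."""
        if abundance not in ABUNDANCE_CATEGORIES:
            raise ValueError(f"Unknown abundance category: {abundance!r}")
        self.seaweeds[species] = abundance


# Example usage with a hypothetical group, beach, and species name.
survey = SeaweedSearchRecord("Youth group A", date(2020, 8, 1), "Example Beach")
survey.record_seaweed("serrated wrack", "present")
survey.photos.append("serrated_wrack_01.jpg")
```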

Two Drivers of Change in Project Design and Delivery

International learning research, led by this study's co-authors, studied BSS participants and informed an evidence-led redesign of BSS to enhance learning outcomes for young participants (Table 1). After the majority of the redesign had been planned, the COVID-19 pandemic began, which required additional changes to comply with social distancing restrictions. These two drivers of change occurred sequentially during the planning process, but the resulting changes were implemented together in summer 2020.

Youth with a Big Seaweed Search ID guide and recording form on the beach in 2019.

Enhancing learning outcomes through design-based research

BSS was one of eight citizen science programs studied as part of an international research project (2017–2022) that investigated the science learning processes and outcomes for young people who participated in museum-led citizen science. Design-based research (DBR) is a collaborative approach used in the learning sciences to iteratively introduce design changes and study their impacts to develop more effective learning environments. Drawing upon Sandoval (2014), Bakker (2018), Penuel et al. (2011), and Fishman et al. (2013), the project team conducted a DBR process to create more opportunities for youth learning and foster the development of ESA in each citizen science context.

The research indicated that youth participants in BSS in 2018 and 2019 lacked a clear understanding of (1) the scientific purpose of BSS and (2) the onward processing and research application of the data they collected, and therefore their personal role and stake in the research. The DBR process addressed these two areas and led to design changes that reframed the activity with a focus on science and enhanced opportunities for young people to submit data to the science research team. Reframing was achieved by refocusing the program's introduction toward (1) the authentic scientific research context, (2) the authenticity of the biological field research methods young people would use, (3) the project's individual lead scientist, and (4) the contribution young people's data would make to the wider scientific research. This was reinforced by the creation of an introductory video (https://youtu.be/SdHqF38dGtY), which featured the lead researcher and a previous youth participant reiterating these messages. Framing youth contributions in terms of their scientific impact at a global scale and/or their contribution to understanding local ecosystems supports the development of ESA by motivating youth with broad interests and allowing them to draw on their lived experiences and existing knowledge of a place (Harris and Ballard 2018; Phillips et al. 2019). A final wrap-up session was added to the program, at which facilitators would share results and answer questions.

From field-based to hybrid delivery: The impact of COVID-19 on program delivery

In common with other learning settings, the team was forced to adapt the delivery of BSS in response to social distancing requirements during the COVID-19 pandemic. Pre-pandemic, MCS staff would have led an in-person introduction and training session before youth conducted beach surveys guided by their youth leaders. The introductory session was adapted for online delivery (facilitated through Zoom), was attended by staff from the Museum and MCS as well as youth's parents, and included the framing video described above. Beach surveys were undertaken in family groups rather than as a youth group. In a post-participation online wrap-up session, facilitators from the Museum and MCS presented the data the youth had collected, described the photo verification process, and answered questions.

Filming of the framing video was also impacted; for example, the lead researcher and youth participant each recorded their segments separately, which were later edited together in a Q&A style. The youth participant segment was filmed by their parent.

 

This photo illustrates the setup at the introductory session in 2019. Youth handled and smelled samples of seaweeds and had access to tools such as ID guides and magnifying glasses to support their identification of the seaweeds.

Methods

We employed a single case study design (Yin 2018) to investigate the learning outcomes of two youth groups recruited to take part in the BSS. We used pre-post surveys, interviews, and in-the-moment observations of focal youth to explore the impact of the program on science learning during two delivery periods (2019 and 2020), between which the program's design and delivery were altered. The program (case) had two embedded units of analysis: (1) the 2019 cohort, and (2) the 2020 cohort (Table 2). Groups were recruited via direct email and telephone conversations, which included apprising youth leaders of the broad aims of the BSS and sharing the instruction booklet.

Table 2. Overview of Big Seaweed Search program: 2019 iteration and 2020 iteration.

Data collection methods

In 2019 all participating youth completed a pre-survey prior to the introductory session to capture their prior knowledge of science, citizen science experience, and their understanding of the purpose of BSS. At this stage they would only understand the purpose if their group leader had introduced it. Post-participation surveys and semi-structured interviews with selected focal youth (FY) were administered after the final beach survey to examine participants’ experience, participation, and learning.

In 2019 learning researchers conducted in-the-moment observations of nine FY conducting beach surveys to capture information about the activities and the ways that youth participate. FY were selected with the group leader to represent the diversity of participants with respect to age, race, and ethnicity, and prior interest in science. Researchers then tracked one FY for up to 20 minutes, during which they captured what the participant was doing and relevant or important conversations with others. When it would not disrupt the activity, researchers asked FY questions such as, “What are you doing?” and “Why are you doing that?”

In 2020 minor adaptations were made to the 2019 pre- and post-participation survey and interview questions to ensure they remained relevant following the changes to the program. Observations were not possible in 2020 due to COVID-19 restrictions.

Table 3 summarizes the data collected for each cohort. This article focuses on youth who attended an introductory session and completed both pre- and post-participation surveys (n = 28 for 2019 and n = 7 for 2020, except where youth skipped survey questions). These included any youth whose parents had consented for them to be interviewed. To accommodate the difference in sample sizes, analysis included triangulation of the different data sources.

Table 3. Summary of case study data collection.

Data analysis methods

For analysis, the interview transcriptions and observation files were uploaded to qualitative data analysis software (Dedoose, version 8.0.45), and survey responses were entered into Excel spreadsheets. All data sets (Table 3) were coded following the principles of Braun and Clarke (2006) for familiarization with and thematic analysis of qualitative data. The codebook included both a priori codes (based on the three ESA dimensions of learning) and new codes derived from inductive analysis. Following the inter-rater agreement exercises suggested by Campbell et al. (2013), the analysts conducted a series of coding exercises to align their interpretation of the codebook, after which they coded data independently. The focus of the coding analysis was to identify the development of ESA, paying special attention to FY's understanding of the scientific purpose of the program, and how the shift to virtual delivery may have influenced ESA learning outcomes. To further portray how FY participated and developed science learning outcomes, we created a vignette (see Supplemental Resources) for one FY from each case (Ely et al. 1997). These are based on the coded data sets (Table 3) and notes taken from the original data sources.
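
The coding workflow above relied on Dedoose and Excel, and the study does not report computing a particular agreement statistic. As an illustration of what an inter-rater agreement check can look like in code, the sketch below (a minimal Python example with invented data and code labels) computes Cohen's kappa for two analysts who applied the same codebook to the same excerpts.

```python
from collections import Counter
from typing import Sequence


def cohens_kappa(coder_a: Sequence[str], coder_b: Sequence[str]) -> float:
    """Chance-corrected agreement between two coders labeling the same excerpts."""
    if len(coder_a) != len(coder_b) or not coder_a:
        raise ValueError("Both coders must label the same, non-empty set of excerpts.")
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(freq_a) | set(freq_b)) / (n * n)
    if expected == 1.0:  # both coders applied a single identical code throughout
        return 1.0
    return (observed - expected) / (1 - expected)


# Hypothetical ESA-dimension codes applied by two analysts to six interview excerpts.
analyst_1 = ["ESA1", "ESA2", "ESA1", "ESA3", "ESA2", "ESA1"]
analyst_2 = ["ESA1", "ESA2", "ESA2", "ESA3", "ESA2", "ESA1"]
print(f"Cohen's kappa: {cohens_kappa(analyst_1, analyst_2):.2f}")  # ~0.74
```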

Results

“What do you think was the purpose of the Big Seaweed Search?”

In 2019 pre- and post-participation survey free-text responses to this question were grouped into the following broad themes: raise awareness and educate, learn about seaweed, monitor seaweed, and do the activity itself (Figure 1). Pre-participation responses to the same question from youth in 2020 (Figure 2) overlapped considerably but also included themes of understanding climate change and other environmental issues. Pre-participation, the 2020 cohort had a broader understanding of the potential impact of the research on a real-life problem, while the 2019 cohort emphasized the educational goals. The end-to-end consistency of messaging in 2020 was achieved through group leaders effectively communicating the research purpose before the BSS team delivered the introductory session (see vignettes in Supplemental Resources). Post-participation, however, both cohorts focused more on the scientific activity and understood the purpose was monitoring seaweed. Notably, in their free-text responses the 2020 cohort articulated why they were monitoring seaweed more clearly than the 2019 cohort did (e.g., “in order to conserve species of animals and fish”).
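
As an aside for readers who automate survey summaries: once free-text answers have been grouped into themes, pre/post comparisons like those in Figures 1 and 2 reduce to simple frequency counts. The Python sketch below uses invented coded responses with theme labels mirroring those above; it is not the tooling used in this study, which relied on Dedoose and Excel.

```python
from collections import Counter
from typing import List

# Hypothetical coded answers to "What do you think was the purpose of the Big Seaweed Search?"
pre_2019: List[str] = ["learn about seaweed", "raise awareness and educate",
                       "do the activity itself", "learn about seaweed", "monitor seaweed"]
post_2019: List[str] = ["monitor seaweed", "monitor seaweed", "learn about seaweed",
                        "do the activity itself", "monitor seaweed"]


def compare_themes(pre: List[str], post: List[str]) -> None:
    """Print a pre/post frequency comparison for every theme appearing in either survey."""
    pre_counts, post_counts = Counter(pre), Counter(post)
    for theme in sorted(set(pre_counts) | set(post_counts)):
        print(f"{theme:30s} pre: {pre_counts[theme]:2d}   post: {post_counts[theme]:2d}")


compare_themes(pre_2019, post_2019)
```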

Figure 1. Youth’s understanding of the purpose of Big Seaweed Search, pre- and post-participation in 2019 (n = 25).

Figure 2. Youth’s understanding of the purpose of Big Seaweed Search, pre- and post-participation in 2020 (n = 7).

Awareness of data processing and research applications

Two post-participation survey questions probed youth to describe what happened to the seaweed information they collected: “What do you think happens to your photos after you upload them?” (Figure 3) and “How do you think that information will be used?” (Figure 4). Most survey respondents from the 2019 cohort were not aware of how their data would be used, and those who were aware did not articulate their ideas clearly or had only vague notions of what happened to the data. In comparison, responses from the 2020 cohort demonstrated an understanding that the data they uploaded would be used for monitoring seaweed and supporting research (Figure 4).

Figure 3. Youth’s understanding of what happened to the photos they took of seaweed after they were uploaded to the Big Seaweed Search website, post-participation in 2019 (n = 27).

Figure 4. Youth’s understanding of how the information they contributed to the Big Seaweed Search will be used, post-participation in 2019 (n = 28) and 2020 (n = 7).

Awareness of the authenticity of the methods and their contribution to science

Post-participation surveys asked, “Of all the things you did during the Big Seaweed Search, did any of the activities feel like you were ‘doing science’?” In 2019 most youth answered “no,” citing reasons including that they were “just looking at seaweed” or “just counting seaweed” and that “it was fun.” This suggests they did not perceive these as valid or authentic scientific activities. The remaining youth in this cohort reported that they did feel like they were doing science and explained this using verbs that indicated they understood why they were looking for seaweed (e.g., identifying, documenting, recording different seaweeds). By contrast, the majority of the 2020 cohort answered “yes” to the same question (Figure 5).

Figure 5. Youth self-reported sense that they were “doing science” in 2019 (n = 28) and 2020 (n = 7).

In post-participation surveys both cohorts were also asked if they felt like they had made a real contribution to research or monitoring. The 2019 cohort did not wholly agree, with many responding neutrally; in contrast, the 2020 cohort agreed more strongly that their contribution felt real (Figure 6).

Figure 6. Youth’s agreement with the statement “I feel like I made a real contribution to the biological research or monitoring that we did during the Big Seaweed Search,” post-participation in 2019 (n = 25) and 2020 (n = 7).

Connecting informal science learning with formal school science

As well as feeling more involved in science and broadening their perception of what science is, some youth reported more enjoyment, confidence, and competence in science at school after participating in BSS. Youths’ ratings of their confidence in science at school increased from pre- to post-participation, an aspect that was explored further in interviews with the question “Do you think it’s helped you with your confidence about doing science inside school or outside of school?” (all names are pseudonyms):

Erin (2019 cohort): “It has. Because ever since we started, without realising, I just started really liking it. It’s one of my favourite subjects now although before I didn’t really enjoy it…at school we spend more time indoors rather than outdoors when it comes to science.”

Isla (2019 cohort): “I’m not really that good at science in school anyway, so I just like to be actually a part of helping people…It’s made me feel more involved with science.”

Ernest (2020 cohort): “I wasn’t really into science before the Big Seaweed Search and now I’m really into it. I enjoy my science lessons at school now…It was just some fun, but it was actually helpful at the same time so I think it was a really good combination.”

Vignettes

To summarize our findings and present them in context, the document in Supplemental Resources presents the experiences of one focal youth from each cohort (2019 and 2020) as vignettes.

Discussion

This section reflects on what this study found in terms of (1) the impact of the program design on learning outcomes, (2) connections between informal science activities and formal science education, and (3) the scope for online delivery to scale up citizen science programs.

The impact of program design on learning outcomes

This case study demonstrates that intentional redesign of a program can enhance youth learning outcomes. Despite the core activities (beach surveys) and resources (identification guide and recording form) for both cohorts being very similar, our findings demonstrate a distinct difference between the two cohorts in young people’s understanding of the scientific purpose of their activities and their appreciation of the research context and onward path of their data. Although the 2019 cohort had an in-person introductory session with representatives from the MCS and the Museum and were able to identify different seaweeds, they had a poorer understanding of the research purpose of the program. This finding suggests that the content, and not the format (in person or online), determines such learning outcomes.

Three design features potentially afforded this outcome: (1) the scientific framing (by staff and the video) in 2020, (2) the “community” or division of labour, and (3) the increased opportunities for youth to actively participate in data submission in 2020 (all of whom reported doing so with support).

Scientific framing

As well as demonstrating an understanding of why seaweed research is important and of the wider context of the research they were contributing to, the 2020 cohort also reported feeling like they “made a real contribution to research or monitoring” more than the 2019 cohort did. While this research did not explore the concept of scientific contribution, post-participation survey responses from the 2019 cohort indicated that few understood that the monitoring of seaweeds was science. Participants perceived their biological monitoring/recording activity both as “just looking at seaweeds,” which they did not consider to be science, and as “identifying seaweed,” which was one of the top reasons cited for feeling like it was science. This implies that practitioners must intentionally frame citizen science activities as “science” because the youth audience may not think of them as science. Youth are likely to have very different ideas of what science is and is not, and what scientists do and do not do, which program messaging should address from the outset.

Community, or division of labour

The 2020 implementation (with beach surveys conducted as family groups) saw youth taking on, or at least seeing firsthand, a wider variety of roles and tasks within the research process (choosing a beach, selecting a transect and marking it out, identifying, photographing, uploading data afterward). In the 2019 cohort, much of that work was done “behind closed doors” by group leaders, in advance or afterward, so the young participants experienced a more limited range of roles and tasks, restricting their experience of the whole research process and their opportunity to develop skills and expertise in different roles.

Opportunities to participate in data submission

The 2019 cohort had comparatively poor access to mobile phones, which constrained their opportunity to photograph seaweeds and submit their data at the final stage of the program. This likely limited youth’s understanding of the project’s purpose and of the onward path of their data; our findings therefore illustrate the importance of this type of participation for youth’s sense of the project’s purpose and authenticity.

However, our findings also support the notion that participation does not automatically lead to learning (Bonney et al. 2014): participating in citizen science is less impactful if participants do not perceive or understand that they are taking part in authentic science research. Practitioners can deliberately design for learning outcomes by developing learning supports, in conjunction with routine evaluation using tools like the DEVISE scale (Phillips et al. 2014), to maximize learning outcomes for participants.

All three design features are consistent with the National Academies of Sciences, Engineering, and Medicine’s recommendations (2018), in which “encouraging social interaction,” “supporting multiple types of participant engagement,” and “building learning supports into projects” are all cited as best practices for enhancing opportunities for learning through citizen science. It would be worth undertaking further work to see if and how these aspects of program facilitation are connected to learning outcomes, because in this case those choices were made for pragmatic program delivery reasons.

Connection between informal (out-of-school) science activities and formal (in-school) science education

Participation in scientific research nurtures the development of scientific inquiry and scientific habits of mind among young people and provides opportunities to integrate content knowledge with inquiry-based learning. Such learning outcomes are included in both US and UK school curricula, and Bonney et al. (2014) highlight the potential of citizen science to support this. Our research identified the benefits of combining larger-group learning (whole-group introductory and wrap-up sessions where youth learn and collaborate with their peers) with small-group settings where youth deepen their inquiry learning and scientific habits of mind by taking on greater responsibility for fieldwork planning, data collection, and upload (in this case, in a family setting). While dependent upon the support of family or out-of-school carers, this approach potentially offers an avenue for combining classroom activities with homework or other out-of-school tasks. These tasks could be designed for small groups, in which youth can take more of a leadership role and develop their identities and expertise in different roles within a scientific context. Our findings also demonstrate the benefits of framing scientific activities (in this case fieldwork, but equally relevant to experiments in the classroom) within their real-world context, connecting young people to authentic science experiences and to the wider applications and implications of their study. This connection of classroom concepts to real-world contexts has the potential to strengthen the outcomes of both.

The increases in confidence, enjoyment, and sense of involvement in school science seen in this study are subtle but provide evidence of the potential for growing this impact through repeated exposure to similar projects and activities. Likewise, participating in real-world scientific activities broadened young people’s understanding of what science as a field encompasses—beyond the topics covered in the classroom—and challenged stereotypes about what professional scientists “do” as part of their job. It is especially beneficial for young people moving through education to develop an awareness of broader career options and understand that members of the public can participate in science research.

Online delivery provides scope to scale up citizen science programs

Our research identified a number of benefits of online training delivery (e.g., youth being able to virtually “meet” more scientists and other staff). While there were some disadvantages (e.g., a decreased ability to deliver hands-on aspects of training, such as touching seaweeds), on balance the content and framing of the activity had a far greater impact on young people’s learning outcomes than the mode of delivery (online or in person). This successful proof of concept creates an opportunity to scale up citizen science projects, an aspiration of many projects that is seldom realized due to limited staff time and the travel costs of delivering in-person training. Exploring online training delivery for young people, with a careful focus on ensuring scientific framing and exposure to different roles within science, provides exciting possibilities for the citizen science practitioner community to significantly scale up participation (in both numbers and geographic scope) and direct resources toward traditionally underserved audiences.

Conclusions

This article reported a case study of a UK-based marine citizen science project as it navigated the dual aims of adapting the program (1) for youth environmental science learning and (2) in response to the COVID-19 pandemic. We found that scientific messaging within a citizen science program is crucial, regardless of how the program is delivered. Citizen science programs can also provide learning opportunities for young people beyond the science content, including opportunities to support the development of science identity and agency; these, however, can only be achieved to their full potential through a clear focus on learning in the design and delivery of citizen science programs. In contrasting the two iterations of the program, we have learned lessons about how program design both constrained and afforded learning, from which citizen science practitioners can benefit:

  • Explore what “scientific” means for participants and provide scientific framing for biological monitoring projects.
  • Build the use of scientific tools into the design of programs.
  • Engaging students in citizen science activities can boost their confidence and interest in science outside of the classroom and can provide a meaningful learning experience.
  • Citizen science projects afford additional learning opportunities, beyond the learning of science content knowledge (e.g., exposure to the scientific process and the importance of biological monitoring).
  • Design supporting material for educators to integrate citizen science projects into the classroom (e.g., connect science content to the national curriculum in the UK) and frame the scientific purpose of projects (e.g., videos, which convey a consistent message).

The pandemic both limited and enabled our investigation of learning outcomes, and this case study presents a clear direction for future research and practice.

Acknowledgments

This material is based upon work supported under a collaboration between the National Science Foundation (NSF), Wellcome, and the Economic and Social Research Council (ESRC) via a grant from the NSF (NSF grant no. 1647276) and a grant from Wellcome with ESRC (Wellcome grant no. 206202/Z/17/Z). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of NSF, Wellcome, or ESRC.

We obtained parental consent and followed an ethical protocol approved by The Open University Human Research Ethics Committee (reference number: HREC/3003/Herodotou) and by the Institutional Review Board (IRB) at the University of California, Davis (reference number: 624197-13).

The authors would like to thank the participating youth, their group leaders and their families, as well as the LEARN CitSci team members. We would also like to thank the Big Seaweed Search team, especially Prof Juliet Brodie at the Natural History Museum and our collaborators at the Marine Conservation Society who helped facilitate this study.

Jessica Wardlaw is Citizen Science Programme Developer, Ana Benavides-Lahnstein is a Postdoctoral Researcher, Lucy Robinson is Citizen Science Manager, Sasha Pratt-Taweh is Project Coordination Officer, and Victoria Burton is Project Coordinator, all at the Natural History Museum in London. Julia Lorke is a Postdoctoral Researcher at the Natural History Museum in London, formerly at IPN–Leibniz Institute for Science and Mathematics Education. Maryam Ghadiri Khanaposhtani is a Postdoctoral Researcher, and Heidi Ballard is Professor of Environmental Science Education, both at the University of California, Davis.


citation: Wardlaw, J., A. Benavides-Lahnstein, L. Robinson, J. Lorke, S. Pratt-Taweh, M. Ghadiri Khanaposhtani, H. Ballard, and V. Burton. 2022. Citizen science framing and delivery models: Impacts on young people’s environmental science learning. Connected Science Learning 4 (5). https://www.nsta.org/connected-science-learning/connected-science-learning-september-october-2022/citizen-science

References

Bakker, A. 2018. Design research in education. London, England: Routledge. https://doi.org/10.4324/9780203701010

Ballard, H.L., L.D. Robinson, A.N. Young, G.B. Pauly, L.M. Higgins, R.F. Johnson, and J.C. Tweddle. 2017. Contributions to conservation outcomes by natural history museum-led citizen science: Examining evidence and next steps. Biological Conservation 208: 87–97. http://dx.doi.org/10.1016/j.biocon.2016.08.040

Ballard, H., C. Dixon, and E. Harris. 2017. Youth-focused citizen science: Examining the role of environmental science learning and agency for conservation. Biological Conservation 208: 65–75. https://doi.org/10.1016/j.biocon.2016.05.024

Bird, E., P. Harte, and H. Ballard. 2020. Birds near and far. Science and Children 58 (1): 48–54.

Bonney, R., T.B. Phillips, J. Enck, J. Shirk, and N. Trautmann. 2014. Citizen science and youth education. Commissioned paper for the Committee on Successful Out-of-School STEM Learning. http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_089993.pdf

Braun, V., and V. Clarke. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology 3 (2): 77–101. https://doi.org/10.1191/1478088706qp063oa

Campbell, J.L., C. Quincy, J. Osserman, and O.K. Pedersen. 2013. Coding in-depth semistructured interviews: Problems of unitization and intercoder reliability and agreement. Sociological Methods and Research 42 (3): 294–320. https://doi.org/10.1177/0049124113500475

Chandler, M., L. See, K. Copas, A.M.Z Bonde, B.C. López, F. Danielsen, J.K. Legind, S. Masinde, A.J. Miller-Rushing, G. Newman, A. Rosemartin, and E. Turak. 2017. Contribution of citizen science towards international biodiversity monitoring. Biological Conservation 213: 280–294. https://doi.org/10.1016/J.BIOCON.2016.09.004

Druschke, C., and C. Seltzer. 2012. Failures of engagement: Lessons learned from a citizen science pilot study. Applied Environmental Education and Communication 11 (3–4): 178–188.

Ely, M., R. Vinz, M. Downing, and M. Anzul. 1997. On writing qualitative research: Living by words. London: Falmer Press.

Fishman, B.J. et al. 2013. Design-based implementation research: An emerging model for transforming the relationship of research and practice. Teachers College Record 115 (14): 136–156. https://doi.org/10.1177/016146811311501415.

Göbel, C., C. Nold, A. Berditchevskaia, and M. Haklay. 2019. How does citizen science “do” governance? Reflections from the DITOs project. Citizen Science: Theory and Practice 4 (1): 31. https://doi.org/10.5334/cstp.204.

Harris, E., and H.L. Ballard. 2018. Real science in the palm of your hand. Science and Children 55 (8): 31–37. https://doi.org/10.2505/4/sc18_055_08_31.

Harris, E.M., and H.L. Ballard. 2021. Examining student environmental science agency across school science contexts. Journal of Research in Science Teaching 58: 906–934. https://doi.org/10.1002/tea.21685.

Kloetzer, L., J. Lorke, J. Roche, Y. Golumbic, S. Winter, and A. Jõgeva. 2021. Learning in citizen science. In The science of citizen science, eds. K. Vohland et al. Cham: Springer. https://doi.org/10.1007/978-3-030-58278-4_15

Kuckartz, U. 2014. Qualitative text analysis: A guide to methods, practice and using software. Los Angeles, CA: Sage.

Lorke, J., H.L. Ballard, A.E. Miller, R.D. Swanson, S. Pratt-Taweh, J.N. Jennewein, L. Higgins, R.F. Johnson, A.N. Young, M. Ghadiri Khanaposhtani, and L.D. Robinson. 2021. Step by step towards citizen science—deconstructing youth participation in BioBlitzes. JCOM 20 (04): A03. https://doi.org/10.22323/2.20040203.

National Academies of Sciences, Engineering, and Medicine. 2018. Learning through citizen science: Enhancing opportunities by design. Washington, DC: The National Academies Press. https://doi.org/10.17226/25183.

Penuel, W.R. et al. 2011. Organizing research and development at the intersection of learning, implementation, and design, Educational Researcher 40 (7): 331–337. https://doi.org/10.3102/0013189X11421826.

Peter, M., T. Diekötter, K. Kremer, and T. Höffler. 2021. Citizen science project characteristics: Connection to participants’ gains in knowledge and skills. PLOS ONE 16 (7): e0253692. https://doi.org/10.1371/journal.pone.0253692

Peter, M., T. Diekötter, T. Höffler, and K. Kremer. 2021. Biodiversity citizen science: Outcomes for the participating citizens. People and Nature 3: 294–311. https://doi.org/10.1002/pan3.10193

Phillips, T., M. Ferguson, M. Minarchek, N. Porticella, and R. Bonney. 2014. User’s guide for evaluating learning outcomes in citizen science. Ithaca, NY: Cornell Lab of Ornithology. http://cdn1.safmc.net/wp-content/uploads/2016/11/28101058/CitizenScienceUsersGuide_Evaluation_2014.pdf

Phillips, T., N. Porticella, M. Constas, and R. Bonney. 2018. A framework for articulating and measuring individual learning outcomes from participation in citizen science. Citizen Science: Theory and Practice 3 (2): 3. https://doi.org/10.5334/cstp.126.

Phillips, T.B., H.L. Ballard, B.V. Lewenstein, and R. Bonney. 2019. Engagement in science through citizen science: Moving beyond data collection. Science Education 45 (1): 369. https://doi.org/10.1002/sce.21501

Pitt, A.N., and C.A. Schultz. 2018. Youth-based citizen science monitoring: Case studies from three national forests. Journal of Forestry 116 (2): 109–116. https://doi.org/10.1093/jofore/fvx008

Roche, J., L. Bell, C. Galvão, Y.N. Golumbic, L. Kloetzer, N. Knoben, et al. 2020. Citizen science, education, and learning: Challenges and opportunities. Frontiers in Sociology 5: 613814. https://doi.org/10.3389/fsoc.2020.613814

Sandoval, W.A., and P. Bell. 2004. Design-based research methods for studying learning in context: Introduction. Educational Psychologist 39 (4): 199–201. https://doi.org/10.1207/s15326985ep3904_1

Yin, R.K. 2018. Case study research: Design and methods (6th ed.). Sage Publications.
