
Research & Teaching

A Design Heuristic for Analyzing and Interpreting Data

Journal of College Science Teaching—September/October 2022 (Volume 52, Issue 1)

By Sandra Swenson, Yi He, Heather Boyd, and Kate Schowe Good

Students reasoning with data in an authentic science environment had the opportunity to learn about the process of science and the world around them while developing skills to analyze and interpret self-collected and secondhand data. Our results show that nearly 50% of the treatment group responses were accurate when describing the reason for measuring water parameters, compared with 26% in the traditional lab group. When pre- and post-survey scores were compared, students in the treatment group outperformed students in the traditional group on four items: making claims about water pollution based on data; understanding water pollution in the Hudson River; understanding the relationship between temperature, pH, and salinity values; and feeling prepared to justify their reasoning on water pollution. Our evidence points to greater engagement by the treatment group and stronger descriptions about their claims, evidence, and reasoning around measuring water parameters and potential water pollution problems.

 

Studying environmental chemistry offers a positive way for educators to demonstrate the relevance of topics to students’ lives and to integrate coursework that can support socially driven research projects. College courses and programs of study that are authentic within the context of chemistry and environmental science have been implemented across the country and internationally (Aram & Manahan, 1995; Cole, 2007; Hansmann, 2009; Short, 2010; Surpless et al., 2014; Trumbore et al., 1996; Wenzel, 2000; Ali & Khan, 2017). However, colleges with a majority of students who come from low socioeconomic backgrounds face unique challenges in designing and implementing such courses for their students (Bridge, 2001; Tomboulian & Parrot, 1997), mainly due to a lack of adequate funding and resources to educate students from urban areas who also have diverse cultural backgrounds and higher rates of poverty (Tobin et al., 2001). To address this problem, this study introduced undergraduate college students in urban areas to authentic learning and new ways of using scientific technology.

Using technology that incorporates data analysis and interpretation is important for authentic learning. The National Academies of Sciences, Engineering, and Medicine (2016); the National Research Council (2002, 2015); and the NGSS Lead States (2013) recommend the use of both data and technology in science education. Data-rich activities are useful for engaging students in inquiry-based learning, developing higher-order thinking skills, and fostering students’ decision-making and reasoning abilities (Manduca & Mogk, 2002). Technology that presents data in visual form is especially helpful for science learning, and the use of such technology has grown rapidly in recent years (Gilbert, 2005). Unfortunately, this content knowledge and these skills are sorely lacking in our public school systems (MacKay, 2006). Becoming proficient in data-rich activities at the college level may therefore be challenging for students who have little or no experience with this topic from their middle and high school science classes.

Substantive research has been conducted on middle school students’ interpretations of data, but far less research has been done on college students’ interpretations, perhaps because college students and adults might be expected to have a proficient understanding of this skill. Hug and McNeill (2008) examined whether seventh- and eighth-grade students performed differently in their discussions and interpretations of data when analyzing firsthand versus secondhand data. Two different experiments were conducted: one in chemistry, in which students collected and observed their own data, and another in biology, in which students used secondhand data. By comparing and contrasting student use of firsthand data with the use of secondhand data, Hug and McNeill (2008) found some general similarities and differences. Among the similarities, students (when prompted) took “ownership” of the data 76% of the time with firsthand data and 50% of the time with secondhand data. Taking ownership means the students reference where the data came from, and Hug and McNeill (2008) state that for meaningful learning to occur, students need to claim ownership of the data because doing so shows that they recognize the process of experimental research. Additionally, some students recognized patterns for interpretation when examining both first- and secondhand data, and some described content-specific attributes during discussions as they formed data-based conclusions. One difference that Hug and McNeill (2008) reported was that when students used firsthand data, they described the limitations of the investigations more readily than when they used secondhand data.

Delen and Krajcik (2015, p. 1970) “reached a similar conclusion” as Hug and McNeill (2008) when they examined a small cohort of students who used the same data source (a local water source) for firsthand and secondhand data. Delen and Krajcik (2015) found that middle school students created stronger and higher-quality explanations when they were able to work with firsthand data rather than with other students’ data (secondhand data), even when the data set was large. Priemer and colleagues (2020, p. 2), however, dispute that the source of data (whether firsthand or secondhand) has “an influence on student learning outcomes (measured by the choice of hypothesis about the result of the experiment in physics).” Working with a small cohort of middle school students, Priemer and colleagues (2020) found that whether students used firsthand or secondhand data (supplied by their teacher), the number of correct solutions on the posttest increased by 59%. Kanari and Millar (2004) found that students between ages 10 and 14 successfully interpret data when there is covariance in the data they collect, but they force their personal interpretations when there is non-covariance in the data. They noted that students “selectively recorded and replaced values in their data set to show a trend” (p. 762).

When examining how characteristics of data may influence how children and adults interpret data, Masnick and Morris (2002) and Masnick and colleagues (2007) found that college students were more confident in their conclusions about the data when the sample size was large (more data points), while third- and sixth-grade students were less confident. The presence of overlapping data points and high variability appears to have made all participants less sure of their conclusions based on the data. Garfield and colleagues (2007) corroborate these findings with their own research on students’ difficulty with reasoning about variability in formal ways. Students are challenged both in understanding how data may take different values in a particular context and in working with formal measures of variability, such as range and standard deviation. Garfield and colleagues (2007) state that some students hold only an informal conceptual understanding of variability, such as overall spread and the recognition that not all data values are the same.

In understanding data interpretation, students also have difficulty drawing conclusions related to their hypotheses and often do not provide specific evidence to support those conclusions (Germann & Aram, 1996); however, the Germann and Aram study did not use first- or secondhand data but instead presented students with written tasks based on hypothetical situations. Similarly, when a hypothetical situation concerning data use was presented, Hogan and Maglienti (2001) found that students and nonscientist adults rely on their personal understandings when drawing conclusions from data. While examining hypothetical uses of data is informative, non-hypothetical learning environments (in which students work with actual data) are more valuable for understanding student learning and thus were used in this study. Finally, recent research on how students reason with and use professionally collected data shows that students often have difficulty interpreting data maps of Earth phenomena (Kastens et al., 2016; Phipps & Rowe, 2010; Resnick et al., 2018; Swenson & Kastens, 2011). Students may over-interpret data visualizations, believing they contain more information than is actually presented; may not interpret data in the way it is intended; or may misinterpret data because of misunderstandings of color and scale.

Scaffolding curricula and implementing a heuristic approach for examining scientific evidence (McNeill & Krajcik, 2007; McNeill & Berland, 2017) may support some of the difficult aspects of data analysis and interpretation. Unlike the studies discussed above, the study described in this article used both student-collected (i.e., firsthand) and professionally collected (i.e., secondhand) data to scaffold student interpretations.

Data analysis and interpretation as authentic science learning

Authentic science learning allows the novice to become part of a social community whose participants practice science in a real context (Calabrese Barton, 2003; Mogk & Goodwin, 2012; Stokes & Boyle, 2009). Students are more likely to be motivated to investigate the science behind the problems in an authentic learning environment; however, students educated in urban public schools are less likely to be prepared to take on the rigors of college-level science courses, as evidenced by their less-developed skills in math, technology, and laboratory investigations (Bailey & Weininger, 2002; Songer et al., 2002). These students would be better served if they were offered the opportunity to excel in a program of study that is recognized as authentic (Hogan & Corey, 2001). Because undergraduate students will become the citizens who shape policy on important issues reliant on basic scientific knowledge, educators must provide resources and use inquiry-based, authentic pedagogies that can support student learning.

The Oceans of Data Institute (n.d.) describes a data-literate individual as someone who is a critical consumer of data, controls their personal data trail, knows how to document the utility and limitations of data, finds meaning in data, and takes action based on data: “The data-literate individual can identify, collect, evaluate, analyze, interpret, present, and protect data.” But student reasoning with data is problematic for the following reasons:

  1. There is variation and uncertainty in experimental outcomes (Garfield et al., 2007; Masnick et al., 2007).
  2. Complex visual data analysis requires advanced understanding of data origin and interpretation (Kastens et al., 2016; Phipps & Rowe, 2010; Resnick et al., 2018; Swenson & Kastens, 2011).
  3. Ownership of the data—firsthand and secondhand—does not always square with limitations in experimental investigations (Hug & McNeill, 2008).
  4. It is challenging to draw conclusions related to a hypothesis and show evidence supporting a conclusion (Germann & Aram, 1996).
  5. Individuals rely on their personal understandings when interpreting data (Hogan & Maglienti, 2001).

Study population and setting

Student profile

The population studied included nonscience majors from different racial and ethnic backgrounds at a public university in New York City.

Course design

Approximately 96 students were taught using the proposed new method (treatment), while the remainder of students attended traditional lectures and labs. Both the traditional and treatment groups used the Riverkeeper website (https://riverkeeper.org) to find professionally collected data, from which they made graphs and interpreted the results. In the treatment group, students were divided into laboratory classes of 24, which were then divided further into smaller groups of 4 students each. Each treatment group went into the field to collect data at two sampling locations along the Hudson River. A total of 24 students were at each site with their instructor (4 laboratory classes). For the 96 students in the treatment group, the site visits were spread out over the course of 2 weeks during the middle of the semester. Students used Vernier probes to collect the water parameters of temperature, pH, and salinity and recorded the data on lab report sheets. The Vernier field study activities gave basic information about each water parameter. Students brought their data back to the laboratory, and all groups added their data to one database in a shared Google spreadsheet so they could see what everyone had collected and how it compared with their own data (Table 1). These data were then used for analysis by making graphs and for subsequent interpretation.
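
For readers who want to reproduce this kind of pooling outside a spreadsheet, the sketch below shows one way the class data could be combined and graphed. It is illustrative only; the file names and column names (site, temperature_c, ph, salinity_ppt) are assumptions, not the course’s actual materials.

```python
# Illustrative sketch only (not the course's actual workflow): pooling each
# small group's field readings into one table and graphing them, analogous to
# the shared class spreadsheet described above. File and column names are
# assumptions for the example.
import pandas as pd
import matplotlib.pyplot as plt

group_files = ["group_1.csv", "group_2.csv", "group_3.csv", "group_4.csv"]
readings = pd.concat([pd.read_csv(f) for f in group_files], ignore_index=True)

# Per-site summary statistics so each group can compare its readings
# with the pooled class data
print(readings.groupby("site")[["temperature_c", "ph", "salinity_ppt"]].describe())

# A simple scatter plot of two parameters to look for a possible relationship
readings.plot.scatter(x="temperature_c", y="ph", title="Class data: temperature vs. pH")
plt.show()
```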

Students also had access to the College Environmental Laboratory’s high-frequency data collected weekly, year-round, by the principal investigator and a full-time college laboratory technician. These data were collected through more sophisticated instruments (HACH Water Quality Testing) that were used to gather information about temperature, pH, salinity, and dissolved oxygen (Table 2). The principal investigator and the technician were also collecting samples of water to test for Enterococcus bacteria.

Method

Students were able to volunteer to take a pre- and post-assessment survey that included eight Likert-scale (five-point) questions and three open-ended questions. This approach allowed for both quantitative and qualitative assessment (mixed method) to answer the following questions:

Question 1: What pedagogical strategies are needed to foster authentic science learning in an urban liberal arts college? To answer this question, our research examined students’ descriptions of their ability to collect, analyze, and interpret data with new technologies as they learned about water chemistry.

Question 2: How proficient will students become at analyzing and interpreting data within a semester-long course? To answer this question, we examined student descriptions about the analysis and interpretation of the data (both student collected and professionally collected) as seen in student claims, evidence, and reasoning.

We asked these questions to better target our students’ learning outcomes. We expected that, by learning how to collect, analyze, and interpret data using new technologies while participating in field study research, students would become proficient in describing the processes of science: understanding the problem of water pollution, making a claim about water pollution, understanding relationships among variables, and feeling confident in justifying their reasoning.

Data collection

All students were given the opportunity to respond if they agreed with the Institutional Review Board–approved description of the study. The eight Likert-scale questions relied on self-reporting and were not used to assess knowledge objectively, but rather to understand what students felt and thought about how they reasoned with data. Students were also given three open-ended questions at the end of the Likert questions. All responses were anonymous.

Data sources and analysis

Quantitative data

Quantitative survey data were collected and analyzed using Qualtrics Survey software. The Likert survey questions used a scale ranging from 1 (strongly disagree) to 5 (strongly agree). Qualtrics calculates the minimum and maximum values, mean, variance, standard deviation, and total responses. When pre- and post-survey scores for the whole class were compared, four areas showed statistically significant differences (p < 0.05) for the treatment group compared with the traditional group. Table 3 shows the most significant gains made by the treatment group as compared with the traditional group.
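
The study reports p-values from Qualtrics but does not name the specific test used, so the sketch below simply illustrates one reasonable way to compare Likert ratings between two groups, using a Mann-Whitney U test on invented data.

```python
# Sketch only: the study reports p-values from Qualtrics without naming the
# test, so this example uses a Mann-Whitney U test, one common choice for
# ordinal Likert responses. The ratings below are invented for illustration.
from scipy.stats import mannwhitneyu

# Hypothetical post-survey ratings (1 = strongly disagree ... 5 = strongly agree)
treatment_ratings = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]
traditional_ratings = [3, 3, 4, 2, 3, 4, 3, 2, 3, 3]

stat, p_value = mannwhitneyu(treatment_ratings, traditional_ratings, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")  # a difference is flagged when p < 0.05
```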

Quantitative findings

Based on students’ self-evaluative Likert-scale responses in both the traditional group and the treatment group, there were four areas in which students seemed to have made significant gains in their ability to describe what they knew and could do when measuring water parameters, as well as why measuring water parameters is important for understanding water pollution. For the first question, students in the treatment group (p = 0.018) showed a significant change from mid-semester to the end of the semester compared with those in the traditional group (p = 0.155): They were able to make a claim about water pollution based on the data and graphs they made from the Riverkeeper website. Both the traditional group and the treatment group were given the same assignment to download temperature, pH, and salinity data from tables and create graphs in Microsoft Excel in order to examine possible correlations. The data were copied from the Riverkeeper website (professionally collected data), and students were instructed on how to create graphs, including bar graphs, scatter plots, or line graphs, and properly label the information. They were required to do this for two different sites in Riverkeeper’s database. With greater than 95% confidence, the students in the treatment group thought that the graphs made from the Riverkeeper data helped them support their claims about water pollution in the Hudson River. This is also corroborated by the open-ended responses coded from the qualitative data (Table 4, Q13).

Second, with more than 95% confidence, the students in the treatment group, compared with the students in the traditional group, thought they understood the problem of water pollution in the Hudson River (Research Question 2). Using the Likert-scale questions to drill down deeper into how students rated their understanding of and claims about water pollution, we asked students to rate whether they felt they understood the relationship between temperature, pH, and salinity values and whether they felt prepared to justify their reasoning about water pollution in general. On both questions, the treatment group’s ratings increased more significantly from the middle of the semester to the end of the semester (see Questions 3 and 4 in Table 3).
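
As a rough analogue of the graphing assignment described above, the sketch below loads downloaded water-quality data, checks pairwise correlations among temperature, pH, and salinity, and plots each parameter over time. The CSV file and column names are hypothetical stand-ins; the actual assignment was done in Microsoft Excel with data copied from the Riverkeeper website.

```python
# Rough analogue of the Excel assignment (illustrative only): the CSV name and
# column names are hypothetical stand-ins for data copied from
# https://riverkeeper.org for a single sampling site.
import pandas as pd
import matplotlib.pyplot as plt

site = pd.read_csv("riverkeeper_site_data.csv", parse_dates=["date"])

# Pairwise correlations among the three water parameters
print(site[["temperature_c", "ph", "salinity_ppt"]].corr())

# Line graphs of each parameter over time, like the graphs students made
site.plot(x="date", y=["temperature_c", "ph", "salinity_ppt"], subplots=True)
plt.tight_layout()
plt.show()
```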

Qualitative data

Qualitative survey data were collected using Qualtrics Survey software and analyzed using MAXQDA 2018 software. The open-ended student responses were examined using thematic content analysis, in which the themes of the participants’ responses emerged from the data rather than being defined a priori; the themes were then linked or reorganized to develop a dominant structure (Chi, 1997; Libarkin & Kurdziel, 2002; Miles & Huberman, 1994). During the analysis, the principal investigator examined key words and phrases that students used to describe both their lecture and lab experiences working with data. After reviewing the sample of student responses, the principal investigator created the initial coding scheme and identified specific patterns, which were then checked for inter-rater reliability by a colleague. If a discrepancy was found, the colleagues discussed it and resolved the coding. The inter-rater reliability was greater than 90%. Examples of the questions, coded themes, and key words and phrases used to create the codes are shown in Table 4.
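
The study reports inter-rater reliability above 90% but does not specify the statistic used, so the sketch below shows simple percent agreement on a small set of invented codes, with Cohen’s kappa as a chance-corrected alternative.

```python
# Sketch only: the study reports inter-rater reliability above 90% without
# specifying the statistic; this example computes simple percent agreement on
# invented codes, with Cohen's kappa shown as a chance-corrected alternative.
from sklearn.metrics import cohen_kappa_score

rater_1 = ["graph", "access", "graph", "baseline", "clean", "graph", "access", "baseline"]
rater_2 = ["graph", "access", "graph", "baseline", "clean", "access", "access", "baseline"]

agreement = sum(a == b for a, b in zip(rater_1, rater_2)) / len(rater_1)
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Percent agreement: {agreement:.0%}, Cohen's kappa: {kappa:.2f}")
```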

Coding of major themes

Table 5 shows the most salient responses about students’ learning experiences, as identified through coding of the open-response survey questions. Six coding categories were identified (two each for Questions 13–15) as providing the most informative responses when students answered the questions about a curriculum that involved an authentic, real-world science experience and about how proficient students became at collecting, analyzing, and interpreting data in a semester-long course. Question 13 queried students on their experience with using data from the internet as a way to understand water pollution (such data were used only during lecture in the traditional course but in both lecture and lab during the treatment course). The most frequent student responses were that students were able to graph data (from tables) and that they had access to professional data. Question 14 queried students about their use of data in the laboratory as a way to understand water pollution. The most common response in the traditional class was that students learned to “clean” water (few used the word “flocculate”); students in the treatment class said they learned to collect data “firsthand” or that they measured temperature, pH, and salinity. Another frequent response to Question 14 (for both groups) was that students enjoyed the hands-on experience. Question 15 asked students to describe their understanding of water pollution and why we (and scientists) need to measure temperature, pH, and salinity. Responses from both groups showed an understanding of the need to establish a baseline to measure change in water parameters.

Qualitative findings

The three open-ended questions gave students the opportunity to describe their understanding of data used in both the lecture and the lab and then substantiate, in their own words, their understanding of water pollution and why we (and scientists) need to measure temperature, pH, and salinity. In both the pre- and post-course surveys for the traditional cohort and the treatment cohort, Question 13 was coded for salient responses. When students were asked to describe their experience with using data from the internet as a way to understand water pollution, “Graph the data” and “Access to professional data” were the two most common responses, showing that the students understood the importance of using professionally collected data (Table 5).

For the first coded response, “Graph the data,” survey responses decreased from 12% at the midpoint of the semester to 6% at the end for the traditional group and from 6% to 4% over the same period for the treatment group. Student responses that were coded for “Access to professional data” were consistently higher in the treatment course at the midpoint and end of the semester, with response rates increasing from 22% to 32% in that course compared with an increase from 12% to 22% in the traditional course.

For Question 14, when students were asked to describe their experience using data from the laboratory class as a way to understand water pollution (Table 5), students in the traditional group most commonly responded with a description of “Ways to clean water” (the lab was on flocculation, in which students used samples of “clean” and “dirty” water they had collected and that were provided in the lab). Students in the treatment group commonly responded, “Collected data firsthand and/or measured temperature, pH, and salinity,” and both groups responded, “Enjoyed the hands-on experience.” The treatment group consistently described “Collected data firsthand and/or measured temperature, pH, and salinity” at a high rate at both the midpoint (29%) and end of the semester (33%), and this group showed an increase (from 17% to 23%) in their responses to “Enjoyed the hands-on experience.” In the traditional course, the response rate for that category went down.

The third question asked students to describe their understanding of water pollution and why we (and scientists) need to measure temperature, pH, and salinity (Table 5). These water parameters were also part of the lecture assignment to graph data collected from the Riverkeeper website. Both the traditional and treatment groups understood that temperature, pH, and salinity are important for establishing a baseline in order to measure any changes, but a larger proportion of students in the treatment group responded with more description (30% at the midpoint and 47% at the end of the semester, compared with 26% of the traditional group at both times). The coded response “To see how polluted the water is” was too general, and both groups had fewer responses in this area on the end-of-semester survey.

Discussion

When students analyzed and interpreted data they collected themselves and examined the same type of data collected by professionals in a guided-inquiry environment, they seemed to develop a fuller comprehension of the content, as seen in their descriptions. Comparing the student responses to both the Likert-scale and open-ended questions, we found that students in the treatment course described their abilities to collect, analyze, and interpret data with new technologies in a more confident and robust way.

While all students learned how to graph professionally collected data in the lecture class, there were more and stronger descriptive responses for having access to professional data in the treatment group. It may be that these students had a better understanding of the usefulness of professional data for supporting their own data. Additionally, it seems that all students came to understand that establishing a baseline reading of water parameters—which would be used to indicate a possible change in pH, salinity, and temperature—is a reason to examine the evidence, whether it was professionally collected or student collected. The treatment group, however, demonstrated this understanding with more frequency in the surveys at both the midpoint and end of the semester. The treatment group’s frequency of response increased while the traditional group’s frequency of responses on this topic declined, possibly indicating the need for reinforcement in the curriculum about why it is valuable for students to collect their own data and corroborate it with professionally collected data.

Although there were eight Likert-scale questions, responses to four of them demonstrated that students in the treatment group felt they had a strong understanding of water pollution based on the data and graphs they made from the Riverkeeper website (professionally collected data). Students felt they understood the relationship between temperature, pH, and salinity in a body of water and that they were prepared to justify their reasoning about water pollution in general. Students’ responses to Question 15 support this self-assessment.

Implications for instructional design

McNeill and Berland (2017, p. 677) recommend a “design heuristic” to support scientific evidence use in the K–12 classroom, which we think would benefit liberal arts college students as well—that is, curriculum design should include the following:

  1. The curriculum should be based on empirical data (e.g., observations or measurements) about phenomena in the natural world. The empirical data can be firsthand experiences (e.g., students collecting data), secondhand experiences (e.g., a digital repository of data collected by someone else about the solar system), or a simulation that produces data for students.
  2. Information should be transformable by students so that they can find patterns and evaluate the fit between those patterns and competing claims.
  3. Information should be used dialogically when students work together to make sense of it.

While we agree with the design heuristic, we think that adding concurrent investigations using both student-collected data and professionally collected data would enhance students’ ability to make sense of the natural world. The descriptions of the evidence that supported claims were more detailed and robust for the students who collected the data themselves compared with those who did not. By collecting data themselves and experiencing how scientists collect data, as well as by using reliable websites to corroborate their data, students appear to be more confident in describing their results. We recognize that this is not always possible when working with empirical data.

Environmental chemistry can be challenging for undergraduate liberal arts students in an urban area because they may lack experience with conceptualizing a chemical measurement such as the water parameters of pH, salinity, temperature, and oxygen. In New York City, our students are surrounded by water, yet few of them understand why it is important to measure water parameters or the possible problems related to water pollution. Our study approached this problem by using a stand-alone water-collection device with sensors for measuring pH, temperature, and salinity. Students collected data in situ in a field experience at a local river system and concurrently collected professional data from a website and from the college environmental laboratory regarding the same water parameters in the same body of water. This allowed students to take “ownership” of the data (Hug & McNeill, 2008), as evidenced by the quality of student explanations (Delen & Krajcik, 2015), which referenced where the data came from and helped students recognize the process of experimental research.

Student-collected data also help students see firsthand when covariance in data may occur (Kanari & Millar, 2004), such as between temperature and pH or between temperature and dissolved oxygen. A student’s own sample alone will not show this, but when the student has the opportunity to see that sample combined with other students’ samples, as well as with professionally collected samples of the same parameters, and then graph the data, the student will find more meaning in the relationship than they would with a small sample alone (Masnick & Morris, 2002). Students were required to graph large samples of the same data in both their lecture and laboratory classes, and these graphs provided a visualization that helped students attend to the source of the data, treat the data as evidence, and relate the graphed variables to support their claims. The laboratory environment used for collecting, analyzing, and interpreting data is not hypothetical and appears to support students’ ability to reason with data.
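
A small worked example may help illustrate this point: with only a few readings, a correlation such as the typical inverse relationship between temperature and dissolved oxygen is hard to detect, but it becomes clearer once class and professional data are pooled. The values below are invented for illustration.

```python
# Invented values for illustration: dissolved oxygen typically decreases as
# water temperature rises, but a single group's few readings rarely show this.
from scipy.stats import pearsonr

# One group's readings alone: too few points to reveal a trend
own_temp = [18.2, 18.5, 18.4]
own_do = [8.1, 7.9, 8.2]

# Pooled class and professional readings of the same parameters
pooled_temp = [12.0, 14.5, 16.1, 18.2, 18.5, 18.4, 20.3, 22.8, 24.1, 25.6]
pooled_do = [10.2, 9.8, 9.1, 8.1, 7.9, 8.2, 7.4, 6.8, 6.3, 5.9]

r_small, p_small = pearsonr(own_temp, own_do)
r_pooled, p_pooled = pearsonr(pooled_temp, pooled_do)
print(f"Own sample only: r = {r_small:.2f}, p = {p_small:.3f}")
print(f"Pooled sample:   r = {r_pooled:.2f}, p = {p_pooled:.3f}")
```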

Increased descriptive engagement was seen among the treatment group members in this study. The results have implications for curricular approaches for liberal arts college science courses whose goal is to foster authentic, real-world experience in a semester-long course using inquiry-based learning.

Limitations

The three open-ended questions at the end of the Likert survey were sufficient for this research; however, in-depth follow-up interviews with some of the students would have been beneficial. The survey responses are the result of self-reflection by the students and not of an objective quiz or exam.

Acknowledgments

Funding for this research was provided by the National Science Foundation grant award #1245314. We would like to thank the Department of Sciences for the support given to us and for help acquiring the necessary equipment. We extend a special thanks to Dr. David Warunek, chief college laboratory technician, for his advising and technical support.


Sandra Swenson (sswenson@jjay.cuny.edu) is a lecturer and curriculum designer and Yi He (yhe@jjay.cuny.edu) is a professor of chemistry, both in the Department of Sciences at John Jay College of Criminal Justice at the City University of New York; Heather Boyd (heather.h.boyd@gmail.com) is a program evaluator; and Kate Schowe Good (kgood@mortonarb.org) managed the environmental and natural science teaching labs at John Jay College and is currently a tree conservation research assistant at the Morton Arboretum. 

References

Ali, H., & Khan, E. (2017). Environmental chemistry in the twenty-first century. Environmental Chemistry Letters, 15, 329–346. https://doi.org/10.1007/s10311-016-0601-3

American Association of Colleges and Universities (AAC&U). (2009). VALUE (Valid Assessment of Learning in Undergraduate Education). AAC&U. https://www.aacu.org/initiatives/value

Aram, R. J., & Manahan, S. E. (1995). Environmental chemistry and environmental science: A survey of courses offered in U.S. colleges and universities. Journal of Chemical Education, 72(11), 977–983. https://doi.org/10.1021/ed072p977

Bailey, T., & Weininger, E. B. (2002). Performance, graduation, and transfer of immigrants and natives in City University of New York Community Colleges. Educational Evaluation and Policy Analysis, 24(4), 359–377. https://doi.org/10.3102/01623737024004359

Bridge, G. (2001). Everyday ecologies: Cities, nature, and teaching urban ecology. Journal of Geography, 100(4), 154–165. https://doi.org/10.1080/00221340108978434

Calabrese Barton, A. (2003). Teaching science for social justice. Teachers College Press.

Carlson, C. A. (1999). Field research as a pedagogical tool for learning hydrochemistry and scientific writing skills. Journal of Geoscience Education, 47(2), 150–157. https://doi.org/10.5408/1089-9995-47.2.150

Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. Journal of the Learning Sciences, 6(3), 271–315. https://doi.org/10.1207/s15327809jls0603_1

Cole, A. G. (2007). Expanding the field: Revisiting environmental education principles through multidisciplinary frameworks. The Journal of Environmental Education, 38(2), 35–45. https://doi.org/10.3200/JOEE.38.1.35-46

Delen, I., & Krajcik, J. (2015). What do students’ explanations look like when they use second-hand data? International Journal of Science Education, 37(12), 1953–1973. https://doi.org/10.1080/09500693.2015.1058989

Garfield, J., delMas, R. C., & Chance, B. (2007). Using students’ informal notions of variability to develop an understanding of formal measures of variability. In M. C. Lovett & P. Shah (Eds.), Thinking with data (pp. 117–148). Taylor and Francis.

Germann, P. J., & Aram, R. J. (1996). Student performances on the science processes of recording data, analyzing data, drawing conclusions, and providing evidence. Journal of Research in Science Teaching, 33(7), 773–798. https://doi.org/10.1002/(SICI)1098-2736(199609)33:7%3C773::AID-TEA5%3E3.0.CO;2-K

Gilbert, J. K. (2005). Visualization: A metacognitive skill in science and science education. In J. K. Gilbert (Ed.), Visualization in science education (pp. 9–27). Springer.

Hansmann, R. (2009). Linking the components of a university program to the qualification profile of graduates: The case of a sustainability-oriented environmental science curriculum. Journal of Research in Science Teaching, 46(5), 537–569. https://doi.org/10.1002/tea.20286

Hogan, K., & Corey, C. (2001). Viewing classrooms as cultural contexts for fostering scientific literacy. Anthropology and Educational Quarterly, 32(2), 214–243. https://doi.org/10.1525/aeq.2001.32.2.214

Hogan, K., & Maglienti, M. (2001). Comparing the epistemological underpinnings of students’ reasoning about conclusions. Journal of Research in Science Teaching, 38(6), 663–687. https://doi.org/10.1002/tea.1025

Hug, B., & McNeill, K. L. (2008). Use of first-hand and second-hand data in science: Does data type influence classroom conversations? International Journal of Science Education, 30(13), 1725–1751. https://doi.org/10.1080/09500690701506945

Kanari, Z., & Millar, R. (2004). Reasoning from data: How students collect and interpret data in science investigations. Journal of Research in Science Teaching, 41(7), 748–769. https://doi.org/10.1002/tea.20020

Kastens, K. A., Shipley, T. F., Boone, A. P., & Straccia, F. (2016). What geoscience experts and novices look at, and what they see, when viewing data visualizations. Journal of Astronomy & Earth Sciences Education, 3(1), 27–58. https://doi.org/10.19030/jaese.v3i1.9689

Libarkin, J. C., & Kurdziel, J. P. (2002). Research methodologies in science education: Qualitative data. Journal of Geoscience Education, 50(2), 195–200. https://doi.org/10.1080/10899995.2002.12028052

MacKay, B. (2006). Teaching with visualizations. Starting Point: Teaching Entry Level Geoscience, Science Education Resource Center at Carleton College. http://serc.carleton.edu/introgeo/visualizations/index.html

Manduca, C. A., & Mogk, D. W. (2002). Using data in undergraduate science classrooms. Science Education Resource Center at Carleton College. https://serc.carleton.edu/resources/870.html

Masnick, A. M., Klahr, D., & Morris, B. J. (2007). Separating signal from noise: Children’s understanding of error and variability in experimental outcomes. In M. C. Lovett & P. Shah (Eds.), Thinking with data (pp. 3–26). Taylor and Francis.

Masnick, A. M., & Morris, B. J. (2002). Reasoning from data: The effect of sample size and variability on children’s and adults’ conclusions. Proceedings of the Annual Meeting of the Cognitive Science Society, 24. https://escholarship.org/uc/item/30r8f0kp

McNeill, K. L., & Berland, L. (2017). What is (or should be) scientific evidence use in K–12 classrooms? Journal of Research in Science Teaching, 54(5), 672–689. https://doi.org/10.1002/tea.21381

McNeill, K. L., & Krajcik, J. (2007). Middle school students’ use of appropriate and inappropriate evidence in writing scientific explanations. In M. Lovett & P. Shah (Eds.), Thinking with data: The proceedings of the 33rd Carnegie Symposium on Cognition (pp. 233–266). Lawrence Erlbaum Associates.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Sage.

Mogk, D. W., & Goodwin, C. (2012). Learning in the field: Synthesis of research on thinking and learning in the geosciences. In K. A. Kastens & C. A. Manduca (Eds.), Earth and mind II: A synthesis of research on thinking and learning in the geosciences research (pp. 131–164). Geological Society of America.

National Academies of Sciences, Engineering, and Medicine. (2016). Science literacy: Concepts, contexts, and consequences. National Academies Press. https://doi.org/10.17226/23595

National Research Council. (2002). Enhancing undergraduate learning with information technology: A workshop summary. National Academies Press. https://www.nap.edu/catalog/10270/enhancing-undergraduate-learning-with-information-technology-a-workshop-summary

National Research Council. (2015). Reaching students: What research says about effective instruction in undergraduate science and engineering. National Academies Press. https://www.nap.edu/catalog/18687/reaching-students-what-research-says-about-effective-instruction-in-undergraduate

NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. National Academies Press. https://www.nextgenscience.org

Oceans of Data Institute (ODI). (n.d.). Building global interest in data literacy: A dialogue [Workshop report]. Education Development Center. http://oceansofdata.org/sites/oceansofdata.org/files/ODI%20Data%20Literacy%20Report_0.pdf

Phipps, M., & Rowe, S. (2010). Seeing satellite data. Public Understanding of Science, 19(3), 311–321. https://doi.org/10.1177/0963662508098684

Priemer, B., Pfeiler, S., & Ludwig, T. (2020). Firsthand or secondhand data in school labs: It does not make a difference. Physical Review Physics Education Research, 16(1), 1–6. https://doi.org/10.1103/PhysRevPhysEducRes.16.013102

Resnick, I., Kastens, K. A., & Shipley, T. F. (2018). How students reason about visualizations from large professionally collected data sets: A study of students approaching the threshold of data proficiency. Journal of Geoscience Education, 66(1), 55–76. https://doi.org/10.1080/10899995.2018.1411724

Short, P. (2010). Responsible environmental action: Its role and status in environmental education and environmental quality. The Journal of Environmental Education, 41(1), 7–21. https://doi.org/10.1080/00958960903206781

Songer, N. B., Lee, H.-S., & Kam, R. (2002). Technology-rich inquiry science in urban classrooms: What are the barriers to inquiry pedagogy? Journal of Research in Science Teaching, 39(2), 128–150. https://doi.org/10.1002/tea.10013

Stokes, A., & Boyle, A. P. (2009). The undergraduate geoscience fieldwork experience: Influencing factors and implications for learning. In S. J. Whitmeyer, D. W. Mogk, & E. J. Pyle (Eds.), Field geology education: Historical perspectives and modern approaches (pp. 291–311). Geological Society of America.

Surpless, B., Bushey, M., & Halx, M. (2014). Developing scientific literacy in introductory laboratory courses: A model for course design and assessment. Journal of Geoscience Education, 62(2), 244–263. https://doi.org/10.5408/13-073.1

Swenson, S., & Kastens, K. A. (2011). Student interpretation of a global elevation map: What it is, how it was made, and what it is useful for. In A. Feig & A. Stokes (Eds.), Qualitative inquiry in geoscience education research (pp. 189–211). Geological Society of America.

Tobin, K., Roth, W.-M., & Zimmerman, A. (2001). Learning to teach science in urban schools. Journal of Research in Science Teaching, 38(8), 941–964. https://doi.org/10.1002/tea.1040

Tomboulian, P., & Parrot, K. (1997). Chemical education for toxic substance control. Journal of Chemical Education, 74(12), 1434–1436. https://doi.org/10.1021/ed074p1434

Trumbore, C. N., Bevenour, J., & Scantlebury, K. (1996). Chemistry and the human environment: A course for non-science majors. Journal of Chemical Education, 73(11), 1012–1017. https://doi.org/10.1021/ed073p1012

Wenzel, T. J. (2000). Cooperative student activities as learning devices. Analytical Chemistry, 72(7), 293–295.
