
Teaching Teachers

Making C-E-R 'Attractive' for Elementary Teacher Candidates

Using an inquiry approach and the Claim-Evidence-Reasoning framework in an elementary science methods class to learn about magnets

Science and Children—July/August 2022 (Volume 59, Issue 6)

By Julie Robinson

Candidates test magnetic force through cups of water.

Teacher candidates explore magnets. Photos courtesy of the author.

Implementing the Next Generation Science Standards (NGSS Lead States 2013) requires that all teachers rethink their own experiences with science to develop instruction that incorporates performance expectations, practices, and cohesive content across curricula. As a science teacher educator, I strive to show teacher candidates how to create opportunities for their future elementary students in which they explore real-world phenomena and construct arguments from evidence.

McNeill and Krajcik (2012) discuss how the construction of scientific explanations benefits students by fostering deeper understanding of concepts, evidence-based reasoning, critical thinking, and exposure to the nature of science. The Claim-Evidence-Reasoning (C-E-R) framework provides a format for supporting students’ creation of arguments. Additionally, C-E-Rs give teachers an artifact for assessing their students’ understandings and align directly to the NGSS science and engineering practices (NGSS Lead States 2013).

To model this process for teacher candidates, I designed an opportunity for them to experience the three-dimensional approach using familiar content: a two-day lesson sequence composed of two one-hour sessions on magnets. The lessons allowed candidates to wear both their learner and teacher “hats,” gaining content and pedagogical knowledge while also experiencing a lesson sequence adaptable for their own classrooms.

Day One

Magnet Preassessments

Despite the prominence of magnets in elementary school, teacher candidates often lack a thorough understanding of magnetism (Keeley 2017). To effectively address their grade-level standards, teachers must understand key foundational concepts about magnetism, including why only certain metals (iron, nickel, cobalt) are magnetic, how the structure and mass of the magnetic material affect its strength, that magnetism is a non-contact force that interacts with objects at a distance, and how like magnetic poles repel while opposite poles attract.

To help my candidates access their prior knowledge, I began with a selection of Page Keeley’s formative assessment probes from the Uncovering Student Ideas in Science series (Keeley and Tugel 2009; Keeley 2013; Keeley and Harrington 2014). These probes built on each other conceptually by addressing performance expectations across grade levels, specifically K-PS2-1 (Plan and conduct an investigation to compare the effects of different strengths or different directions of pushes and pulls on the motion of an object) and 3-PS2-3 (Ask questions to determine cause and effect relationships of electric or magnetic interactions between two objects not in contact with each other). They also provided my teacher candidates with a valuable formative assessment tool for use in their own elementary classrooms.

Candidates discussed their responses in small groups, and by listening in, I learned that they held a range of understandings. Many assumed larger magnets were stronger than smaller ones or that all metal was attracted to magnets. After discussing, candidates entered their responses into an online polling app that I displayed for the class so that we could see the whole group’s ideas. For the “Can you pick it up with a magnet?” probe, they could submit multiple suggestions of items attracted to magnets, and the poll created a “wordle,” or word cloud (Figure 1), in which the most common responses grew on the screen as they received more votes. The class exclaimed delightedly as new ideas emerged and words like “paperclips” became larger.

Figure 1. Word cloud for “What sticks to a magnet?”

Magnet Investigations

The teacher candidates were now eager to answer questions that had emerged through this preassessment. I created four stations (Online Table 1) with materials that allowed them to further explore the concepts elicited from the probes, all appropriate and relevant for elementary students. Each station had a guiding question aligned to both the formative assessment probes and the NGSS for which candidates would gather evidence.

While I wanted the candidates to design investigations to answer the questions, I also wanted to facilitate inquiry. Scientific inquiry in the classroom can be defined as “the activities of students—posing questions, planning investigations, and reviewing what is already known in light of experimental evidence—that mirror what scientists do” (Martin-Hansen 2002). I allowed candidates to explore the stations as they chose, without assigned groups, rotations, or required procedures. They could visit each station for as little or as long as they wished within a 30-minute block. My only constraints were that their explorations remain focused on collecting evidence to answer the guiding question and that they record data in their science notebooks. Candidates could move between stations at their own pace, use the materials in whatever way seemed most relevant, and add other materials or tools as needed. This design not only provided the opportunity for them to plan and carry out investigations but also created inherent differentiation: candidates could pursue what interested them, collaborate to support their scientific discourse and language, and design their investigations and record evidence in whatever ways were most accessible to them.

In addition to these expectations, I explained that, depending on the grade level of the elementary students with whom they might work, they would need to consider additional management and safety precautions. Magnets and materials for testing them, like paper clips, can present a choking hazard, so it is imperative that the types of materials and magnets chosen be appropriately suited to the age of the children. Further, additional constraints on children’s movement between stations would be necessary to maintain safe traffic flow and equitable access to the materials.

As I observed the teacher candidates working at each station, I noticed that the guiding questions focused them on key ideas but allowed them to develop their own lines of inquiry, which deepened their investigations. One example occurred as they explored whether magnet size determined magnet strength. Many expected this to be true and anticipated that their data would show it. However, after exploring the effects of simply putting the magnets into piles of paper clips, candidates noted that they were not getting conclusive evidence to differentiate the magnets’ strength. Working together, they decided to be more strategic and to count the number of paper clips each magnet would pick up. Even then, they realized they could not make conclusive claims because they questioned whether the surface area of the big magnets was simply providing space to hold more paper clips rather than actually reflecting strength.

Finally, the candidates redesigned their strategy, attracting paper clips one at a time and allowing the magnetic force to pass through the resulting chain of paper clips. Through this process, their data revealed that the size of a magnet did not appear to determine its strength. Why this was true would emerge later, but for now, the candidates had arrived at an understanding they had co-created through observation, questioning, discussion, planning, and explaining. It was fascinating that three of the candidates who began at this station did not leave for the entire session, so engaged were they in their investigation.

These practices could be observed at all stations. Candidates exploring magnetic interactions through other substances added a variety of new materials to test and investigated under what conditions, and to what extent, the magnetic force was affected. Those exploring magnetic interactions between objects not in contact investigated the motion they could create by changing the orientation and distance of the magnets, including spinning magnets, “levitating” paper clips, and changing the speed and direction of a toy car. In other words, the guiding questions did not limit inquiry; rather, they provided a foundation on which to design investigations that produced reasonable claims.

Introducing the C-E-R Framework

When the teacher candidates concluded their investigations, we reconvened as a class to reflect on their data from each station. I introduced candidates to the concept of a claim and asked them to create statements they believed to be true that answered each station’s guiding question. Together, we created initial claims that they all agreed on based on their collective experiences and data. Next, I asked them to provide the evidence they had collected that supported each claim. Together, we crafted additional sentences incorporating this evidence, thus creating the beginnings of C-E-Rs, a process that could be recreated with elementary students. At this point, I allowed the claims and evidence to come directly from the candidates without imposing any additional content so that, with further research and investigation, they could evaluate and refine their claims for greater accuracy and clarity as part of their knowledge-building process. This also allowed me to formatively assess their current and developing understandings about magnetism to plan for further instruction.

I asked the candidates what they thought was still missing from their understandings related to these claims. They readily explained that they didn’t know the “why” behind them; the scientific context was still missing. This authentic sequence of investigation, discussion, and reflection provided the perfect juncture to introduce the Claim-Evidence-Reasoning framework and the rationale for its use as a tool for creating scientific explanations and as a source of assessment. Their concern about neglecting a more scientific basis for their ideas allowed me to address the “reasoning” aspect of the C-E-R in a way that was meaningful, relevant, and directly tied to scientific content.

My introduction to the C-E-R included an overview of its purpose and rationale and ideas for teaching its structure to students. In preparation for the next class, I asked my teacher candidates to choose one of our magnet claims (as constructed by the whole group; see Online Table 2) and prepare their own draft C-E-R. I provided samples, templates, and additional resources on magnets to support their process and advised them that their initial claims might need revision to more accurately reflect their research as they developed their reasoning and honed their conceptual understandings related to their investigations. Depending on the grade level, teachers of younger elementary students could construct C-E-Rs as a whole group, using charts focused on one specific concept at a time; in the upper grades, students could create their own drafts much as the teacher candidates did, with appropriate scaffolding and instruction in the C-E-R elements.

Day Two

Evaluating C-E-Rs

My teacher candidates arrived for our next class each having prepared a C-E-R from their magnet investigations. Many were concerned about whether they “had done it right,” but they had clearly invested substantial effort and enthusiastically delved into the scientific content about magnets because they were inherently interested in understanding and explaining their prior evidence. As I listened to their conversations, I already noticed an increase in their content knowledge through the construction of their C-E-Rs, which provided additional formative assessment data. While I had assured them that these were still drafts, many questioned whether they were missing information or whether their reasoning was clear, which was the perfect launch for our next steps: finalizing our C-E-Rs and consolidating our understandings about magnetism.

To help them frame their thinking about C-E-Rs and to prepare them for sharing and evaluating their own, I provided candidates with several elementary student C-E-Rs to review. I asked them, in pairs, to identify the claim, evidence, and reasoning in each and to score them using a simple rubric. Through these discussions, they refined and expanded their understanding of the C-E-R format and considered critical aspects of the process, including students’ grade-level expectations and where within a unit a C-E-R may have been written.

Revising C-E-Rs

At this point, my teacher candidates were ready to return to their own C-E-Rs. I grouped them based on which of the four magnet claims they had chosen from our initial list so that all members of a group had the same claim. For example, all candidates who created C-E-Rs for the original claim “Magnetic force will still work through other substances…” were grouped together. They each shared their C-E-Rs and then had 30 minutes to create a “master” C-E-R on chart paper that reflected their collective thinking. Through this process, they once again revised their claims for scientific accuracy, compared and compiled their evidence for consistency, and synthesized their reasoning to best reflect the group’s understanding of the underlying concepts. Materials were again provided so that they could repeat any trials where there were discrepancies and retest to confirm their claims relative to their reasoning. An array of trade books and websites was available so they could conduct additional research if they felt they needed more information to strengthen their explanations. This also let candidates browse a range of trade books about magnetism that they could incorporate into their future science instruction. In an elementary classroom, teachers would select such resources strategically for students’ reading and grade levels and based on information gleaned from formative assessments, which would meaningfully integrate Common Core English Language Arts reading standards as well.

As I circulated, I was delighted to hear candidates discussing and debating. Refining and improving their C-E-Rs together confronted them with ideas they could not yet confidently assert and concepts they still wanted to understand better. For example, while one group could articulate from their investigations and subsequent research that a magnet’s structure, material, and mass, rather than its size, determined its strength, they wondered whether size would play a role between magnets made of the same material. Would a bigger magnet of the same type as a smaller one be stronger simply by containing more of the same magnetic material? They tested this new question and confirmed the idea by replicating their earlier strategy, this time isolating very specific magnets. By continuing the inquiry process and their work on improving their C-E-R, they further refined their claim and strengthened their reasoning to reflect a more complex and scientifically situated understanding (Figure 2).

This occurred in all groups, with claims evolving, evidence growing, and reasoning elaborating to more meaningfully reflect candidates’ learning of magnetism concepts. Actively constructing the C-E-Rs allowed candidates to master the content we had been exploring from the moment they discussed the preassessment probes, engaging them in the NGSS practices of analyzing and interpreting data, constructing explanations, and engaging in argument from evidence (NGSS Lead States 2013). Implemented in an elementary classroom, this process would also integrate grade-level-appropriate CCSS-ELA standards for speaking and listening (engaging in collaborative discussions) and for production and distribution of writing (responding to questions and suggestions from peers to add details and strengthen writing). The iterations of the C-E-Rs documented candidates’ evolution in thinking from the initial prompts to their current understandings.

Figure 2. Sample C-E-R anchor chart (see NSTA Connection).

Question: Does the size of a magnet determine how strong it is?

Claim: The materials that make up a magnet and its mass are more important than size in determining its strength.

Evidence: The large magnetic paddle picked up 42 paper clips that were scattered across the table. The paddle needed to be moved around slowly to pick up all of the paper clips. The smaller, silver bullet-shaped magnet also picked up 42 paper clips despite having a smaller surface area. It pulled the paper clips across the table at a faster rate, could attract the paper clips from farther away than the larger magnet, and required less movement.

Reasoning: The surface areas of the two magnets were different, yet they both picked up the same number of paper clips off the table. This was possible because of the type of material each magnet was made of. The smaller bullet magnet had a smaller surface area but a stronger magnetic force because it is made of neodymium, which has a structure that makes it more resistant to being demagnetized. The paddle had a weaker magnetic force but enough surface area to accumulate the same number of paper clips as the silver bullet. It is a ceramic magnet, which is weaker than the neodymium one. If one magnet were resized to match the other, the number of paper clips collected would have differed.
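A rough estimate suggests why material can matter more than size; this sketch assumes the standard contact pull-force approximation and typical published remanence values, not measurements of these particular magnets. The maximum pull of a magnet in full contact with a thick steel surface is commonly approximated as

F \approx \frac{B^{2}A}{2\mu_{0}}

where $B$ is the flux density at the contact face, $A$ is the contact area, and $\mu_{0}$ is the permeability of free space. Neodymium magnets typically have remanence near 1.2 to 1.4 T, while ceramic (ferrite) magnets sit near 0.2 to 0.4 T. Because force scales with the square of $B$, neodymium can deliver an order of magnitude or more force per unit of contact area, so a much smaller neodymium magnet can plausibly match a larger ceramic paddle, consistent with the group’s evidence.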

When each group had finalized its C-E-R, the whole class circulated through the “gallery” of C-E-Rs and used sticky notes to add comments and questions to each. Each group then presented its C-E-R and described its process, thus teaching the rest of the class about its magnetism concept. This final stage allowed the candidates to reflect meaningfully on the other groups’ content in addition to becoming experts on their own claims, and it provided summative assessment data reflecting the culmination of their learning. This model is appropriate for elementary students as well, in that it allows them to actively and collaboratively take ownership of constructing and presenting their knowledge.

Reflecting on the Process

Overwhelmingly, candidates were positive about the opportunity for learning and collaboration this process had provided. Wearing their teacher “hats,” they greatly appreciated the C-E-R format as a way to incorporate literacy into the science curriculum. They discussed how it provided a structure for creating an effective scientific argument that would also allow teachers to formatively and summatively assess students’ evolving understandings to plan for future instruction. From a learner’s perspective, they asserted that moving from inquiry to the C-E-R draft and then to co-creating the final C-E-Rs deepened their conceptual understandings about magnetic force in a way they could not have achieved on their own. While aspects of the process would be modified for specific grade levels and developmental stages, the overall approach and specific components of the lesson design could be directly replicated to support the implementation of the NGSS across elementary grade levels. Through these interconnected science practices, candidates constructed knowledge together in a manner that replicated the collaborative work of scientists, providing a model of instruction that elementary teachers can adapt for any science content or “bundle” of standards.

Conclusions

While magnets are alluring for both teachers and students, this focused approach created a necessary balance among engaging in scientific inquiry, mastering scientific concepts, and creating scientific arguments. Inherently a student-centered lesson, it also held these teacher candidates accountable for evidence-driven learning directly tied to existing knowledge and reasoning. The integration of the Claim-Evidence-Reasoning structure provided an overarching framework that supports the three-dimensional approach of the NGSS.


Julie Robinson (julie.robinson@und.edu) is an assistant professor of elementary STEM education at the University of North Dakota in Grand Forks.

References

Keeley, P. 2013. Uncovering student ideas in primary science, volume 1: 25 new formative assessment probes for grades K–2. Arlington, VA: NSTA Press.

Keeley, P. 2017. Uncovering preservice teachers’ ideas about magnetism and formative assessment. Science and Children 55 (1): 20–21.

Keeley, P., and R. Harrington. 2014. Uncovering student ideas in physical science, volume 2: 39 new electricity and magnetism formative assessment probes. Arlington, VA: NSTA Press.

Keeley, P., and J. Tugel. 2009. Uncovering student ideas in science, volume 4: 25 new formative assessment probes. Arlington, VA: NSTA Press.

Lee, O., E. Miller, and R. Januszyk. 2015. NGSS for all students. Arlington, VA: NSTA Press.

Martin-Hansen, L. 2002. Defining inquiry. The Science Teacher 69 (2): 34–37.

McNeill, K., and J. Krajcik. 2012. Supporting grade 5–8 students in constructing explanations in science: The claim, evidence, and reasoning framework for talk and writing. Upper Saddle River, NJ: Pearson.

National Governors Association Center for Best Practices, Council of Chief State School Officers (NGAC and CCSSO). 2010. Common Core State Standards. Washington, DC: NGAC and CCSSO. 

NGSS Lead States. 2013. Next Generation Science Standards: For states, by states. Washington, DC: National Academies Press. www.nextgenscience.org/next-generation-science-standards.
