By Peggy Ashbrook
Posted on 2020-03-09
A guest post by Cindy Hoisington, an early childhood science educator and researcher at Education Development Center Inc. in Waltham, MA; Regan Vidiksis, a researcher at Education Development Center with a focus on STEM teaching and learning in early education environments; and Sarah Nixon Gerard, an education researcher at SRI Education with a focus on early learning. Welcome Cindy, Regan, and Sarah!
Chances are, since you are reading this blog, you know how important early science experiences are for children’s future learning and achievement. You know they promote children’s critical thinking, collaboration, communication, and creativity; nurture children’s interests in science; and fuel their developing science identities (Center for Childhood Creativity, 2018). But are you sometimes challenged to figure out what children, especially children between the ages of 3 and 8, are actually gaining from the science experiences you provide in your early childhood (EC) setting? Assessing young children’s learning in science can be a complex process, especially in physical science—a domain that many teachers are less familiar with than life science.
The Cat in the Hat Study
As part of a team of Early Childhood researchers at Education Development Center (EDC) and SRI Education, we had the opportunity to take a deep dive into EC science assessment during a study of physical science and engineering resources associated with the PBS KIDS multi-platform media property, The Cat in the Hat Knows a Lot About That!™ (Cat in the Hat). PBS, in partnership with the Corporation for Public Broadcasting (CPB), developed the resources (including videos, digital games, and offline activities) under the 2015–2020 Ready To Learn Initiative, which is funded through the U.S. Department of Education. Our study (Grindal et al., 2019) was designed to explore how providing families with access to Cat in the Hat videos and digital games, along with hands-on activities, might support 4- and 5-year-old children’s learning in the areas of physical science and engineering. We think that what we learned as we developed the Cat in the Hat assessment tasks can help you think about how and why you assess children’s science learning in your EC setting.
The Cat in the Hat Assessment Tasks
We developed three individual assessment tasks—referred to here as Bridges, Slides, and Sorting, but more formally called “The Hands-On Preschool Assessments of Physical Science and Engineering”—to measure children’s learning in relation to three distinct sets of Cat in the Hat resources. Since all of the Cat in the Hat resources were developed to align with the Next Generation Science Standards (NGSS Lead States, 2013), we first analyzed the resources. We wanted to ensure that the assessment tasks would be aligned with the same NGSS Disciplinary Core Ideas (DCIs) and Science and Engineering Practices (SEPs) emphasized in the videos, games, and activities that children would engage with during the study.
All three types of resources included in the study emphasize physical science concepts related to the DCIs PS1.A Different types of matter exist and Different properties are suited to different purposes and PS2.A Pushes and pulls have different strengths and directions, as well as a range of NGSS SEPs. The NGSS SEPs are the activities that scientists and engineers use as they do their work and that all children can engage in at developmentally appropriate levels. Our tasks emphasized Planning and Carrying Out Investigations, Using Mathematics and Computational Thinking, Analyzing and Interpreting Data, and Constructing Explanations and Designing Solutions. See Table 1 for a brief description of the assessment tasks and the DCIs and SEPs associated with each one. (To access a more complete description of each task, see the link at the end of this post.)
Table 1: Assessment Task Alignment to NGSS
The study included over 450 children and families in five locations across the United States. We found that access to the Cat in the Hat resources over an eight-week period substantively improved children’s understanding of the physical science and engineering concepts targeted by the resources.
Effect sizes on the four assessments ranged from small to large (0.11 to 0.40). You can view a summary of the study report here.
In this study we used performance-based assessment to measure children’s science knowledge and skills in order to test the effectiveness of the Cat in the Hat resources. Performance-based assessment can also help educators uncover children’s knowledge and skills at different points in time and in a way that enables them to plan relevant and responsive curriculum. What we learned in the Cat in the Hat study may be helpful to you in developing assessments in physical science for the preschool children at your setting.
1. Start with the Next Generation Science Standards (NGSS)
It is not by chance that the Cat in the Hat resources and our assessment both align with the NGSS. These standards represent the most current and comprehensive science standards available. They make it clear that, in order to learn science, children need many opportunities to DO science and to communicate what they are doing, noticing, and thinking about with interested adults. Children need teachers who facilitate their direct explorations, draw out their emerging science ideas (whether or not they are scientifically correct), and use those ideas as launching pads for further exploration. The NGSS also emphasize the close relationships among the STEM disciplines (science, technology, engineering, and mathematics). If you teach in PreK-Grade 2, the NGSS can help you identify the important science “big ideas” appropriate for your children to be working toward. Although it is important not to use performance expectations beyond children’s current grade levels, all young children are developing an understanding of physical science concepts from an early age. For example, Structure and Properties of Matter does not come up in the NGSS until Grade 2, but all teachers know that children are building their understanding of properties of objects (color, size, shape) and materials (texture, hardness, flexibility) from a very young age. PreK teachers can view state or local standards aligned to NGSS or use other resources that make NGSS connections such as the NSTA Position Statement on Early Childhood Science Education, the NSTA Position Statement on Elementary School Science, or the “Early Years” column in NSTA’s Science and Children journal. What the NGSS don’t tell you is exactly what to teach.
2. Choose a Topic of Study
During our study, families and children engaged with Cat in the Hat resources—including videos, interactive digital games, and hands-on activities—centered on specific topics. When you engage children in topics of study (for example, light, sound, structures, bridges, ramps, water) in your EC setting over a period of weeks, they have opportunities to gain deeper conceptual understanding of relevant concepts and practice doing science. In a study of bridges, for example, you can support children as they ask investigable questions (“What building materials make the strongest, most stable bridge?”); plan and carry out investigations (such as building bridges out of a variety of blocks and other materials and testing them by adding toy cars and other vehicles); and construct evidence-based explanations (“Heavy, hard blocks work best at the bottom of a bridge because they are strong enough to hold the bridge up.”). Questions to consider when choosing topics of study include: Are there important science concepts connected to the topic and are they developmentally appropriate? Does the topic provide many opportunities for direct investigation? Can children explore this topic from a variety of perspectives? Does the topic connect to children’s everyday lives and experiences? Is the topic interesting and engaging to children and teachers? (Worth, 2010).
The topic of Bridges, for example, meets all of these criteria in the following ways:
The topic of Ramps also meets these criteria. It connects to physical science core ideas related to Structure and Properties of Matter and Forces and Motion. Children experience how objects roll, slide, or stop on ramps with different inclines and textures and engineer ramp systems. As an added bonus, physical science topics—including Bridges and Ramps—lend themselves well to engineering challenges. Questions such as, “How can you make a bridge as long as your arm that will hold six cars?” or “How can you make a ramp system that will get a ball into a bucket on the other side of the room?” challenge children to apply their learning about properties of matter and pushes and pulls to develop a solution to a problem.
3. Plan Learning Goals and Connected Activities
As we developed assessment tasks for the Cat in the Hat study, we had to come up with indicators—what we expected children to do in each task that would represent an understanding of the relevant NGSS core ideas and practices. For example, for the Slides task, we had to figure out how 4- to 5-year-old children might show and talk about their early understanding of the NGSS DCIs PS1.A Different properties of objects and materials make them suitable for different purposes and PS2.A Pushes and pulls can have different strengths and directions. We also had to decide how we thought children might demonstrate their ability to plan and carry out an investigation. We decided we would ask children to identify differences in how the slides felt to the touch (smooth, sticky, and rough); to predict which slide a toy figure would move down the fastest (the smooth slide); and to do a test to figure out which slide was “fastest” (test the toy figure on each slide).
Once you have decided on a topic of study that incorporates big science ideas, it is time to think about your specific learning goals, the learning activities you will provide to address them, and how you will assess children’s learning. It can be tempting to focus your goals and assessments on what children know. However, it is critical to incorporate goals and assessment tasks that enable children to show what they can do—how they go about looking for answers to questions and solutions to problems. With that in mind, sometimes teachers are tempted to use NGSS performance expectations (for instance, Kindergarten: K-PS2-1 Plan and conduct an investigation to compare the effects of different strengths or different directions of pushes and pulls on the motion of an object) as both the learning goal and the assessment indicator for single activities, but this represents a misunderstanding. The performance expectations tell you what you can expect children to know and be able to do by the end of a grade or grade band when they have reached mastery. They don’t elucidate the complex and often subtle progression of understanding and skills that occurs before children reach mastery. In the first activity of a Ramps study, for example, you might choose a goal such as “children will notice that changing the incline of the ramp changes how fast a ball rolls down it,” and you might expect them to try rolling balls on different ramps and to notice which ones go faster and farther. As the unit progresses, and children have more experience with investigating ramps, your goals will change to reflect their growing understanding of forces and motion and their use of NGSS Practices.
4. Use Performance-based Assessments
Once we had aligned the Cat in the Hat resources and the associated assessment tasks, we began thinking about what type of assessment would be most effective. We did a review of available EC science assessments and decided that a performance-based assessment would best meet our needs. This type of assessment is more developmentally appropriate than other types because it enables children to demonstrate what they know and what they can do in the context of play or exploration. An educational focus on 21st-century science skills favors performance-based tasks because they provide opportunities for children to apply and demonstrate critical thinking, problem solving, and innovation in action (Fadel et al., 2007). Other currently available science assessments for young children tend to prioritize content over practice, which can be problematic because knowing science concepts and knowing how to investigate science phenomena are inextricably linked. A benefit to using performance-based assessments at your setting is that they can be administered as part of the curriculum rather than separate from it. This is more developmentally appropriate for young children, who may be anxious in a test-like situation.
During a unit on ramps, for example, you could set up an activity in which children are asked to test a set of objects on a ramp, sort them into groups based on whether they roll or slide, and record their results on a chart. As children rotate through the activity, you or another adult can observe and record what they do and say. This allows you to collect data about children’s emerging understanding of how an object’s shape and other properties affect its motion, as well as their use of science practices such as investigating and recording observations.
Performance-based assessments work particularly well to assess children’s learning in an ongoing way because they can be set up fairly easily at different points during a topic of study.
5. Construct Assessments that Work for All Children
As we developed the assessment tasks, we aimed to meet three additional criteria. We wanted to minimize the language load, measure children’s learning at different levels of mastery, and break tasks down to put the focus on NGSS Practices and to uncover increments of learning.
Minimize the Language Load
Minimizing how language-dependent the assessment was meant designing prompts that would enable children to demonstrate, point, and use body language as well as oral language to communicate. For example, in one part of the Sorting task, children are asked to show how to fix an incorrect sort. Instead of having to explain their response, they are asked to point to a picture card. In EC classrooms, there is sometimes a tendency to confuse language skills with cognitive skills and to focus on oral language as the primary way for children to communicate what they know. Performance-based assessments level the playing field somewhat for less-verbal children and Dual Language Learners, but it is important to create intentional opportunities for children to demonstrate knowledge nonverbally.
Measure Learning at Different Levels of Mastery
Measuring children’s knowledge and skills at different levels meant incorporating follow-up prompts when children either did not respond or gave an incomplete response to the question. For example, in one part of the Sorting task, children were asked to sort objects into three containers by shape. If the child did not respond, the assessor would place one round, one square, and one triangle-shaped object into each of the correct containers and repeat the prompt. In the classroom, you can make tasks more or less complex as needed. You can increase the complexity of the Ramps assessment task described above, for example, by including objects that both roll and slide, depending on how they are placed on the ramp, or by setting up the ramp so that sliding objects stop on the ramp. This would enable you to collect additional data about children’s understanding that some shapes roll or slide depending on how they are placed on the ramp, or that changing the incline can change an object’s motion. You could simplify the task by fixing the ramp in place or removing the recording component.
As we began to develop the assessment used in our Cat in the Hat study, we also decided that we would use objects, materials, and storylines that were similar to, but not the same as, the ones used in the Cat in the Hat resources. We wanted to avoid giving children the opportunity to merely copy what they had done in the games or observed in the videos, but instead to apply what they had learned to a new situation. Consider doing this in the classroom when you think children are ready. If children have mastered sorting, for example, consider asking them to sort a different set of objects than the ones they typically use. You might introduce buttons for sorting by color, shape, or size, or a set of play, art, and eating objects to sort by use. If children have difficulty sorting the new objects, you can infer they need more practice with familiar ones.
Break Tasks Down to Put the Focus on NGSS Practices and Uncover Increments of Learning
In the Cat in the Hat study’s assessment tasks, we broke tasks down so we would be sure to uncover what children were able to do, even if they didn’t know the content. For example, in the Slides task, children were asked to (1) predict which slide a toy character would slide down the fastest, (2) explain why they thought so, and (3) state or show how they might test which slide the character would slide down the fastest. After testing, children were asked how their prediction compared to what happened (“Is that what you thought would happen?”). Breaking down the components of the NGSS Practice planning and carrying out an investigation enabled us to discern what aspects of the practice children were able to engage with. Although a performance-based assessment in the classroom will likely not be as structured as the study’s assessment, you can employ a similar strategy by probing children before and after they investigate. As a unit on Bridges progresses, for example, you might ask children to make and explain their predictions before they investigate. Their responses to these questions can give you a lot of information about what they understand about the properties of materials.
6. Go Forth and Assess!
The assessment tasks we developed were useful to us because they enabled children to talk about and show what they understood and were able to do in relation to the NGSS DCIs and SEPs emphasized in the Cat in the Hat resources. They allowed us to determine that the Cat in the Hat videos, games, and offline activities were effective in supporting children’s understanding of physical science core ideas and their ability to do science. The process of developing the tasks also extended our own understanding of the variety of ways in which children express their learning and the importance of creating intentional opportunities for children to demonstrate, as well as talk about, what they know and can do. If our assessment tasks fit topics children are exploring at your EC setting, consider using them. Or better yet, use them as models for developing your own performance-based assessments! Remember that our assessments are not meant to be used summatively to evaluate children’s learning in EC settings. Rather, they are meant to be used formatively, to give you the information you need to plan curriculum that is relevant and responsive to what children already know and can do. There is a brief description of each of the tasks in the section that follows and a complete description of our assessments can be found here: http://cct.edc.org/rtl/data-collection-tools.
Hands-On Preschool Assessments of Physical Science and Engineering
Bridges (Length, Strength, and Stability): designed to assess a child’s understanding of how the properties of objects (such as size and shape) and materials (for instance, hardness and flexibility) make them suitable for building a bridge that can hold weight. Children were provided with a group of objects of different lengths and strengths, including an 8” piece of aluminum foil, 8” composition notebook cover (made of cardboard), 8” piece of laminated paper, 6” ruler, 6” piece of cardstock, 6” lasagna noodle, and a small toy car with a Duplo character driver. They were asked to investigate the bridge-building materials and figure out the most suitable object for building a bridge that could support the weight of the car and driver.
Slides (Surfaces and Friction): designed to assess a child’s understanding of how the properties of materials and forces—friction in particular—influence movement on a slide. Children were provided with three 12” wooden ramps, each with the same incline but with a different texture of material stapled on them, including rubber, felt, and steel wool, and a toy Duplo character. They were asked to describe the three different slide textures, and then to predict and justify which of these three slides would allow the toy character to slide down the fastest.
Sorting (Colors, Shapes, and Uses): designed to assess a child’s understanding of how different objects can be described and categorized based on their observable properties and common uses. Children were provided with 21 common childhood objects that incorporated a variety of different colors, shapes, and uses, including a square blue napkin, a round red plate, a plastic orange, a yellow felt triangle, a red triangle block, and a blue Cookie Monster character. Children were asked to identify similarities and differences among the color of the objects, sort the objects on the basis of shape (with picture cues), complete a sort based on use, and fix a sort based on color.
Ashbrook, P. The Early Years column collection (PreK-Grade 2). Science and Children. National Science Teaching Association (NSTA).
Center for Childhood Creativity. (2018). The roots of STEM success: Changing early learning experiences to build lifelong thinking skills. https://centerforchildhoodcreativity.org/roots-stem-success/
Fadel, C., Honey, M., & Pasnik, S. (2007). Assessment in the age of innovation. Education Week, 26(38), 34-40.
Grindal, T., Silander, M., Gerard, S., Maxon, T., Garcia, E., Hupert, N., Vahey, P., & Pasnik, S. (2019). Early science and engineering: The impact of The Cat in the Hat Knows a Lot About That! on learning. Education Development Center, Inc., & SRI International.
NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. The National Academies Press.
National Science Teaching Association (NSTA). (2014). Position statement: Early childhood science education.
National Science Teaching Association (NSTA). (2018). Position statement: Elementary school science.
Worth, K. (2010). Science in early childhood classrooms: Content and process. Early Childhood Research & Practice (ECRP), 12(2).