Journal of College Science Teaching—March/April 2020 (Volume 49, Issue 4)
By Katherine McCance, Timothy Weston, and Emily Niemeyer
The President’s Council of Advisors on Science and Technology (PCAST, 2012) highlights the importance of improving undergraduate STEM education, particularly within the first two years of college, recommending that faculty adopt evidence-based teaching methods that actively engage students. Active learning can take many forms in the science curriculum, and a growing body of evidence supports the benefits of student-centered pedagogies over lecture-based modes of instruction (PCAST, 2012). When students are actively engaged in the classroom, they remain in their programs of study at higher rates (Braxton, Jones, Hirschy, & Hartley, 2008) and their performance in science courses improves (Freeman et al., 2014).
However, a recent study showed that most science courses at North American universities are still taught using lecture-based methods (Stains et al., 2018). When instructors revise their courses to include active learning, the processes of evaluation, feedback, and reflection on instructional practices become critically important to their success (Henderson, Beach, & Finkelstein, 2011). Student and faculty surveys are the most common way to evaluate instructional methods and provide instructors with feedback, but they may suffer from issues of validity and bias (Williams, Walter, Henderson, & Beach, 2015). In contrast, observational techniques offer more objective and nuanced documentation of instructional practices in the classroom and serve as potentially powerful professional development resources for faculty.
We describe here how classroom observations can characterize instructional practices in introductory science courses across several disciplines (e.g., biology, chemistry, and computer science). Specifically, we used the Teaching Dimensions Observation Protocol (TDOP) to study differences in student and faculty behaviors and dialogues within undergraduate science classrooms. The TDOP was selected to support a pedagogical reform initiative at our institution because it contains mostly “low-inference” categories (Hora & Ferrare, 2013) that provide understandable information for faculty professional development, and because its readily available training materials aided implementation. As our science faculty revised their courses to incorporate active learning, we provided them with formative feedback reports using both individual and aggregate observational data collected with the TDOP (an example report is provided at ). Although observational protocols such as the TDOP and the Classroom Observation Protocol for Undergraduate STEM (COPUS) have been used to characterize various aspects of classroom instruction (Hora & Ferrare, 2014; Stains et al., 2018), previous applications have been limited to large-scale studies (> 50 instructors) across multiple institutions. Here, we report on the ability of the TDOP to detect differences in student and faculty behaviors and dialogues using a much smaller sample—six courses taught by five faculty members in three disciplines—at a single undergraduate institution. The smaller size and scope of our study have important implications for expanding the use of classroom observations as a professional development tool for college science faculty.
An overview of participating courses is provided online (). All courses were introductory level, similar in size (< 30 students), designed for science majors, and had comparable meeting schedules. Courses were classified as revised if the instructor had (1) participated in a 2.5-day pedagogical workshop the previous summer, (2) created a course module with active learning components, and (3) integrated the active learning module into the introductory course that semester. If an instructor did not meet these criteria, the course was designated as unrevised.
Observers participated in all suggested training outlined in the TDOP user’s guide (Wisconsin Center for Education Research, 2014), and inter-rater reliability was determined prior to data collection. A total of 41 observations were conducted for the six introductory courses. Each course was observed 5 to 13 times during a given semester to ensure that we were capturing classroom behaviors indicative of the course as a whole. Observed behaviors and dialogue were recorded once per two-minute interval using predefined codes on the online TDOP platform (Wisconsin Center for Education Research, 2010). Multiple codes were recorded if different actions or dialogue occurred within the same interval.
Observation data were downloaded from the TDOP website and combined for individual courses within a spreadsheet. For a particular TDOP code, a “0” indicated that a behavior was not observed during a two-minute course interval, whereas a “1” indicated its presence. For each course, the proportion of two-minute intervals in which a particular code was observed across all class periods was calculated and reported as a percentage. Results were compared among individual courses in the unrevised and revised categories (data tables are available at ). Courses within a particular category (revised or unrevised) were then aggregated, and the same proportions were calculated across all class periods within each category. Additional details regarding participating courses, data collection, and statistical analyses are available at .
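To make this calculation concrete, the following minimal sketch (in Python, used here purely for illustration; the analysis itself was conducted in a spreadsheet, as noted above) converts hypothetical per-interval observations into the 0/1 form described above and computes the percentage of intervals in which each code appears. The code abbreviations and observations are hypothetical, not drawn from our data.

```python
# Minimal sketch of the interval-based tally described above.
# Each entry is the set of TDOP codes recorded during one two-minute
# interval, pooled across all observed class periods for one course.
intervals = [
    {"L", "PPT"},         # lecturing while using PowerPoint
    {"L", "PPT", "RQ"},   # ...plus a rhetorical question
    {"SGW", "PS"},        # small group work with problem solving
    {"SGW", "PS", "SQ"},  # ...plus a student question
]

all_codes = sorted(set().union(*intervals))

# 0/1 presence matrix: one row per two-minute interval, one column per code.
presence = [[int(code in interval) for code in all_codes] for interval in intervals]

# Percentage of intervals in which each code was observed
# (column sums of the 0/1 matrix, scaled to percent).
n_intervals = len(intervals)
percentages = {
    code: 100 * sum(row[j] for row in presence) / n_intervals
    for j, code in enumerate(all_codes)
}

for code, pct in sorted(percentages.items(), key=lambda kv: -kv[1]):
    print(f"{code}: {pct:.1f}% of intervals")
```

Aggregating by course category amounts to pooling the interval rows from every class period of every course in that category before computing the same column percentages.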
Several differences in instructor-based behaviors and dialogue were observed between unrevised and revised introductory science classrooms (Table 1). For unrevised courses, lecturing with premade visuals—such as slides, transparencies, or models—was the dominant teaching method observed, occurring in 55.6% of two-minute intervals, and PowerPoint was used during a similar majority of class time (50.8%). In courses revised to incorporate active learning, however, the use of PowerPoint (26.0%) and lecturing with premade visuals (23.6%) during class was significantly lower. Standard lecturing, in which the instructor spoke to students with no other instructional behaviors (such as actively writing or using visuals), was also observed significantly less often in revised courses (13.5% for unrevised courses, 3.3% for revised). Lecturing with handwritten visuals, which involves the instructor speaking while writing on a whiteboard, chalkboard, or document camera, occurred at similar frequencies in unrevised and revised courses. However, instructors in revised courses wrote on the chalkboard significantly more (26.7%) than instructors in unrevised courses (15.6%). Assessment, defined as the instructor collecting data from students in the form of a test, quiz, or clicker question, was also more frequently employed in revised (21.7%) than in unrevised (6.4%) courses. Additionally, clickers, a common form of instructional technology in active learning classrooms, were used significantly more often in revised (9.8%) than unrevised courses (3.1%). These results are not surprising because the courses in this study that were revised to incorporate active learning primarily used a flipped classroom model in which students watched videos or completed assigned reading on lecture topics prior to coming to class. Students were then regularly assessed on their at-home learning using individual or group quizzes and clicker-based questions.
Table 1. Percentage of all two-minute intervals that an instructor-centered code was observed across all class periods for unrevised and revised introductory science courses.
Differences in student-focused behaviors and dialogue were also observed between unrevised and revised courses (Table 2). In courses revised to incorporate active learning, the most frequently observed teaching method was student-centered: small group work occurred in 63.5% of all two-minute intervals (compared to 2.5% for unrevised courses). Correspondingly, peer interactions (61.8%) and problem solving (70.9%) occurred significantly more often in revised than in unrevised courses. Particular student behaviors are indicative of active learning environments, as Hora (2015) demonstrated by mapping TDOP codes onto elements of the differentiated overt learning activities (DOLA) framework (Chi & Wylie, 2014). For example, the TDOP problem-solving code is an “active” modality, while peer interaction is an “interactive” modality. The prevalence of these two codes suggests that classroom activities in revised courses engaged students in active and interactive modes more frequently than in unrevised courses.
Table 2. Percentage of all two-minute intervals that a student-centered code was observed across all class periods for unrevised and revised introductory science courses.
Hora (2015) also mapped the student response TDOP code as an “active” and the student question code as a “constructive” modality within the DOLA framework. Student questions—in which students seek new information—were more frequently observed in unrevised (27.0%) than in revised courses (11.7%). Additionally, student responses were observed significantly more often in unrevised (32.7%) than in revised courses (21.2%). When students in revised courses are placed in small group environments and engage more frequently in student–student interactions, they may pose fewer questions to instructors because their peers are answering those questions. Students in unrevised courses, however, were much more likely to be engaged in individual deskwork (29.7%) than in small group work (2.5%), which suggests that they interacted with their instructor more frequently than with their peers within that structured classroom environment. While students in unrevised courses were engaged in active and constructive DOLA modalities (Chi & Wylie, 2014) for almost one-third of the classroom intervals observed—highlighting the complexity of the classroom environment in lecture-based courses—the TDOP passive listening code was still recorded in nearly every observed interval (94.0%) in unrevised courses. Comparatively, students in revised courses were engaged in active and interactive DOLA modalities (Chi & Wylie, 2014) for more than twice as much class time (greater than two-thirds of all intervals) as students in unrevised courses. Correspondingly, passive listening was observed much less frequently in revised (48.7%) than in unrevised courses.
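These modality comparisons reduce to summing code percentages within each DOLA category. The minimal sketch below illustrates that bookkeeping, using only the example mappings named in this article (the full TDOP-to-DOLA crosswalk in Hora, 2015, is not reproduced here) together with the unrevised-course percentages reported above.

```python
# Partial, illustrative mapping of TDOP codes onto DOLA modalities,
# limited to the examples discussed in the text (Hora, 2015).
DOLA_MODALITY = {
    "problem solving": "active",
    "student response": "active",
    "peer interaction": "interactive",
    "student question": "constructive",
    "passive listening": "passive",
}

def modality_totals(code_percentages):
    """Sum observed-interval percentages within each DOLA modality.

    Codes can co-occur within a two-minute interval, so these sums
    can exceed 100% and are best read as rough engagement measures.
    """
    totals = {}
    for code, pct in code_percentages.items():
        modality = DOLA_MODALITY.get(code)
        if modality is not None:
            totals[modality] = totals.get(modality, 0.0) + pct
    return totals

# Unrevised-course percentages reported in this article.
unrevised = {
    "student question": 27.0,
    "student response": 32.7,
    "passive listening": 94.0,
}
print(modality_totals(unrevised))
# -> {'constructive': 27.0, 'active': 32.7, 'passive': 94.0}
```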
Through hierarchical cluster analysis, student and faculty behaviors and dialogues were grouped based on similarities in the proportion of two-minute intervals in which TDOP codes were observed for both unrevised and revised introductory science courses (dendrogram plots are available at ). For unrevised courses, a four-cluster solution was determined, with the largest group containing 17 TDOP codes that were seldom observed (average = 6.5% of two-minute intervals), a cluster with nine codes that occurred about one-third of the time (28.4%), and two groups containing single behaviors observed the majority of the time: lecturing with premade visuals (55.6%) and passive listening (94.0%). Again, student-centered behaviors and dialogues—such as students responding to instructor questions, solving problems, and posing questions to their instructor—regularly occurred in unrevised courses, during about one-third of observed class time. These unrevised courses are best classified as “interactive lecture” (lecture interspersed with student-centered pedagogies) based on instructional categories developed in a large-scale study of science classrooms using the similar COPUS assessment tool (Stains et al., 2018).
Revised courses also had a four-cluster solution with the largest group containing 19 infrequently observed TDOP codes (average = 4.0% of two-minute intervals), a cluster containing five behaviors that occurred about one-quarter of the time (22.7%), a group with only one code, passive listening, that was observed during about half of class time (48.7%), and a group with three behaviors that were observed during the majority of intervals (65.4%). Because the majority of class time was spent engaged in student-centered pedagogies and behaviors—small group work, peer interaction, and problem solving—these revised courses are classified as “student-centered” (Stains et al., 2018), a different instructional profile than the “interactive lecture” style of unrevised courses. According to our cluster analysis, only five TDOP codes were observed during the majority of class time in either unrevised or revised courses. Figure 1 provides a comparison of the percentage of class time spent engaged in these most commonly observed behaviors and dialogues for unrevised and revised courses. Most notably, a substantial difference in small group work was observed for revised courses. Review of the active learning modules developed by faculty who teach these revised courses confirmed that their course materials emphasized small group work (e.g., worksheets, group quizzes, and clicker questions) to create a more student-centered learning environment. In fact, small group work in revised courses was associated with more frequent peer interactions, as well as increased student problem solving. Additionally, students in revised courses spent less time in class passively listening and instructors made less frequent use of premade visuals (such as PowerPoint) when compared to unrevised courses.
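As a minimal sketch of this type of analysis, the following SciPy example clusters the revised-course percentages reported above for eight of the codes discussed in this article into four groups. The restriction to eight codes and the choice of Ward linkage are simplifications for illustration, not a reproduction of the full analysis, which included all observed TDOP codes.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Revised-course percentages reported in this article for a subset of
# TDOP codes; the published analysis clustered all observed codes.
codes = ["small group work", "peer interaction", "problem solving",
         "passive listening", "PowerPoint", "standard lecturing",
         "clicker use", "student question"]
pcts = np.array([63.5, 61.8, 70.9, 48.7, 26.0, 3.3, 9.8, 11.7])

# Hierarchically cluster codes by how often they were observed
# (one-dimensional data); Ward linkage is an assumption of this sketch.
Z = linkage(pcts.reshape(-1, 1), method="ward")
labels = fcluster(Z, t=4, criterion="maxclust")  # cut to four clusters

for cluster in sorted(set(labels)):
    members = [c for c, lab in zip(codes, labels) if lab == cluster]
    print(f"cluster mean {pcts[labels == cluster].mean():.1f}%: {members}")
```

Even on this subset, the three student-centered codes group together with a cluster mean of 65.4%, consistent with the most frequently observed group described above, while passive listening (48.7%) again forms its own cluster.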
Figure 1. Comparison of the percentage of class time engaged in the most commonly observed Teaching Dimensions Observation Protocol (TDOP) codes in this study for unrevised and revised courses.
Our results highlight the effectiveness of the TDOP observational protocol in identifying variations in student and faculty behaviors and dialogues within science courses spanning multiple disciplines. Even with a relatively small sample size, significant differences in instructional practices existed between courses that used active learning pedagogies and those that did not. We found that revised courses devoted more class time to student-centered activities (e.g., problem solving, peer interaction, and small group work) and less time to students passively listening to the instructor. Importantly, revised courses had a “student-centered” instructional profile, while unrevised courses used the more instructor-centered “interactive lecture” approach.
Our study does have limitations. Although variations in instructional behaviors were noted among the revised courses in this study, all were taught within a single discipline, chemistry, which may have led us to overestimate the amount of small group work and problem solving typical of a revised course. It is also important to note that although the TDOP characterizes classroom practices, it cannot determine whether particular pedagogies are used successfully, nor can it capture the many other dimensions of effective teaching and student learning (Hora, 2013).
In summary, observational assessments such as the TDOP can effectively detect differences in instructional practices among a small comparison group of science courses at a single undergraduate institution. Previous research using observational protocols has focused on multi-institutional studies of science classrooms at doctorate-granting universities (Stains et al., 2018). Our results therefore provide an important foundation to expand the applicability of observational protocols to characterize classroom behaviors across a greater range of institution types and sizes. Observational techniques may supplement more frequently used instruments, such as student and faculty surveys, as a multidimensional professional development tool to provide formative feedback for college science instructors implementing pedagogical changes within their courses.
Acknowledgments
This work was supported by the Howard Hughes Medical Institute through the Undergraduate Science Education Program, grant number 52007558. Support was also provided by Southwestern University’s Herbert and Kate Dishman endowment. The authors wish to acknowledge Meredith Rollins and Nicole Hewitt for their collection of the classroom observation data. Nicole Hewitt also wrote the instructor report provided in the supplementary material to this article.
Katherine McCance is a doctoral student in the Department of STEM Education at North Carolina State University in Raleigh, North Carolina. Timothy Weston is research faculty at the National Center for Women and IT (NCWIT), School of Engineering, at the University of Colorado, Boulder. Emily Niemeyer (niemeyee@southwestern.edu) is a professor in the Department of Chemistry and Biochemistry at Southwestern University in Georgetown, Texas.