Research & Teaching

STEM Faculty Institute

An Intensive Interdisciplinary Effort to Improve STEM Faculty Adoption of Evidence-Based Instructional Practices

Journal of College Science Teaching—January/February 2022 (Volume 51, Issue 3)

By Richard E. West, Jamie L. Jensen, Michael Johnson, Jennifer Nielson, Rebecca Sansom, and Geoffrey Wright

As change agents at our university, we sought to facilitate a transition within our STEM college toward more extensive use of evidence-based instructional practices (EBIPs) that promote student-centered learning. To achieve lasting change among the faculty, we took a multifaceted approach to this challenge, resulting in the creation of the STEM Faculty Institute (STEMFI). Supported by multiyear funding from the National Science Foundation, STEMFI provides instruction on EBIPs through a summer workshop, along with yearlong support in redesigning curriculum through one-to-one mentoring of each faculty participant. The first cohort of STEMFI participants has been successful, with most faculty showing significant shifts toward more student-centered practices in their teaching. Feedback on the post-workshop survey indicated that participants found the material helpful and engaging and that it encouraged a shift in their attitudes toward supporting EBIPs in their own classrooms. This article describes factors critical to the STEMFI approach and reports on the initial effectiveness observed.

 

In 2012, the President’s Council of Advisors on Science and Technology reported that the United States would need 1 million additional STEM graduates within the next decade, requiring an increase of approximately 34% annually over then-current rates. Although some progress is evident toward increasing the overall number of STEM graduates, too many students do not complete the STEM programs they have begun—particularly students from minority and diverse backgrounds. The National Research Council (2012) also reported that the research “clearly shows that research-based instructional strategies are more effective than traditional lecture in improving conceptual knowledge and attitudes about learning” (p. 3). These strategies are often referred to as student-centered learning; however, we use the term evidence-based instructional practices (EBIPs) to emphasize the strong foundation in research demonstrating their effectiveness.

Despite frequent calls for greater integration of EBIPs into higher education classrooms, the barriers to change are high. Borrego and Henderson (2014) reviewed 191 articles on instructional change published between 1995 and 2008 and identified four categories of strategies for creating effective instructional change: (a) diffusion (increasing awareness of quality strategies), (b) implementation (putting the theories into practice), (c) teacher reflection, and (d) policy change at all levels of an academic institution to encourage new teaching. At our university, we focused on these four strategies to create a multifaceted approach to the challenge, one intended to produce lasting change. This effort led to the creation of STEMFI: the STEM Faculty Institute.

Components and scheduling

Supported by multiyear funding from the National Science Foundation, STEMFI supports diffusion of EBIP strategies through a summer workshop, implementation through the revision of faculty courses, teacher reflection through focused one-to-one mentoring during the year of change, and environmental change through administrative support and policies. In this article, we describe factors critical to the STEMFI approach and then report on the initial effectiveness we have observed with the first cohort.

Administrative leadership support

Lasting instructional change requires “buy-in” from university professors and administrators. Initial support for STEMFI was built by conducting two focus group interviews with approximately 30 faculty members and administrators, who provided feedback on the proposed design and implementation. This feedback was used to further refine the curriculum materials and research protocols.

STEMFI was designed primarily for the faculty of the colleges of life science, engineering and technology, and physical and mathematical sciences. The STEMFI team provided the refined STEMFI materials to the university’s central administration for approval, as well as to each of the college deans and department heads whose units would be involved. The ensuing letters of support helped assure faculty that STEMFI was encouraged at all levels of the institution and provided documentation of STEMFI’s value that faculty could include in their tenure and promotion process. The deans helped disseminate STEMFI by having each of their department heads send personal notifications encouraging their faculty to enroll.

Administrative support and leadership for STEMFI have been critical to its success. In both STEMFI cohorts to date, there have been more applicants than spaces available. As faculty have been concerned about ways pedagogical change might affect their ability to meet promotion and tenure requirements, this support from all levels of the institution has been valuable.

Focused faculty mentoring

Mentoring has been vital to the success of the STEMFI program. Strong mentoring relationships are essential for faculty to successfully overcome personal and social barriers to student-centered teaching and implement lasting change. Researchers at the Summer Institute for Undergraduate Education in Biology (sponsored by the National Academies, Howard Hughes Medical Institute, and the University of Wisconsin-Madison) concluded from their project that feedback from mentors was necessary for supporting change (Ebert-May et al., 2011; Pfund et al., 2009).

Each STEMFI workshop participant has been assigned a mentor from the same discipline who assists the participant for one academic year with designing and implementing active learning strategies in the participant’s course. The mentors in the first cohort were members of the STEMFI team, along with four senior faculty identified as master teachers. All mentors completed an intensive 1-day training, conducted by Brigham Young University’s Center for Teaching and Learning, prior to the initial workshop.

During the last half of the workshop, mentors are paired with participants to assist with redesigning their courses and learning activities. Mentors continue their support after the workshop by providing one-on-one attention, attending the mentee’s class when an activity is piloted, and meeting with their mentee after the class to review the implementation, including celebrating what went well and strategizing about how to improve.

In addition, mentors have helped their mentees prepare a presentation for one of the monthly cohort meetings to share a new activity they have implemented. The cohort meetings enable participants to reconnect and peer-mentor one another. The general cohort meeting structure is a modified consultancy and tuning protocol (Allen & Blythe, 2004), in which the presenter shares concerns and shows a short clip of the EBIP executed in class. The cohort then discusses the issues, and the presenter concludes with a response to the feedback and final comments.

The selection, training, and role of the mentors during the second year have been improved using feedback from the first cohort. We reimagined the role of the mentor and changed the name to peer-teaching partner (PTP). We nominated exceptional faculty graduates of the first STEMFI cohort to be PTPs for the second cohort, training them in a breakout session at the second summer workshop. These mentors participated in a panel for the new cohort, ate lunch with the participants, and worked with their participants to develop activities. An important component was for the PTPs to attend the culminating workshop activity in which the participants taught a newly created activity to the group. These changes have created more space and time for the participants and the PTPs to build relationships, and both groups have been more invested in working together throughout the year.

Course redesign and summer workshop

The summer workshop was intended to address personal factors relevant to pedagogical change, which often begins with redesigning a course. The workshop has promoted learner-centered beliefs by engaging the participants as learners (Loucks-Horsley et al., 2003) and has fostered positive attitudes about EBIPs by sharing a deep understanding of the theoretical underpinnings of student-centered teaching (Prather & Brissenden, 2008). It has also improved self-efficacy by providing opportunities during the workshop and the mentored implementation phase for faculty to experiment, receive feedback, and modify their instruction. To address pedagogical dissatisfaction that “results when one recognizes the mismatch between the stated teaching beliefs, goals, instructional practices, and student learning outcomes” (Gess-Newsome et al., 2003, p. 762), the workshops have provided training for faculty in articulating student learning goals and sharing their personal classroom observation data, as well as time to reflect on the ways their actions might not match their beliefs and goals.

The workshop was also designed to enable participants to learn how to transform their teaching with EBIPs. Instruction has included introducing Bloom’s taxonomy (Anderson & Krathwohl, 2001) and the scientific teaching worksheet (Handelsman et al., 2007) to help participants write better learning outcomes to align their classroom activities with their goals for student learning. A broad range of EBIPs have been incorporated to help faculty design more effective and engaging instruction for their courses: think-pair-share (Allen & Tanner, 2002); paired verbal fluency (Llewellyn, 2013); peer instruction (Crouch & Mazur, 2001); the 5E learning cycle (Bybee, 1993); “I do, we do, you do” (Bowgren & Sever, 2010); spaced retrieval and interleaved practice (Hopkins et al., 2016; Rohrer et al., 2015); jigsaw (Doymus, 2008); one-minute papers (Nilson, 2010); and decision-based learning (Sansom et al., 2019).

The STEMFI summer workshops have consisted of 5 days of teaching, practicing, and developing a course. Table 1 provides a brief outline of the schedule used for the first cohort. In the workshop, the participants have explored evidence supporting student-centered teaching with readings and discussions each day. Every concept or activity has been taught and modeled using EBIPs, followed by participant reflection, so participants have experienced EBIPs while learning the workshop content. Each morning has begun with an EBIP strategy requiring participants to work in small groups to summarize the readings from the night before (e.g., four-sentence summaries, jigsaw).

The final day of the workshop explored assessment strategies, teaching the participants about backward design in education research (Jensen et al., 2017) for assessing their teaching scientifically. Formative assessment strategies such as using classroom response systems (e.g., ABCD cards, clickers; Bruff, 2009), writing exam questions (Tarrant & Ware, 2008), and implementing exit tickets (Dixson & Worrell, 2016) were demonstrated and emphasized as crucial for learning how to gauge student understanding during a unit and modify instruction accordingly.

Summative assessment strategies to help faculty measure student learning at the end of a unit were briefly addressed, including peer social assessment (Panadero & Jonsson, 2013), calibrated peer review (Robinson, 2001), and assessment blueprints (Fowell et al., 1999)—strategies that help faculty visualize the distribution of learning outcomes and levels of thinking (Bloom’s taxonomy) across an assessment. The three-dimensional learning assessment protocol can be used for both formative and summative assessments to ensure that items address core ideas in addition to science and engineering practices and crosscutting concepts (Laverty et al., 2016).
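To illustrate the blueprint idea for readers unfamiliar with it, here is a minimal sketch (our illustration with invented item data, not a tool from Fowell et al., 1999) that cross-tabulates exam items by learning outcome and Bloom’s-taxonomy level so over- and under-sampled outcomes become visible.

```python
from collections import defaultdict

# Invented exam items, each tagged with a learning outcome and a Bloom level.
items = [
    ("stoichiometry", "apply"), ("stoichiometry", "apply"),
    ("bonding", "remember"), ("bonding", "analyze"),
    ("equilibrium", "understand"),
]

# Cross-tabulate: outcome -> Bloom level -> item count.
blueprint = defaultdict(lambda: defaultdict(int))
for outcome, level in items:
    blueprint[outcome][level] += 1

for outcome, levels in sorted(blueprint.items()):
    print(f"{outcome}: {dict(levels)}")
# bonding: {'remember': 1, 'analyze': 1}
# equilibrium: {'understand': 1}
# stoichiometry: {'apply': 2}
# The tabulation reveals, for example, that equilibrium is assessed only at
# the 'understand' level and no item targets 'evaluate' or 'create'.
```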

We improved the workshop structure and scope during the second year. We began the 5-day workshop on a Wednesday so that a weekend separated the first 3 days from the last 2. We reduced the number of activities and EBIPs demonstrated and incorporated opportunities for participants to demonstrate their teaching each day of the workshop, culminating in a realistic teaching experience on the last day. This restructuring allowed participants more time to process their own learning and manage the intense experience. Combined with the changes we made in mentoring, our design-based research cycle is helping us iteratively improve the STEMFI experience for faculty.

Data-driven instruction and practice

Part of the STEMFI approach to supporting change among faculty has been helping them better understand their teaching practices and beliefs and recognize, through design-based research, how the STEMFI intervention has affected these practices and beliefs. To provide participants with formative feedback and gather summative information about the program’s success, we utilized the Classroom Observation Protocol for Undergraduate STEM (COPUS; Smith et al., 2013). In this protocol, trained observers record behaviors according to a detailed rubric every 2 minutes throughout the class period, assigning codes for both student and instructor behaviors. Results of the COPUS instrument give the instructor (and the observer) a quantitative picture of time spent in three main categories: presenting/receiving (a more teacher-centered approach), guiding/having students work (a more student-centered approach), and other (e.g., administrative tasks, waiting, and non-course-related activities).
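To make the summary arithmetic concrete, the hypothetical sketch below tallies one instructor code per 2-minute interval into the three broad categories. The code-to-category mapping shown is a simplified assumption for illustration; the full COPUS rubric (Smith et al., 2013) defines many more codes, and this sketch is not part of the COPUS or COPUSprofiles.org tooling.

```python
from collections import Counter

# Simplified, assumed mapping of a few COPUS instructor codes to the three
# broad categories described above; the published rubric defines more codes.
CATEGORY = {
    "Lec": "presenting",  # lecturing
    "RtW": "presenting",  # real-time writing (e.g., on the board)
    "PQ": "guiding",      # posing a question to students
    "FUp": "guiding",     # following up on student work
    "MG": "guiding",      # moving through the room, guiding group work
    "Adm": "other",       # administrative tasks
    "W": "other",         # waiting
}

def copus_summary(interval_codes):
    """Return the fraction of 2-minute intervals spent in each category."""
    counts = Counter(CATEGORY.get(code, "other") for code in interval_codes)
    total = len(interval_codes)
    return {cat: counts[cat] / total for cat in ("presenting", "guiding", "other")}

# A hypothetical 50-minute class observed as 25 two-minute intervals.
codes = ["Lec"] * 15 + ["PQ", "MG", "MG", "FUp"] + ["Lec"] * 4 + ["Adm", "W"]
print(copus_summary(codes))
# {'presenting': 0.76, 'guiding': 0.16, 'other': 0.08}
```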

We observed faculty participants three or four times before they participated in the summer workshop to gather baseline data on their typical teaching approaches. We showed these data to participants on Day 5 of the workshop to help them reflect on their teaching behaviors and recognize areas for improvement. We then observed participants four additional times during the semester of reform, uploading successive data to COPUSprofiles.org (Stains et al., 2018) and analyzing them for changes in overall teaching behavior. These data were provided to faculty participants and included in a letter of support for rank advancement, as well as used to validate the effectiveness of the STEMFI program.

To ensure that the workshop was responsive to participant needs, in addition to considering the COPUS data, we interviewed every faculty member prior to the workshop to discuss factors that influenced their instructional decisions (see Figure 1 for interview questions). Broadly, these included personal factors (e.g., attitudes, beliefs, self-efficacy), social factors (e.g., pressure from students, colleagues, administration), and environmental factors (e.g., time, resources, student characteristics). Based on the responses, we created a participant profile for each faculty member that included a 2-page summary of the individual’s personal, social, and environmental factors. The purpose of this profile was to create a single narrative for each faculty member about their challenges, beliefs, and goals. For example, some participants might have positive attitudes about EBIPs but fear low student ratings that would influence their tenure or promotion. We also analyzed the interviews to suggest which types of interventions would be most helpful to the interviewees, and we included this information in the profile so the faculty members’ mentors or PTPs could understand more accurately how to support their growth. Each participant received a copy of their profile and was asked to review it and correct it if needed for accuracy.

As explained, the primary purpose of these profiles was to help us prepare the workshop to meet participants’ needs and to help mentors understand how to support the faculty members. In addition, we conducted closing interviews with each faculty member, asking questions similar to those from the pre-workshop interview. Comparing these closing interviews with the initial interviews and profiles helped us see patterns in faculty growth through the course of the STEMFI intervention.

Findings on STEMFI success

Results of workshop surveys

Participants completed a survey at the end of the weeklong workshop to provide affective data about their impressions and changes in beliefs, attitudes, and confidence. Figure 2 shows responses to a selected set of questions concerning the overall quality of the workshop. In the second part of the survey, participants reflected on their attitudinal changes. To guide them in doing this, we asked them to compare their impressions prior to the STEMFI workshop with their impressions after having completed it. As these responses were collected simultaneously, reports of prior attitudes may be influenced by current attitudes; however, we consider this adequate for allowing participants to express the change they felt they experienced. Figure 3 shows participants’ reported shifts in motivation, confidence, and support, and Figure 4 shows their shifts in knowledge and skill regarding student-centered practices, resources available, and beliefs about the effectiveness of student-centered teaching. As shown in Figures 3 and 4, on average, participants expressed positive shifts in each of these factors, leading us to conclude that the workshop was successful in accomplishing its goals.

Findings from COPUS profiles

COPUSprofiles.org performs a latent profile analysis that groups COPUS observations into clusters, each defining a particular way of teaching (Stains et al., 2018). For the first STEMFI cohort, the seven resulting clusters were grouped into three broad teaching styles: (a) didactic teaching, defined by 80% or more lecture; (b) interactive lecture, defined as mostly lecture but with student-centered strategies such as clicker questions or group work added to the lecture period; and (c) student-centered teaching, defined as incorporating significant student-centered strategies into large portions of the class. For the 14 participants in this cohort, pretreatment and posttreatment observations were analyzed, clustered, and compared to assess change. Eight participants made significant shifts toward student-centered teaching: two moved from solely lecture to interactive lecture, three moved from solely lecture to student-centered teaching, and three moved from interactive lecture to student-centered teaching (see Figure 5). Four participants were already implementing significant student-centered practices and continued student-centered teaching after STEMFI. Two participants were difficult to profile, having primarily didactic profiles with occasional student-centered practice; both moved to a mix of didactic practice and interactive lecture. Thus, 12 of the 14 participants (86%) remained in or moved toward more student-centered teaching profiles.
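As a rough illustration of how pre- and posttreatment profiles can be compared, the sketch below assigns one of the three broad styles from a profile’s presenting and student-work fractions. This rule-based stand-in is our invention, not the latent profile analysis COPUSprofiles.org actually performs; the 80% lecture threshold follows the definition of didactic teaching above, while the student-work threshold is an assumed value for illustration only.

```python
# Hypothetical rule-based stand-in for the latent profile analysis described
# above (Stains et al., 2018). The 0.80 presenting cutoff follows the
# definition of didactic teaching in the text; the 0.25 student-work cutoff
# is an assumption made only for this illustration.
def broad_style(presenting_frac: float, student_work_frac: float) -> str:
    if presenting_frac >= 0.80:
        return "didactic"
    if student_work_frac >= 0.25:
        return "student-centered"
    return "interactive lecture"

# Comparing one participant's pre- and post-STEMFI observations.
pre = broad_style(0.85, 0.05)   # "didactic"
post = broad_style(0.45, 0.40)  # "student-centered"
print(f"{pre} -> {post}")       # didactic -> student-centered
```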

Conclusions

In this article, we have discussed the need for and initial design of STEMFI—an initiative to support faculty in significant change toward more evidence-based instructional practices. We explained the key elements of STEMFI, including administrative support, faculty mentoring, a weeklong summer workshop, and data-driven reflection and practice.

Our initial findings from the first cohort indicate that STEMFI has been successful. Most faculty experienced significant shifts in their teaching toward more student-centered practices. Feedback on the post-workshop survey indicated that participants found the material helpful and engaging, encouraging a shift in their attitudes toward supporting EBIPs in their classes.

We are actively developing the intervention through several enhancements. First, we are developing a STEMFI workshop to enhance administrators’ understanding of EBIPs, including how EBIPs can be effective within each college and how administrators can support their faculty in adopting these practices. We also plan to develop a self-sustaining interdisciplinary faculty community of practice that can continue processes of peer mentoring, support, and feedback as faculty continue evolving in their EBIP implementation. In addition, we are transforming the role of the peer-teaching partners (i.e., mentors) to support their progression in this crucial role. Finally, we plan to collect resources along with models of effective EBIPs that can be showcased and disseminated throughout the university community via a website and thus shared with the larger STEM community.

Acknowledgments

Funding for this project was provided by the National Science Foundation.


Richard E. West (rickwest@byu.edu) is a professor in the Department of Instructional Psychology and Technology, Jamie L. Jensen is an associate professor in the Department of Biology, Michael Johnson is a teaching and learning consultant with the Center for Teaching and Learning, Jennifer Nielson is the associate dean of the College of Physical and Mathematical Sciences, Rebecca Sansom is an associate teaching professor in the Department of Chemistry and Biochemistry, and Geoffrey Wright is an associate professor in the Department of Technology and Engineering Studies, all at Brigham Young University in Provo, Utah.

References

Allen, D., & Blythe, T. (2004). The facilitator’s book of questions: Tools for looking together at student and teacher work. Teachers College Press.

Allen, D., & Tanner, K. (2002). Approaches in cell biology teaching. Cell Biology Education, 1(1), 3–5. https://doi.org/10.1187/cbe.02-04-0430

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. Longman.

Auchincloss, L. C., Laursen, S. L., Branchaw, J. L., Eagan, K., Graham, M., Hanauer, D. I., Lawrie, G., McLinn, C. M., Pelaez, N., Rowland, S., Towns, M., Trautmann, N. M., Varma-Nelson, P., Weston, T. J., & Dolan, E. L. (2014). Assessment of course-based undergraduate research experiences: A meeting report. CBE—Life Sciences Education, 13(1), 29–40. https://doi.org/10.1187/cbe.14-01-0004

Borrego, M., & Henderson, C. (2014). Increasing the use of evidence-based teaching in STEM higher education: A comparison of eight change strategies. Journal of Engineering Education, 103(2), 220–252. https://doi.org/10.1002/jee.20040

Bowgren, L., & Sever, K. (2010). 3 steps lead to differentiation. Journal of Staff Development, 31(2), 44–47, 58. https://search.proquest.com/docview/870747163?accountid=4488

Bruff, D. (2009). Teaching with classroom response systems: Creating active learning environments. Jossey-Bass.

Bybee, R. (1993). An instructional model for science education. In Developing biological literacy. Biological Sciences Curriculum Study.

Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970–977. https://doi.org/10.1119/1.1374249

Dixson, D., & Worrell, F. C. (2016). Formative and summative assessment in the classroom. Theory Into Practice, 55(2), 153–159. https://doi.org/10.1080/00405841.2016.1148989

Doymus, K. (2008). Teaching chemical bonding through jigsaw cooperative learning. Research in Science & Technological Education, 26(1), 47–57. https://doi.org/10.1080/02635140701847470

Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61(7), 550–558. https://doi.org/10.1525/bio.2011.61.7.9

Fowell, S. L., Southgate, L. J., & Bligh, J. G. (1999). Evaluating assessment: The missing link? Medical Education, 33(4), 276–281. https://doi.org/10.1046/j.1365-2923.1999.00405.x

Gess-Newsome, J., Southerland, S. A., Johnston, A., & Woodbury, S. (2003). Educational reform, personal practical theories, and dissatisfaction: The anatomy of change in college science teaching. American Educational Research Journal, 40(3), 731–767. https://doi.org/10.3102/00028312040003731

Handelsman, J., Miller, S., & Pfund, C. (2007). Scientific teaching. W. H. Freeman.

Hopkins, R. F., Lyle, K. B., Hieb, J. L., & Ralston, P. A. S. (2016). Spaced retrieval practice increases college students’ short- and long-term retention of mathematics knowledge. Educational Psychology Review, 28, 853–873. https://doi.org/10.1007/s10648-015-9349-8

Jensen, J. L., Bailey, E. G., Kummer, T. A., & Weber, K. S. (2017). Using backward design in education research. Journal of Microbiology and Biology Education, 18(3). https://doi.org/10.1128/jmbe.v18i3.1367

Kimmons, R. (2018). Technology integration: Effectively integrating technology in educational settings. In A. Ottenbreit-Leftwich & R. Kimmons (Eds.), The K–12 educational technology handbook. EdTech Books. https://edtechbooks.org/k12handbook/technology_integration

Laverty, J. T., Underwood, S. M., Matz, R. L., Posey, L. A., Carmel, J. H., Caballero, M. D., Fata-Hartley, C. L., Ebert-May, D., Jardeleza, S. E., & Cooper, M. M. (2016). Characterizing college science assessments: The three-dimensional learning assessment protocol. PLoS ONE, 11(9), e0162333. https://doi.org/10.1371/journal.pone.0162333

Llewellyn, D. (2013). [Review of Success in science through dialogue, reading and writing, by A. Beauchamp, J. Kusnick, & R. McCallum]. Science Scope, 36(9), 95–96. http://www.jstor.org/stable/43184834

Loucks-Horsley, S., Love, N., Stiles, K. E., Mundry, S. E., & Hewson, P. (2003). Designing professional development for teachers of science and mathematics (2nd ed.). Corwin.

National Research Council. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. National Academies Press.

Nilson, L. B. (2010). Teaching at its best: A research-based resource for college instructors (3rd ed.). Jossey-Bass.

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9, 129–144. https://doi.org/10.1016/j.edurev.2013.01.002

Pfund, C., Miller, S., Brenner, K., Bruns, P., Chang, A., Ebert-May, D., Fagen, A. P., Gentile, J., Gossens, S., Khan, I. M., Labov, J. B., Pribbenow, C. M., Susman, M., Tong, L., Wright, R., Yuan, R. T., Wood, W. B., & Handelsman, J. (2009). Summer institute to improve university science teaching. Science, 324(5926), 470–471. https://doi.org/10.1126/science.1170015

Prather, E. E., & Brissenden, G. (2008). Development and application of a situated apprenticeship approach to professional development of astronomy instructors. Astronomy Education Review, 7(2), 1–17. http://dx.doi.org/10.3847/AER2008016

President’s Council of Advisors on Science and Technology. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Executive Office of the President, Office of Science and Technology Policy.

Robinson, R. (2001). Calibrated Peer Review™: An application to increase student reading and writing skills. The American Biology Teacher, 63(7), 474–480. https://doi.org/10.2307/4451167

Rohrer, D., Dedrick, R. F., & Stershic, S. (2015). Interleaved practice improves mathematics learning. Journal of Educational Psychology, 107(3), 900–908. https://doi.org/10.1037/edu0000001

Sansom, R. L., Suh, E., & Plummer, K. J. (2019). Decision-based learning: “If I just knew which equation to use, I know I could solve this problem!” Journal of Chemical Education, 96(3), 445–454. https://doi.org/10.1021/acs.jchemed.8b00754

Simonson, S. S. (2019). POGIL: An introduction to Process Oriented Guided Inquiry Learning for those who wish to empower learners. Stylus.

Smith, M. K., Jones, F. H. M., Gilbert, S. L., & Wieman, C. E. (2013). The classroom observation protocol for undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE—Life Sciences Education, 12(4), 618–627. https://doi.org/10.1187/cbe.13-08-0154

Stains, M., Harshman, J., Barker, M. K., Chasteen, S. V., Cole, S. E., DeChenne-Peters, M. K., Eagan Jr., M. K., Esson, J. M., Knight, J. K., Laski, F. A., Levis-Fitzgerald, M., Lee, C. J., Lo, S. M., McDonnell, L. M., McKay, T. A., Michelotti, N., Musgrove, A., Palmer, M. S., Plank, K. M., . . . Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470. https://doi.org/10.1126/science.aap8892

Tarrant, M., & Ware, J. (2008). Impact of item-writing flaws in multiple-choice questions on student achievement in high-stakes nursing assessments. Medical Education, 42(2), 198–206. https://doi.org/10.1111/j.1365-2923.2007.02957.x
