

Negative Student Response to Active Learning in STEM Classrooms:

A Systematic Review of Underlying Reasons

Journal of College Science Teaching—July/August 2020 (Volume 49, Issue 6)

By Prateek Shekhar, Maura Borrego, Matt DeMonbrun, Cynthia Finelli, Caroline Crockett, and Kevin Nguyen

Recent research has supported the use of student-centered teaching practices, such as active learning, because of their effectiveness in improving student learning and retention compared with traditional, lecture-based teaching practices. Despite evidence supporting the effectiveness of active learning in improving STEM undergraduate education, the adoption of active learning by instructors has been slow for several reasons, including negative student response to active learning. In this systematic literature review, we examine students’ negative responses to active learning and the reasons for those responses noted in 57 published STEM studies. Our findings identify three types of negative responses: affect, engagement, and evaluation. The reasons behind negative responses fell into six overarching categories based on student feedback: limited value; lack of time, difficulty, and increased workload; lack of guidance; logistical difficulties; unfamiliarity with active learning; and lack of preparation and confidence. We leverage different theoretical perspectives to explain the reasons behind negative responses and offer insights for lowering the barrier for instructors to adopt active learning in STEM classrooms.


An important goal for higher education, particularly for science, technology, engineering, and mathematics (STEM) fields, is ensuring that graduates develop the skills to succeed in the workplace (AAAS, 2010; Freeman et al., 2014; NAE, 2004). One common approach to skill development is to improve student learning in undergraduate STEM courses by using student-centered teaching practices (Jamieson & Lohmann, 2012; Kuh, 2008; NASEM, 2016; Seymour & Hewitt, 1997). While a variety of student-centered teaching practices are noted in the literature such as think-pair-share, group discussions, and project-based learning (Prince, 2004), we use the term “active learning” (AL) to include instruction where students participate in class activities rather than watching the instructor lecture. Recent research has demonstrated that many student-centered teaching practices lead to better learning outcomes and increased student retention in STEM programs (Barnett, 2014; Braxton et al., 2008; Freeman et al., 2014; Haak et al., 2011; Prince, 2004). As a result, there have been calls for the increased use of AL in STEM classrooms across several national platforms (e.g., ASEE, 2012; NSF, 2013; PCAST, 2012; Singer et al., 2012).

Despite evidence on the effectiveness of AL and its benefits for student learning and retention, the adoption of AL among STEM instructors has been slow (Friedrich et al., 2007; Handelsman et al., 2004; Hora et al., 2012; PCAST, 2012; Singer et al., 2012). A number of factors may influence an instructor’s adoption and continued use of AL in the classroom. Some factors serve as positive motivators, such as introducing instructors to positive research on AL, a flexible curriculum that allows for innovation in the classroom, and a community of colleagues with whom to engage during the adoption process (Eddy et al., 2015; Finelli et al., 2014; Froyd et al., 2013; Shekhar & Borrego, 2016b). Other factors can serve as barriers, including limited familiarity with how to implement AL in the classroom, the limited time available to develop AL practices, and negative student response to AL (Finelli et al., 2014; Froyd et al., 2013; Kiemer et al., 2015).

Although some studies have offered best practices to help improve students’ response to AL (Arum & Roksa, 2011; Borrego et al., 2013; Felder, 2011; Johnson et al., 1991; Lake, 2001; Michael, 2007), we have found little research that systematically examines how negative student response can impact the implementation of AL in the STEM classroom. We address this gap in the literature by reviewing the research on students’ negative response to AL in the STEM classroom. The purposes of our paper are: (1) to identify the types of negative responses to AL that have been published in the literature; (2) to understand the reasons for the negative student response; and (3) to apply a theoretical lens to explain the mechanisms of negative student response. We identified three relevant theories from the literature on learning, motivation, and instructional change that help explain students’ negative response to AL: Expectancy Value Theory, Zone of Proximal Development, and Expectancy Violation Theory. Expectancy Value Theory posits that a student’s choice to participate in an activity is informed by value and competence beliefs (Wigfield & Eccles, 2000). In the context of student response to AL, value beliefs include whether students perceive there to be a benefit in the activities, and competence beliefs include whether students perceive they have the ability to complete the activities. Furthermore, value beliefs include aspects such as students’ perceived usefulness of the activity (utility value) and the costs associated with participation in the activity, such as loss of time and effort requirements (cost value). Along similar lines, the concepts of Vygotsky’s (1987) Zone of Proximal Development, particularly the idea of scaffolding, offer theoretical insight into the reasons reported for negative student responses to AL.
In the context of AL, scaffolding can be described as a process of providing support to assist student learning during activities, for example by breaking an activity into smaller steps. This theory implies that, for students to learn effectively through AL, instructors should provide some guidance through the activities to avoid overwhelming them cognitively. Finally, Expectancy Violation Theory also provides insight into negative student response to AL (Gaffney et al., 2010). This theory argues that students may respond negatively because they expect to be taught using passive, lecture-based teaching methods.

We use 57 published studies identified through a systematic literature review (SLR) to explore how and why students respond negatively to AL, and we examine our findings in the context of the three theories. Through this analysis, we hope to explain why students respond negatively to AL and to assist instructors in mitigating those responses. We note that many of the primary studies we cite were originally intended to report the learning impacts of AL, but the additional empirical evidence they present can build knowledge about negative student response, addressing the aims of this review.


The 57 studies analyzed in this paper are a subset of 431 full papers we identified in an SLR about students’ noncognitive response to AL (Borrego et al., 2018; Crockett et al., 2018). The inclusion criteria for our initial search were that a study must: (1) describe an in-class AL intervention, (2) include some systematic data collection on students’ noncognitive response to the intervention, (3) be set in an undergraduate STEM classroom, and (4) be published in English between 1990 and 2015. A librarian helped to define search terms for each inclusion criterion and to conduct searches on six databases: Academic Search Complete, Education Source, ERIC, Compendex, Inspec, and Web of Science. We also solicited studies via relevant STEM education email lists.

Our initial search returned 2,364 studies, which we reduced to 431 qualifying papers through multiple rounds of screening the abstracts and full texts against the inclusion criteria. We developed a coding system, which we applied to each of the 431 full papers to document basic characteristics of each study (e.g., STEM discipline, course level, and type of AL) and whether the study reported positive or negative student response to the AL intervention. Of the 431 papers, 57 described negative student responses. Because this paper focuses on understanding students’ negative response, this subset of 57 studies forms the final sample for our analysis.

Finally, we used first and second cycle coding methods to examine the 57 studies for the type of negative student response and the reasons behind it (Saldaña, 2010). In the first cycle, four researchers assigned descriptive codes to capture the type of negative student response and the underlying reasons as reported in the studies. Descriptive codes involve assigning basic labels to summarize the content in the data (Saldaña, 2010). In the second cycle, two researchers grouped the first cycle codes to form overarching categories. Specifically, for type of negative student response, students’ noncognitive responses were categorized into three constructs described in the literature (Burroughs et al., 1989; Fredricks et al., 2004; Kearney et al., 1991; Seidel & Tanner, 2013; Weimer, 2013): affect, engagement, and evaluation. Affect includes students’ satisfaction with the course and the type of instruction, the value students perceive in the activities, and students’ overall attitude toward AL. Engagement includes the extent to which students participate in the activities and their receptiveness to the instruction. Evaluation includes students’ end-of-term course and instructor evaluations gathered through formal or informal methods. For the reasons behind negative student response, because the literature does not offer specific constructs, we used a focused coding approach in which the first cycle codes emerging from the studies were categorized by conceptual similarity to form overarching categories through several iterations and discussions between two researchers (Saldaña, 2010).


Study characteristics

The studies in this analysis (N = 57) span a range of STEM disciplines, class levels, types of AL, and study methodologies. The courses came from a variety of STEM disciplines—mostly engineering and computer science (N = 29), biology and health sciences (N = 14), mathematics and statistics (N = 7), and physics (N = 7)—and from different undergraduate academic levels, including first year (N = 19), second year (N = 11), third year (N = 10), and fourth year (N = 9). The studies represented a breadth of AL types, such as working in groups or pairs (N = 43), in-class problem-solving (N = 30), working individually on exercises (N = 19), project- or problem-based learning (N = 17), answering questions posed by the instructor (N = 14), and discussions (N = 14). In regard to methodology, the studies used quantitative (N = 39), qualitative (N = 4), and mixed methods (N = 14) approaches.

Negative student responses

Using the three main categories of students’ noncognitive response—affect, engagement, and evaluation—the most frequently described type of negative student response was affective (N = 43), commonly reported as student preferences for different types of instruction, lack of enjoyment, and disinterest in activities. For example, researchers reported that working with other students in a cooperative learning environment diminished the value of active learning instruction for students (Machemer & Crawford, 2007). The second most common negative student response was lack of engagement (N = 11), reported through nonparticipation in the activities, decreased receptiveness, and lack of student interaction during relevant activities. For instance, in an inquiry-based learning classroom, researchers reported that instructors struggled with “getting students to present their solutions at the board” (Cooper et al., 2012, p. 396). Finally, some studies (N = 8) indicated negative student response on end-of-term student evaluations and other less formal feedback surveys. For example, in the institution’s end-of-course faculty evaluation, 41% of students reported that what they liked least was the AL-based workshops included in an organic chemistry course (Rein & Brookes, 2015). The reasons behind these negative responses and the underlying learning and motivation theories are discussed in the section that follows.

Reasons behind negative student responses

We grouped the negative student responses found in our 57 studies into six broad categories (Table 1). The most common category was students’ perception that activities were of limited value to their learning or success in the course. This included student concerns about whether or how AL would help them achieve course learning outcomes, cover important course content, succeed on exams, or earn a good course grade. The second most-common category included student concerns that activities were time-consuming and inappropriately difficult, and thus increased their workload. The third most-common category was lack of guidance for AL exercises, including limited instructor involvement, scaffolding, and facilitation during activities. The fourth most-common category noted logistical issues associated with the use of AL, including technology, classroom layout, class size, and group work issues. Finally, the two least common categories captured students’ unfamiliarity with AL due to prior experience with traditional, lecture-based instruction, and students’ feeling unprepared to complete AL exercises due to lack of background knowledge. A full list of studies is provided online.

Table 1. Reasons behind negative responses.

Perception of limited value: Students did not appreciate the value of AL in helping them learn, achieve course learning outcomes, cover course content, earn a good grade on exams or in the course, or enhance their interest in the topic.

Lack of time, difficulty, and increased workload: Students complained that AL activities were time-consuming, were difficult to complete, and increased their workload.

Lack of guidance: Students were concerned about limited guidance, lack of scaffolding, a low degree of instructor involvement, and self-directed learning.

Logistical difficulties: Students were concerned about the technology/instruments/tools used in AL, classroom features such as layout and class size, scheduling conflicts, group/team work, and the quality of videos used in flipped classroom settings.

Unfamiliarity with AL: Students were not used to AL and perhaps were expecting to sit in a lecture.

Lack of student preparation and confidence: Students were unprepared to do the activities due to insufficient background knowledge or lack of review material.


In general, the studies we reviewed did not cite theories to explain the reasons students responded negatively to AL; however, we believe that using theory to explain the underlying mechanisms of negative student responses could be an important step in identifying strategies to overcome those negative responses. The associations between the three theories (Expectancy Value Theory, Zone of Proximal Development, and Expectancy Violation Theory) and the six categories established in our systematic literature review are provided in Table 2.

Table 2. Theories explaining negative student response.

Reason for negative student response to AL: Theoretical framework offering potential insight

Perception of limited value: Expectancy Value Theory (utility value)

Lack of time, difficulty, and increased workload: Expectancy Value Theory (cost value)

Lack of guidance: Expectancy Value Theory (competence beliefs); Zone of Proximal Development

Logistical difficulties: (No theory)

Unfamiliarity with AL: Zone of Proximal Development; Expectancy Violation Theory

Lack of student preparation and confidence: Expectancy Value Theory (competence beliefs)

Expectancy Value Theory (Wigfield & Eccles, 2000) argues that a student’s participation is influenced by how they perceive the usefulness of AL (utility value), the time and effort incurred in participation (cost value), and their ability to perform the tasks involved in the AL exercise (competence beliefs). The data from the 57 studies that we coded resonate with these arguments (Table 2). “Perception of limited value” corresponds to utility value; concerns regarding “lack of time, difficulty, and increased workload” correspond to cost value; and “lack of guidance” and “lack of student preparation and confidence” address competence beliefs. Framed this way, Expectancy Value Theory explains that students are more likely to respond negatively to AL when they question the value of the activities or their ability to complete them, and the theory serves as a good framework both for understanding the reasons for negative student responses and for developing strategies to address them. This highlights that instructors should be cognizant of the aspects of instruction that are valuable to students, namely how active learning contributes to their learning goals. Future research should focus on identifying instructional design factors that add value to students’ participation in active learning exercises.

The concept of scaffolding, as described in Vygotsky’s (1987) Zone of Proximal Development, suggests that a lack of guidance or scaffolding may impede effective student learning in an AL environment. Much of the data from the studies that reported negative student response can be explained through these ideas (Table 2). Specifically, “lack of guidance” and “unfamiliarity with AL” have connections with the Zone of Proximal Development. For instance, because AL often requires students to take responsibility for their learning, students may respond negatively to AL if appropriate scaffolding is not provided to guide them during the self-directed learning process. Vygotsky’s theory thus offers a framework for reducing negative student responses by scaffolding AL activities to increase student engagement and learning. This finding also calls for more theoretical and empirical research toward devising heuristics for scaffolding AL to mitigate negative student response.

Expectancy Violation Theory explains negative student response to AL from an instructional change perspective, arguing that students respond negatively when AL violates their expectation of receiving a passive lecture (Gaffney et al., 2010). Thus, students may respond negatively because of their “unfamiliarity with AL” (Table 2). This theory suggests that instructors should take time to align student expectations with the types of activities they should anticipate in class. Of the five studies coded for lack of familiarity, two were studies of SCALE-UP introductory physics courses conducted by the authors of this theory. While this theory may apply in some instructional situations, other studies have concluded that undergraduate students often arrive with high expectations of AL instruction (Nguyen et al., 2017). These diverging findings call for more research examining which situations violate students’ expectations for AL, and when this theory is appropriate to apply. Researchers may consider conducting a more granular analysis of different AL techniques (e.g., think-pair-share, self-directed learning, and problem-based learning) to identify the extent to which each violates students’ instructional expectations in STEM classrooms.

The themes that emerged from the types of negative student response reported in these 57 papers suggest a key opportunity for future work. Negative affect, such as students not believing they can complete an activity or that it is worth their time, represents a common negative response to AL, and delving deeper into the root causes of these responses is an important area for future work. There is some support for this line of study in our previous work on student resistance to AL (Finelli et al., 2018; Tharayil et al., 2018). We considered value (whether time spent on the activities was worthwhile) and positivity (how students felt about the activities and the instructor) as measures of student affect toward AL, and we found that value was a statistically significant positive predictor of participation, distraction, and overall course evaluation, while positivity predicted overall course evaluation (Finelli et al., 2018). This suggests that affective responses such as value and positivity may mediate between characteristics of the AL instruction and student responses to AL. Thus, learning more about students’ negative affective response to AL, particularly with regard to satisfaction, value, and interest, will allow the development of theory-based approaches that directly address affective response. Theories about student learning and motivation may serve as a starting point in future studies that focus on understanding the mechanisms that might trigger negative student responses to AL.

Implications for practice

In our own recent research, we identified specific strategies that an instructor can use to improve students’ response to AL (Finelli et al., 2018; Tharayil et al., 2018). We identified two main categories of instructor strategies (explanation and facilitation), and we have evidence that both correlate with lower levels of negative student response to AL (Finelli et al., 2018). Explanation strategies include describing to students how the activity relates to their learning, as well as making sure students understand how to complete the activity. These align with the theoretical underpinnings of Expectancy Value Theory, which posits increasing students’ perceived value of active learning as one plausible way of mitigating negative response. In the studies reviewed in this paper, the most common reason for negative student response to AL was students’ perception of limited value, particularly with regard to their learning and success in the course (utility value). Thus, explaining how the activities relate to learning, as well as carefully planning activities that align with graded assignments, would help to alleviate this concern (Shekhar & Borrego, 2016a). In addition, communicating overall course expectations for student participation at the beginning of the semester is an explanation strategy that instructors could use to mitigate negative response due to a mismatch in student expectations (Expectancy Violation Theory) and to encourage participation among students who might not have prior experience with AL.

Facilitation strategies include monitoring students during an activity, carefully planning activities, aligning assessment with activities, and seeking feedback from students about the activities. These strategies fit well with the Zone of Proximal Development and Expectancy Value Theory (cost value), offering remedies for reducing negative response. For example, the second most-common reason for negative student response was that activities were difficult or time-consuming. This could be addressed through the facilitation strategies of carefully planning and scaffolding activities that guide students in their learning. Similarly, the third most-common reason for negative student response was lack of guidance, which can be addressed through facilitation strategies such as explaining the activity, walking around the room while students are working to answer their questions, and planning activities that scaffold learning by breaking a task into manageable steps (Tharayil et al., 2018).


This study gathered and synthesized empirical data from 57 published studies to support specific theories of negative student response to AL. Although instructors’ perception that students will respond negatively to AL is a common barrier to adoption, our systematic literature review found little support for this perception; just 57 of 431 studies reported negative responses to AL, and only a few of these negative responses manifested in poor end-of-term course evaluations (reported in 8 of 57 studies, the smallest category overall). A much more frequently described type of negative student response (43 studies) was affective, including student preferences for different types of instruction, lack of enjoyment, and disinterest in activities. Eleven studies described lack of engagement, such as nonparticipation, decreased receptiveness, and lack of interaction. Because these studies were not conducted specifically to examine students’ negative response, we do not claim that there is a lack of negative response from students. Nonetheless, we reiterate the lack of work examining negative student response to AL in STEM classrooms and call for targeted research in the area.

Within the 57 published studies, we identified six categories of reported reasons for these responses, many of which can be directly addressed by instructors. Several learning and motivation theories help to interpret the results and suggest implications for teaching. Eccles’ Expectancy Value Theory of Motivation suggests that students must perceive value in the activity (i.e., it contributes to learning or earning a good grade) and feel confident they can complete the activity (Wigfield & Eccles, 2000). Vygotsky’s Zone of Proximal Development suggests that AL activities should be scaffolded to challenge students through activities that are somewhat different from what they have seen before, but not entirely unfamiliar (Vygotsky, 1987). Gaffney’s Expectancy Violation Theory suggests that instructors should take time at the beginning of a course to norm student expectations (Gaffney et al., 2010).

Overall, this study replicates and expands the generalizability of prior student resistance studies by finding, for example, that explanation and facilitation instructor strategies can reduce student resistance to AL, and that students penalize instructors trying new AL in their end-of-term evaluations less often than instructors fear (Finelli et al., 2018; Nguyen et al., 2017). Taken together, these results suggest that instructors can, over time, refine their use of AL to provide appropriate supports, explanations, and alignment with assessment and grading to reduce negative student reactions to AL.


This project is funded by the U.S. National Science Foundation through grant number 1744407. The opinions expressed are those of the authors and do not necessarily represent the views of the National Science Foundation. We thank our collaborators Cynthia Waters, Robyn Rosenberg, Michael Prince, and Charles Henderson for their contributions to this project.

Prateek Shekhar is an assistant professor of engineering education in the School of Applied Engineering and Technology at the New Jersey Institute of Technology in Newark, New Jersey. Maura Borrego is a professor in the Department of Mechanical Engineering and STEM Education at the University of Texas at Austin in Austin, Texas. Matt DeMonbrun is a senior statistician and associate director of the Enrollment Management Research Group at Southern Methodist University in Dallas, Texas. Cynthia Finelli is a professor of electrical engineering and computer science and of education, and director of engineering education research, at the University of Michigan in Ann Arbor, Michigan. Caroline Crockett is a PhD student in the Department of Electrical Engineering and Computer Science at the University of Michigan in Ann Arbor, Michigan. Kevin Nguyen is an assistant professor in the Hutchins School of Liberal Studies at Sonoma State University in Rohnert Park, California.


American Association for the Advancement of Science (AAAS). (2010). Describing and measuring undergraduate STEM teaching practices. The National Meeting on the Measurement of Undergraduate STEM Teaching.

American Society for Engineering Education. (2012). Innovation with impact: Creating a culture for scholarly and systematic innovation in engineering education. Washington, DC.

Arnesen K., Korpas G. S., Hennissen J. E., & Stav J. B. (2013). Experiences with use of various pedagogical methods utilizing a student response system—Motivation and learning outcome. Electronic Journal of E-Learning, 11(3), 169–181.

Arum R., & Roksa J. (2011). Academically adrift: Limited learning on college campuses. University of Chicago Press.

Autin M., Bateiha S., & Marchionda H. (2013). Power through struggle in introductory statistics. PRIMUS, 23(10), 935–948.

Bailey M., & DeBartolo E. (2007). Using the experiential learning model and course assessment to transform a multidisciplinary senior design course sequence. Proceedings of the 2007 American Society of Engineering Education (ASEE) Annual Conference & Exposition, Honolulu, HI.

Bailey R. T. (2015). Using 3D printing and physical testing to make finite-element analysis more real in a computer-aided simulation and design course. Proceedings of the 2015 American Society of Engineering Education (ASEE) Annual Conference & Exposition, Seattle, WA.

Barnett D. R. (2014). Academic and social integration of nontraditional students: The role of active learning strategies and sense of belonging in integration and persistence. (Paper 936). [Doctoral dissertation, Southern Illinois University at Carbondale]. OpenSIUC.

Berjon R., Beato M. E., Mateos M., & Fermoso A. (2015). Using emerging mobile technologies to enhance collaborative learning. A study case in a university environment. International Journal of Education and Information Technologies, 9, 151–158.

Berkling K., & Zundel A. (2013). Understanding the challenges of introducing self-driven blended learning in a restrictive ecosystem—step 1 for change management: Understanding student motivation. Proceedings of the 5th International Conference on Computer Supported Education (pp. 311–320).

Borrego M., Nguyen K., Crockett C., DeMonbrun R.M., Shekhar P., Tharayil S., Finelli C. J., Rosenberg R., & Waters C. (2018). Systematic literature review of students’ affective responses to active learning: Overview of results. Proceedings of the 2018 IEEE Frontiers in Education Conference, San Jose, CA.

Borrego M., Cutler S., Prince M., Henderson C., & Froyd J. E. (2013). Fidelity of implementation of research-based instructional strategies (RBIS) in engineering science courses. Journal of Engineering Education, 102(3), 394–425.

Braxton J. M., Jones W. A., Hirschy A. S., & Hartley H. V.III (2008). The role of active learning in college student persistence. New Directions for Teaching and Learning, 2008(115), 71–83.

Breckler J., & Yu J. R. (2011). Student responses to a hands-on kinesthetic lecture activity for learning about the oxygen carrying capacity of blood. Advances in Physiology Education, 35(1), 39–47.

Bunting C. F., & Cheville R. A. (2009). VECTOR: A hands-on approach that makes electromagnetics relevant to students. IEEE Transactions on Education, 52(3), 350–359.

Burroughs N. F., Kearney P., & Plax T. G. (1989). Compliance-resistance in the college classroom. Communication Education, 38(3), 214–229.

Chen J.-Y., Lee M.-C., Lee H.-S., Wang Y.-C., Lin L.-Y., & Yang J.-H. (2006). An online evaluation of problem-based learning (PBL) in Chung Shan Medical University, Taiwan—a pilot study. Annals-Academy of Medicine Singapore, 35(9), 624.

Chen L., Chen T.-L., & Chen N.-S. (2015). Students’ perspectives of using cooperative learning in a flipped statistics classroom. Australasian Journal of Educational Technology, 31(6), 621–640.

Chini J. J., Gaffney J., & Al-Rawi A. (2013). Expectancy violation in traditional and studio-mode introductory physics courses. Proceedings of the 2013 Physics Education Research Conference, Portland, OR.

Cicek M. J. S. (2015). Student experiences in a structural engineering course: Responses of violation and grief when a novice instructor implements project-based learning. Proceedings of the 2015 American Society of Engineering Education (ASEE) Annual Conference & Exposition, Seattle, WA.

Cilli-Turner E. (2015). Measuring learning outcomes and attitudes in a flipped introductory statistics course. PRIMUS, 25(9–10), 833–846.

Cooper T. E., Bailey B., & Briggs K. (2012). The impact of a modified Moore method on efficacy and performance in precalculus. PRIMUS, 22(5), 386–410.

Crockett C. E., Nguyen K. A., Borrego M., Shekhar P., De Monbrun R. M., Tharayil S., Rosenberg R., Finelli C. J., & Waters C. (2018). Work in progress: How do students respond to active learning? A coding guide for a systematic review of the literature. Proceedings of the 2018 American Society of Engineering Education (ASEE) Annual Conference & Exposition, Tampa, FL.

Crossgrove K., & Curran K. L. (2008). Using clickers in nonmajors- and majors-level biology courses: Student opinion, learning, and long-term retention of course material. CBE—Life Sciences Education, 7(1), 146–154.

Dal M. (2013). Teaching electric drives control course: Incorporation of active learning into the classroom. IEEE Transactions on Education, 56(4), 459–469.

Dori Y. J., Hult E., Breslow L., & Belcher J. W. (2007). How much have they retained? Making unseen concepts seen in a freshman electromagnetism course at MIT. Journal of Science Education and Technology, 16(4), 299–323.

Eddy S. L., Converse M., & Wenderoth M. P. (2015). PORTAAL: A classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE—Life Sciences Education, 14(2), 1–16.

Felder R. M. (2011). Hang in there! Dealing with student resistance to learner-centered teaching. Chemical Engineering Education, 43, 131–132.

Finelli C. J., Daly S. R., & Richardson K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education, 103(2), 331–361.

Finelli C. J., Nguyen K., DeMonbrun M., Borrego M., Prince M., Husman J., Henderson C., Shekhar P., & Waters C. K. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching, 47(5), 80–91.

Fredricks J. A., Blumenfeld P. C., & Paris A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109.

Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H., & Wenderoth M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.

Friedrich K., Sellers S. L., & Burstyn J. (2007). Thawing the chilly climate: Inclusive teaching resources for science, technology, engineering, and math. To Improve the Academy: Resources for Faculty, Instructional, and Organizational Development, 26, 133–144.

Froyd J., Borrego M., Cutler S., Prince M., & Henderson C. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education, 56(4), 393–399.

Gaffney J. D. H., Gaffney A. L. H., & Beichner R. J. (2010). Do they see it coming? Using expectancy violation to gauge the success of pedagogical reforms. Physical Review Special Topics - Physics Education Research, 6(1), 1–16.

Gaffney J. D. H., & Gaffney A. L. H. (2013). Student satisfaction and perceptions of instructor support in studio physics. Proceedings of the 2015 Physics Education Research Conference, College Park, MD.

Galand B., Raucent B., & Frenay M. (2010). Engineering students’ self-regulation, study strategies, and motivational beliefs in traditional and problem-based curricula. International Journal of Engineering Education, 26(3), 523–534.

Gaskins W. B., Johnson J., Maltbie C., & Kukreti A. (2015). Changing the learning environment in the college of engineering and applied science using challenge-based learning. International Journal of Engineering Pedagogy, 5(1), 33–41.

Gok T. (2012). The effects of peer instruction on students’ conceptual learning and motivation. Asia-Pacific Forum on Science Learning and Teaching, 13, 1–17.

Golter P. B., Van Wie B. J., & Brown G. R. (2007). Comparing student experiences and growth in a cooperative, hands-on, active, problem-based learning environment to an active, problem-based environment. Proceedings of the 2007 American Society of Engineering Education (ASEE) Annual Conference & Exposition, Honolulu, HI.

Haak D. C., HilleRisLambers J., Pitre E., & Freeman S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216.

Handelsman J., Ebert-May D., Beichner R., Bruns P., Chang A., DeHaan R., Gentile J., Lauffer S., Stewart J., Tilghman S. M., & Wood W. B. (2004). Scientific teaching. Science, 304(5670), 521–522.

Hora M. T., Ferrare J. J., & Oleson A. (2012). Findings from classroom observations of 58 math and science faculty.

Jamieson L. H., & Lohmann J. R. (2012). Innovation with impact: Creating a culture for scholarly and systematic innovation in engineering education. American Society for Engineering Education.

Johnson D. W., Johnson R. T., & Smith K. A. (1991). Active learning: Cooperation in the college classroom. Interaction Book Co.

Jun H., & Guang-ping C. (2012). Improving undergraduates’ engineering abilities with Chinese situation. Proceedings of the 2012 7th International Conference on Computer Science & Education (ICCSE).

Kearney P., Plax T. G., & Burroughs N. F. (1991). An attributional analysis of college students’ resistance decisions. Communication Education, 40(4), 325–342.

Khanova J., Roth M. T., Rodgers J. E., & McLaughlin J. E. (2015). Student experiences across multiple flipped courses in a single curriculum. Medical Education, 49(10), 1038–1048.

Khoo E., Scott J., Peter M., & Round H. (2015). Evaluating flipped classrooms with respect to threshold concepts learning in undergraduate engineering. Proceedings of the 2015 IEEE Frontiers in Education Conference, El Paso, TX.

Kiemer K., Gröschner A., Pehmer A.-K., & Seidel T. (2015). Effects of a classroom discourse intervention on teachers’ practice and students’ motivation to learn mathematics and science. Learning and Instruction, 35, 94–103.

King S. O., & Robinson C. L. (2009). ‘Pretty lights’ and maths! Increasing student engagement and enhancing learning through the use of electronic voting systems. Computers & Education, 53(1), 189–199.

Kuh G. D. (2008). Excerpt from high-impact educational practices: What they are, who has access to them, and why they matter. Association of American Colleges and Universities, 14(3), 28–29.

Laatsch L., Britton L., Keating S., Kirchner P., Lehman D., Madsen-Myers K., Milson L., Otto C., & Spence L. (2005). Cooperative learning effects on teamwork attitudes in clinical laboratory science students. Clinical Laboratory Science, 18(3), 150–159.

Lake D. A. (2001). Student performance and perceptions of a lecture-based course compared with the same course utilizing group discussion. Physical Therapy, 81(3), 896–902.

Lawanto O. (2011). The use of enhanced guided notes in an electric circuit class: An exploratory study. IEEE Transactions on Education, 55(1), 16–21.

Li J., Zhao Y., & Shi L. (2009). Interactive teaching methods in information security course. Proceedings of the 2009 International Conference on Scalable Computing and Communications.

Lunsford B. E., & Herzog M. J. R. (1997). Active learning in anatomy & physiology: Student reactions & outcomes in a nontraditional A&P course. The American Biology Teacher, 59(2), 80–84.

Lykke M., Coto M., Mora S., Vandel N., & Jantzen C. (2014). Motivating programming students by problem based learning and LEGO robots. Proceedings of the 2014 IEEE Global Engineering Education Conference.

Machemer P. L., & Crawford P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education, 8(1), 9–30.

Meyers K., & Cripe K. (2015). Prior educational experience and gender influences on perceptions of a first-year engineering design project. International Journal of Engineering Education, 31(5), 1214–1225.

Michael J. (2007). Faculty perceptions about barriers to active learning. College Teaching, 55(2), 42–47.

National Academies of Sciences, Engineering, and Medicine (NASEM). (2016). Barriers and opportunities for 2-year and 4-year STEM degrees: Systemic change to support students’ diverse pathways. National Academies Press.

National Academy of Engineering. (2004). The engineer of 2020: Visions of engineering in the new century. National Academies Press.

National Science Foundation. (2013). Common guidelines for education research and development. National Science Foundation.

Nepal K. P. (2013). Comparative evaluation of PBL and traditional lecture-based teaching in undergraduate engineering courses: Evidence from controlled learning environment. International Journal of Engineering Education, 29(1), 17–22.

Nguyen K., Husman J., Borrego M., Shekhar P., Prince M., DeMonbrun M., Finelli C. J., Henderson C. R., & Waters C. (2017). Students’ expectations, types of instruction, and instructor strategies predicting student response to active learning. International Journal of Engineering Education, 33(1), 2–18.

Nomme K., & Birol G. (2014). Course redesign: An evidence-based approach. The Canadian Journal for the Scholarship of Teaching and Learning, 5(1), 2.

Örnek F., Robinson W. R., & Haugan M. P. (2008). Students’ expectations about an innovative introductory physics course. Journal of Turkish Science Education, 5(1), 49–59.

President’s Council of Advisors on Science and Technology (PCAST). (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. PCAST.

Pearce R. S. (2009). A compulsory bioethics module for a large final year undergraduate class. Bioscience Education, 13(1), 1–21.

Prince M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.

Ramlo S. (2015). Student views about a flipped physics course: A tool for program evaluation and improvement. Research in the Schools, 22(1), 44–54.

Reddy P. D., Mishra S., Ramakrishnan G., & Murthy S. (2015). Thinking, pairing, and sharing to improve learning and engagement in a data structures and algorithms (DSA) class. Proceedings from the 2015 International Conference on Learning and Teaching in Computing and Engineering.

Rein K. S., & Brookes D. T. (2015). Student response to a partial inversion of an organic chemistry course for non-chemistry majors. Journal of Chemical Education, 92(5), 797–802.

Robson N., Dalmis I. S., & Trenev V. (2012). Discovery learning in mechanical engineering design: Case-based learning or learning by exploring? American Society for Engineering Education.

Rockland R. H., Hirsch L., Burr-Alexander L., Carpinelli J., & Kimmel H. S. (2013). Learning outside the classroom—Flipping an undergraduate circuits analysis course. Proceedings of the 2013 American Society of Engineering Education (ASEE) Annual Conference & Exposition, Atlanta, GA.

Saldaña J. (2010). The coding manual for qualitative researchers. Sage.

Schoening A. M., Selde M. S., Goodman J. T., Tow J. C., Selig C. L., Wichman C., Cosimano A., & Galt K. A. (2015). Implementing collaborative learning in prelicensure nursing curricula: Student perceptions and learning outcomes. Nurse Educator, 40(4), 183–188.

Seidel S. B., & Tanner K. D. (2013). “What if students revolt?”—Considering student resistance: Origins, options, and opportunities for investigation. CBE—Life Sciences Education, 12(4), 586–595.

Self B., & Widmann J. (2010). Dynamics buzzword bingo: Active/collaborative/inductive learning, model eliciting activities, conceptual understanding. American Society for Engineering Education.

Seymour E., & Hewitt N. M. (1997). Talking about leaving: Why undergraduates leave the sciences. Westview.

Shankar P. R., Palaian S., Gyawali S., Mishra P., & Mohan L. (2007). Personal drug selection: Problem-based learning in pharmacology: Experience from a medical school in Nepal. PLoS One, 2(6), e524.

Shekhar P., & Borrego M. (2016a). After the workshop: A case study of post-workshop implementation of active learning in an electrical engineering course. IEEE Transactions on Education, 60(1), 1–7.

Shekhar P., & Borrego M. (2016b). “Not hard to sway”: A case study of student engagement in two large engineering classes. European Journal of Engineering Education, 43(4), 585–596.

Shekhar P., DeMonbrun M., Borrego M., Finelli C., Prince M., Henderson C., & Waters C. (2015). Development of an observation protocol to study undergraduate engineering student resistance to active learning. International Journal of Engineering Education, 31(2), 597–609.

Simpson V., & Richards E. (2015). Flipping the classroom to teach population health: Increasing the relevance. Nurse Education in Practice, 15(3), 162–167.

Singer S. R., Nielsen N. R., & Schweingruber H. A. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. National Academies Press.

Soto-Johnson H., Dalton C., & Yestness N. (2009). Assessing multiple abstract algebra assessments. Investigations in Mathematics Learning, 1(3), 1–26.

Stehling V., Schuster K., Richert A., & Isenhardt I. (2016). Please vote now! Evaluation of audience response systems—First results from a flipped classroom setting. Springer.

Tharayil S., Borrego M., Prince M., Nguyen K. A., Shekhar P., Finelli C. J., & Waters C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education, 5(1), 1–16.

Triantafyllou E., Timcenko O., & Busk Kofoed L. (2015). Student behaviors and perceptions in a flipped classroom: A case in undergraduate mathematics. Proceedings of the 2015 Annual Conference of the European Society for Engineering Education, Orléans, France.

Turner M. J. (2015). A flipped course in modern energy systems: Preparation, delivery, and post-mortem. Proceedings of the 2015 ASEE Annual Conference & Exposition, Seattle, WA.

Van Dijk L., Van Der Berg G. C., & Van Keulen H. (2001). Interactive lectures in engineering education. European Journal of Engineering Education, 26(1), 15–28.

Vygotsky L. (1987). Zone of proximal development. Mind in Society: The Development of Higher Psychological Processes, 5291, 157.

Walker J. D., Cotner S. H., Baepler P. M., & Decker M. D. (2008). A delicate balance: Integrating active learning into a large lecture course. CBE—Life Sciences Education, 7(4), 361–367.

Weaver G. C., & Sturtevant H. G. (2015). Design, implementation, and evaluation of a flipped format general chemistry course. Journal of Chemical Education, 92(9), 1437–1448.

Weimer M. (2013). Learner-centered teaching: Five key changes to practice. Jossey-Bass.

Wigfield A., & Eccles J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68–81.

Wilke R. R. (2003). The effect of active learning on student characteristics in a human physiology course for nonmajors. Advances in Physiology Education, 27(4), 207–223.

Yadav A., Shaver G. M., & Meckl P. (2010). Lessons learned: Implementing the case teaching method in a mechanical engineering course. Journal of Engineering Education, 99(1), 55–69.