Research & Teaching
Journal of College Science Teaching—January/February 2023 (Volume 52, Issue 3)
By Adam T. Murry, Nan Yuan, and Dean Atkinson
Undergraduate Research Experience (URE) programs have been shown to improve retention (Eagan et al., 2013; Elrod et al., 2010) and academic performance in science, technology, engineering, and mathematics (STEM) programs. The efficacy of URE programs matters because these programs require considerable time, effort, and financial support, and they show promise in curtailing the falling numbers of STEM graduates in the United States. In addition to research about program design, research that supports student preparation is an important component of student success.
A proximal source of information to help URE participants prepare comes from URE participants themselves. Participant advice is both descriptive (students’ impressions) and prescriptive, in that current students’ experiences foreshadow what future students can expect. Unlike advice from advisors and coordinators, advice from fellow participants comes from lived experience within the program, from a similar perspective. Such advice has been found to increase academic preparedness (Savitz-Romer et al., 2009) and support coping with program workload (Linn et al., 2015) in other areas of education. URE designers, staff, and faculty advisors also benefit from participant advice, as it can help leadership anticipate participants’ needs and provide a more effective and valuable learning experience.
This study analyzes 6 years of advice from participants of a National Science Foundation–funded URE (i.e., Research Experience for Undergraduates, or REU) hosted at Portland State University’s (PSU) Center for Climate and Aerosol Research (CCAR). The site facilitates undergraduate atmospheric science research with faculty from a wide range of disciplines (e.g., chemistry, biology, physics, mechanical engineering, environmental science). During an exit survey after the 10-week summer program, participants provided advice for their peers in future cohorts. This study identifies categories of advice to answer the question “What would help undergraduates prepare for UREs?”
Candid advice from faculty advisors can be invaluable for a graduate student’s adjustment, motivation, and performance (see the debate between Huey, 1987, and Stearns, 1987). Program peers provide relatable and applicable insights unique to the recipients’ perspectives (Drane et al., 2014). Advice is perceived as more helpful coming from those with relevant knowledge and experiences (Yaniv & Milyavsky, 2007) and is more likely to be accepted when advice-givers are similar to the advice-recipients (Silvia, 2005). In line with social learning theory (Vygotsky, 1978), past participants can provide information that scaffolds novice to intermediate perspectives and may even provide metacognitive strategies for self-direction and assessment (Sanders & Welk, 2005). Participant advice can also assist coordinators’ refinement of program delivery, providing insights into difficult or confusing program elements.
Unfortunately, little research has been conducted on peer advice specific to UREs. One exception is Camacho et al.’s (2016) panel discussion summary showcasing advice from four recent graduates of an undergraduate mathematics research program. The panelists advised future URE participants to (i) persevere when facing difficulties; (ii) look after one’s own academic responsibility; (iii) proactively ask questions; (iv) apply patience in teamwork scenarios; and (v) socialize with peers, mentors, and supervisors. No other published research on peer advice specific to STEM UREs could be found as of the writing of this article.
In a broader educational context, McClure et al. (2006) analyzed advice to incoming freshmen from 185 engineering students. Students were asked, “What advice would you give a freshman or high school student considering your major, or engineering in general, at your institution?” Six themes emerged: (i) choose and prepare for one’s major, (ii) network and get involved, (iii) engage with peers and make friends, (iv) establish relationships with faculty, (v) join student and professional disciplinary associations, and (vi) assume academic responsibility.
Although McClure et al.’s (2006) study pertained to advice for an engineering degree rather than a URE, there was overlap with the themes identified by Camacho et al. (2016)—specifically, taking responsibility and building relationships with peers and faculty. This overlap may be due to similar challenges and expectations between STEM UREs and engineering programs, or perhaps some types of advice are beneficial across educational contexts. Three of the five types of advice from Camacho et al.’s panel study did not overlap with McClure et al.’s study (persevering through difficulties, applying patience as a team member, and proactively asking questions). Given the small sample (i.e., one cohort representing one discipline), it is premature to speculate how well the advice themes identified would transfer to other programs or disciplines. In addition, the panel summary method did not allow for an assessment of themes’ relative importance compared with the others, which would enable resources to be allocated to higher-priority recommendations.
This study builds on Camacho et al.’s (2016) study by expanding the sample’s heterogeneity, applying a multimethod approach, and adding a level to the research question beyond advice and advice themes to include the themes’ ranked importance. Our research questions asked (i) what advice undergraduate summer research fellows would offer future cohorts, (ii) what themes underlie that advice, and (iii) which themes are most salient.
Six 10-student cohorts composed our sample (N = 60). All participated in the PSU CCAR REU program during summers from 2013 through 2019. As testimony to the program’s charter to select a diverse range of applicants, and in support of the transferability of our results to other student populations, participants were distributed across several demographic categories. About one out of eight participants were freshmen (n = 7, 12.1%), with roughly equal portions of sophomores (n = 17, 29.3%), juniors (n = 16, 27.6%), and seniors (n = 18, 31%). Ages ranged from 15 to 33 (M = 22.5, SD = 4.04). Entrants had high grade point averages (GPA) on average (M = 3.48, SD = 0.43), ranging between 2.13 (C) and 4.00 (A). Academic-year living arrangements included living with parents (n = 16, 27.6%), with a spouse or partner (n = 16, 27.6%), with roommates (n = 16, 27.6%), or alone (n = 10, 17.2%). Half of the participants were employed at least part time before the program (n = 29, 50%). Only one participant had a child. Surprisingly for a STEM program, female (43%, n = 26) and male (48%, n = 29) students were almost equally represented, with five students choosing a third gender option (8.3%). About 40% of participants were from racial or ethnic minority groups (Native American, n = 7 [12%]; Asian, n = 6 [11%]; Latino, n = 9 [16%]; Black, n = 1 [2%]).
Data were collected to evaluate the PSU CCAR REU, a 10-week, on-site, faculty-guided research program designed to model a graduate-level research experience. The URE required lab work (40 hours per week); weekly professional development meetings; and a culminating research paper, poster, and oral presentation. Support included student housing, a monthly stipend and food allotment, and a modest materials budget. For formative and summative evaluation purposes, participants completed surveys before, during, and at the end of the program to track change on variables of interest from pre- to post-program and explore mediators of this change. In line with ethics review board stipulations, an external evaluator informed participants of their rights, acquired informed consent before each data collection, and managed all data collection and handling separate from program coordinators and faculty advisors. This decision was made to ensure participants’ confidentiality and avoid impacting faculty- or coordinator-participant relationships, should feedback be negative. Full details of the evaluation design and the variables included are described in another manuscript that focuses on longitudinal analyses of a larger set of variables (Murry et al., 2022). For the current research questions, we focus only on the open-ended data from the exit, or post-program, survey.
During exit surveys at the end of a URE, students were asked, “What advice do you have for future participants in the program?” All but one student provided input (n = 59), with some students making multiple recommendations. Responses that included more than one type of advice were broken down into individual statements or meaning units (similar to the unitizing process in content analysis; e.g., Krippendorff, 2004), totaling 90 separate pieces of advice.
Advice was analyzed using Lincoln and Guba’s (1985) cutting and sorting technique. This technique is appropriate to use when qualitative data are in the form of statements or other relatively concise units of meaning that can be organized by underlying themes (Ryan & Bernard, 2003). The first step was to divide responses into singular statements that could be coded with one mutually exclusive definition (e.g., “I would advise to stay organized and on track and to have open communication with your mentor” would be divided into “I would advise to stay organized, on track” and “I would advise open communication with your mentor”). This step ensures categorization based on relatively pure units of meaning and helps with proper labeling and unidimensional category definitions.
Statements of advice were grouped into clusters (whose number and content were not predetermined) according to their shared or similar content (Lincoln & Guba, 1985). This grouping was conducted independently by two researchers familiar with the literature on URE programs in STEM and the specifics of the CCAR URE program; one was an assistant professor and the other an undergraduate assistant. To add transparency to the thematic clustering process, we coded co-occurrence: When both team members placed an item in a similar group, it was coded as agreement (= 1); if an item was placed under different themes, it was coded as disagreement (= 0). Clusters of items with similar content were then labeled according to the identified theme. To evaluate our process, we calculated inter-rater agreement (i) across all advice and (ii) within each identified theme, using percentage agreement and the kappa statistic. If inter-rater agreement met conventional cutoffs (McHugh, 2012), we labeled the theme. Following the development of themes, the researchers repeated the cut-and-sort process within each theme; this secondary sort identified subthemes that increase the specificity of the higher-order advice themes. Agreement was not explored during the creation of subthemes. We used frequency-based percentages (i.e., the number of items in a theme divided by the number of items in the data set) to rank importance, viewing the number of times a type of advice appeared independently as a fair proxy for its salience to the participant experience (i.e., we made the a priori assumption that something mentioned by more people reflects a more common experience). Because these data were collected over a period of 6 years (10 students per year) and at the end of students’ participation in the program, neither the data nor the students were together at the same time to rank all the items.
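The agreement coding described above can be sketched in code. The following is a minimal illustration, not the authors’ actual analysis script, of percentage agreement and Cohen’s kappa for two sorters’ theme assignments; the theme labels and data are hypothetical:

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of items both sorters placed in the same theme."""
    return sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Observed agreement corrected for chance, per Cohen's kappa."""
    n = len(coder_a)
    po = percent_agreement(coder_a, coder_b)  # observed agreement
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement expected from each sorter's marginal theme frequencies.
    pe = sum(freq_a[t] * freq_b[t] for t in freq_a) / n**2
    return (po - pe) / (1 - pe)

# Hypothetical theme labels assigned to 10 advice statements by two sorters.
sorter_1 = ["time", "time", "comm", "comm", "fun", "motiv", "time", "dilig", "comm", "life"]
sorter_2 = ["time", "time", "comm", "motiv", "fun", "motiv", "time", "dilig", "comm", "life"]
print(percent_agreement(sorter_1, sorter_2))           # 0.9
print(round(cohens_kappa(sorter_1, sorter_2), 2))      # 0.88
```

In the study itself, agreement across the 90 items was 88% with kappa = 0.84, clearing the conventional cutoffs in McHugh (2012).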
The 90 statements of advice were grouped into six themes. Sorters achieved consistency 88% of the time (79/90), above the acceptable rate for percentage agreement (> 70%; Stemler, 2004) and significantly better than chance (kappa [N = 90] = 0.84, p < 0.001; McHugh, 2012). Within themes, percentage agreement ranged between 62% and 92% (see Table 1; note that Table 1 includes disagreements, so item counts differ slightly from those in Table 2). Some themes had fewer items in the denominator, so the same number of disagreements had a greater impact than in themes with many prescriptions. Although the global assessment of reliability is a more accurate picture of inter-rater agreement across items, the within-theme percentage agreements show that some themes were more clearly articulated. Specifically, “motivate yourself,” “proactively manage time,” “be diligent,” and “communicate with your team” were clearer than “have fun” or “accommodate changes in lifestyle.”
After assessing inter-rater reliability, the 11 items that were sorted differently (disagreements) were discussed and placed into existing themes. We established themes before this step so that no theme depended on debated items. In rank order of contribution, the themes were to (i) proactively manage your time (i = 29 items; 32%); (ii) communicate with your team (i = 22; 24%); (iii) motivate yourself (i = 13; 14%); (iv) have fun (i = 10; 11%); (v) be diligent (i = 10; 11%); and (vi) accommodate lifestyle (i = 6; 7%).
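The frequency-based ranking above amounts to counting the items assigned to each theme and expressing each count as a share of all 90 statements. A minimal sketch, seeded with the published theme counts rather than the raw data:

```python
from collections import Counter

# Published item counts per theme, reconstructed as a flat list of labels
# (in the real analysis each label would come from the sorted statements).
labels = (["proactively manage your time"] * 29 + ["communicate with your team"] * 22
          + ["motivate yourself"] * 13 + ["have fun"] * 10
          + ["be diligent"] * 10 + ["accommodate lifestyle"] * 6)

counts = Counter(labels)
total = sum(counts.values())  # 90 advice statements
# Rank themes by frequency; percentage of all statements serves as salience.
ranking = [(theme, n, round(100 * n / total)) for theme, n in counts.most_common()]
for theme, n, pct in ranking:
    print(f"{theme}: i = {n}; {pct}%")
```

Running this reproduces the reported ordering, from “proactively manage your time” (32%) down to “accommodate lifestyle” (7%).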
In the second level of analysis, each theme was further sorted into subthemes for heightened clarity of recommendations (see Table 2). “Proactively manage your time” contained five subthemes: Start work early in the program (i = 11; 38%), strategize ways to manage time (i = 10; 34%), maintain self-care (i = 3; 10%), buffer for unexpected delays (i = 3; 10%), and prepare before the work start date (i = 2; 7%). Four subthemes were identified within the theme “communicate with your team,” each specific to a different purpose: lab performance (i = 10; 45%), self-advocacy (i = 6; 27%), networking (i = 4; 18%), and psychosocial support (i = 2; 9%). The third most frequently mentioned theme, “motivate yourself,” had four advice subthemes: Set goals (i = 5; 38%), be open to learning and new experiences (i = 3; 23%), persevere through difficulty (i = 3; 23%), and be passionate about your work (i = 2; 15%). “Be diligent” targeted tasks (e.g., reading, writing, record keeping; i = 6; 60%), working independently (i = 3; 30%), and acquiring new skills (i = 1; 10%). “Have fun” included enjoy the experience (i = 7; 70%), engage in recreational activities alone and with others (i = 2; 20%), and avoid negativity (i = 1; 10%). The theme “accommodate lifestyle” encouraged three things: dormitory residence (i.e., living in the dorm; i = 3; 50%), use of public transportation (i = 2; 33%), and exposure to local food culture (i.e., food trucks; i = 1; 17%).
This study answered our research questions as to the kinds of advice undergraduate summer research fellows would offer future cohorts, underlying themes of that advice, and which themes are most salient (i.e., frequently mentioned). We identified six higher-order themes with 22 subthemes detailing ways that undergraduate interns can prepare to thrive in STEM UREs, then ranked the themes and subthemes in terms of frequency. This information should assist future and current undergraduate researchers, program coordinators and administrators, and precollege programs feeding the STEM pipeline.
The higher-order themes most frequently mentioned were “proactively manage time” and “communicate with your team,” which together made up a little more than half of student-to-student advice. This suggests that student preparation and program resources to support time management and team communication are the most pressing needs. Future research fellows should reflect on strategies to address these demands with the aid of materials and instruction from program directors. Within “proactively manage time,” two subthemes represented about three-quarters of the advice: (i) start work early in the program (e.g., “start your reports and presentations early”), and (ii) strategize ways to manage time (e.g., “stay organized,” “plan,” and “don’t procrastinate”). Within “communicate with your team,” two subthemes again represented about three-quarters of the advice: communication to aid lab performance (e.g., “Go into your first mentor meeting with a list of what you want to get out of it: goals, timeline, practical first steps”) and communication for self-advocacy, which included bold statements such as “Bug the hell out of your principal investigator to start as early as possible.”
The remaining four themes made up the other 43% of peer advice. “Motivate yourself” was largely composed of goal-oriented advice (e.g., “If you’re not the best at self-motivating, get paired with the most involved mentor you can”), along with openness to experience, perseverance, and passion. “Be diligent” and “have fun” tied for fourth place, each with 10 items. “Be diligent” advice was split between task-specific diligence and diligence in working independently, with the emphasis on tasks (e.g., “Take notes and pictures of everything you do [in lab]”). “Have fun” was largely represented by the broad exhortation to simply enjoy the experience; going out for recreational activities, whether alone or with members of the cohort, was also recommended. The last theme, “accommodate lifestyle,” contained items that encouraged living in a dormitory because of its benefit to the experience (e.g., “[Take] housing even if you live in Portland!”), using the public transit system (e.g., “Don’t bring your car!”), and taking advantage of the culinary options in the city.
This study has several implications for how we think about research on STEM education in general and on short-term UREs in particular. Research on the antecedents and predictors of STEM retention has flourished in typical spaces for education, such as primary and secondary schools, colleges, and universities (e.g., Herrera et al., 2011; Osborne et al., 2003; Rodgers et al., 2014; Stake & Mares, 2001; Wang, 2013), but research into less typical contexts is needed. In a longitudinal study, Robinson et al. (2019) showed that students enter STEM with different science identities and perceived competencies, and these differences impact retention positively or negatively. Examination of UREs is valuable because hands-on and experiential learning have been shown to predict retention (Sheu et al., 2018) and can appeal to a wider range of identities and competencies. Research illustrates the positive impacts of UREs for STEM retention (Toven-Lindsey et al., 2015) and its correlates (e.g., Kuh, 2008; Lopatto, 2007) and even shows their potential for addressing gender equity issues in STEM (MacPhee et al., 2013).
Having identified undergraduate research environments as a rich point of intervention, this study is unique in its use of peer advice to inform interns and program preparation. It capitalizes on the scaffolding insights of Vygotsky’s (1978) social learning theory, in which more knowledgeable others are better able to set purposeful, incremental achievement goals than are individuals working independently. Although our peer advice did not take place in a live interaction, as Vygotsky’s zone of proximal development is conceived, the participants are closer to the URE and the incoming cohort than any other stakeholders (e.g., coordinators, faculty advisors) in position and experience. Future research should evaluate whether this type of advice is better received than advice from faculty or coordinators because of participants’ perceived similarity with other participants (Silvia, 2005) or whether it has the same beneficial effects as live peer mentoring (i.e., capitalizing on more opportunities and deriving more satisfaction; Holland et al., 2012). Interestingly, little of the advice had to do with technical skills (e.g., math, computer operation). Much of the advice pertained to the demands of working under tight deadlines, in unfamiliar settings, and with more autonomy than in traditional educational settings.
The findings of our study largely replicated the advice identified by Camacho and colleagues (2016), as well as some of the McClure et al. (2006) findings, despite the different targets. This study extended Camacho et al.’s findings in many ways: Our data were collected over six summer programs (rather than just one) in a program that was multidisciplinary (rather than one discipline) in a sample that was larger and more diverse (N = 60 vs. N = 4), and it used a more systematic empirical process. By using a quasi-multimethod approach similar to content analysis (Krippendorff, 2004; Morgan, 1993), we were able to use qualitative data and qualitative processes together with quantitative summaries. This is significant because the approach allowed (i) participant concerns to emerge without imposing a priori structure on the data, while (ii) still making it possible to quantitatively rank and compare qualitatively derived themes. Our use of content frequency as a proxy for priority, or salience, is open to criticism, but in terms of directing attention and resource allocation toward things that matter for students, this study is better positioned to start the conversation than a qualitative study in which all themes are essentially equivalent. Ultimately, how well the frequency of a theme’s mention reflects student priorities is an empirical question that deserves future research.
Finally, it is likely that our findings transfer beyond the context of UREs. For example, similar advice has been discussed on graduate school blogs (e.g., https://www.grad.ubc.ca/current-students/newly-admitted/tips) and media sources (Corcoran, 2018; Martin, 2020), albeit not through an empirical lens. Future research should assess how well our findings extend to other contexts and settings versus ones that are unique to UREs. To replicate this research in another URE, evaluators need only to ask participants completing the program, via survey, interview, or focus group, “What advice would you give to future participants of this URE?” then analyze the data using Lincoln and Guba’s (1985) cutting and sorting technique.
Many applications of this study are straightforward because of the evaluative design and immediate need for the data to assist program design and student experience. For example, advice about what to expect from a summer STEM URE, and how to respond to those expectations, is immediately relevant to potential applicants, incoming fellows, and struggling research interns. Although students’ challenges are easy to identify through these advice items, they contain prescriptive information rather than mere barrier descriptions. As action-oriented recommendations, advice is helpful because it translates easily to practice.
Less directly, program administrators, coordinators, and volunteering faculty can all read this advice to get an idea of where students have the most challenges or feel the least prepared. Perhaps even more helpful, coordinators and students can review the list together as a conversation starter to identify which items are most relevant to them or whether something missing from the list should be added. This knowledge may inform program design or grant-writing practices such that services are offered or funding is requested to specifically address the issues identified. It may be beneficial, especially early in the program, to offer workshops on time management and communication within a lab or professional academic setting in the interest of performance and self-care. Program coordinators might consider pre-program advisor trainings about student need or the use of student-advisor agreements to stimulate conversations around expectations and commitments.
This study is one of few empirical investigations on prescriptive peer advice for researchers in UREs. Despite its relative strengths in data quality, there are several limitations that temper our confidence that the findings are truly indicative of undergraduate researchers’ advice for a summer research internship.
The first is our assumption that content frequency serves as a legitimate proxy for salience, relevance, priority, or importance. Because of practical constraints on when data could be collected, there is reason to suspect that we captured a survey timing artifact. For example, the exit survey that provided the data for our analysis took place the day before the closing symposium, where students were to present a poster and an oral presentation of their summer’s work to their supervisors, program staff, and families. Ideally, advice would have been collected after this stressful rite of passage was over. Unfortunately, the student dormitory agreement ended during the symposium, meaning students had to have their belongings packed up and moved out before the symposium. Collecting data after the closing presentations and congratulatory catered symposium may have resulted in different advice, but attempting it would have come at great inconvenience to our participants, especially since they were no longer obligated to the program.
Another example is that PSU is on the quarter system, so some students from semester-system institutions had to arrive 1 or 2 weeks late to fall classes to complete the URE program, further straining any request to be surveyed post-program. On the other hand, there is also reason to believe that soliciting advice after a closure experience could bias the data in the opposite direction (Festinger & Carlsmith, 1959). In terms of properly articulating the URE’s realistic demands and associated stressors, the timing may have served to properly emphasize anxiety-causing elements of the program and the coping mechanisms that help address them.
A second limitation is that this sample included high-achieving students. Higher GPA is often correlated with urban or metropolitan area of residency, personal or family income, and other such demographic factors. Per this URE’s charter, the selection committee made conscious efforts to evaluate candidates beyond simple metrics of academic performance (e.g., GPA) in the interest of providing more equitable access. Although our sample was diverse in some respects (e.g., Native Americans participated in this URE at rates about 22 times higher [13% v. 0.6%] than their participation rates in STEM degree programs nationally), the applicant pool was still unique in that individuals were willing to spend all summer doing atmospheric science research instead of pursuing regular summertime activities. On the other hand, and to the extent that the selection process did accept a more diverse range of student backgrounds (e.g., rural students) than a typical URE, it is possible that our sample provided more or different types of advice than other students would have given. In either direction, the advice depicted here may not hold across groups. Ideally, advice would be collected from students with a range of backgrounds in math, computer science, early education, family support for education, and intelligence, along with other relevant variables, and advice could be compared within and between categories. This point supports our earlier suggestion on how to apply our list of advice (i.e., where URE students are shown the list and reflect on their strengths and weaknesses) rather than assume our list applies to everyone equally.
Other limitations include design features (e.g., cross-sectional data, no control group, no randomized selection or assignment), analytical constraints (descriptive advice statistics did not control for advisor style), nestedness (some advisors participating multiple years), cross-contamination (students within a cohort talking to one another), and individual differences. Some of these limitations were concessions for the benefit of other aspects of the program (faculty anonymity, student bonding), whereas others were not within the scope of this evaluation. Despite its limitations, our findings help direct URE students’ and program leaders’ efforts to make their experiences as rich and meaningful as possible.
This project was funded by NSF Award No. AGS-1659655.
Adam T. Murry (email@example.com) is an assistant professor of psychology and Nan Yuan is a master’s student in industrial-organizational psychology, both at the University of Calgary in Calgary, Alberta, Canada. Dean Atkinson is an associate professor in the Department of Chemistry at Portland State University in Portland, Oregon.
Camacho, A., Davis, J. L., Klett, S., Medina, H., Pineda, A. R., & VanSchalkwyk, S. (2016). Undergraduate research: Viewpoints from the student side. Math Horizons, 24(1), 23–25.
Corcoran, V. R. (2018, October 31). What I wish I knew before starting grad school. Inside Higher Ed. https://www.insidehighered.com/advice/2018/10/31/advice-about-grad-school-phd-holder-looking-back-decade-later-opinion
Drane, D., Micari, M., & Light, G. (2014). Students as teachers: Effectiveness of a peer-led STEM learning programme over 10 years. Educational Research and Evaluation, 20(3), 210–230.
Eagan, M. K., Jr., Hurtado, S., Chang, M. J., Garcia, G. A., Herrera, F. A., & Garibay, J. C. (2013). Making a difference in science education: The impact of undergraduate research programs. American Educational Research Journal, 50(4), 683–713.
Elrod, S., Husic, D., & Kinzie, J. (2010). Research and discovery across the curriculum. Peer Review, 12(2), 4–8.
Festinger, L., & Carlsmith, J. M. (1959). Cognitive consequences of forced compliance. The Journal of Abnormal and Social Psychology, 58(2), 203–210.
Herrera, F. A., Hurtado, S., & Chang, M. J. (2011, November 17–19). Maintaining career aspirations in science, technology, engineering, and mathematics (STEM) among college students [Paper presentation]. Annual Conference of the Association for the Study of Higher Education, Charlotte, NC, United States. https://www.heri.ucla.edu/nih/downloads/ASHE2011HerreraSTEMCareers.pdf
Holland, J. M., Major, D. A., & Orvis, K. A. (2012). Understanding how peer mentoring and capitalization link STEM students to their majors. The Career Development Quarterly, 60(4), 343–354.
Huey, R. B. (1987). Reply to Stearns: Some acynical advice for graduate students. Bulletin of the Ecological Society of America, 68(2), 150–153.
Krippendorff, K. (2004). Content analysis: An introduction to its methodology (2nd ed.). Sage Publications.
Kuh, G. D. (2008). Excerpt from high-impact educational practices: What they are, who has access to them, and why they matter. Association of American Colleges and Universities, 14(3), 28–29.
Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Sage Publications.
Linn, M. C., Palmer, E., Baranger, A., Gerard, E., & Stone, E. (2015). Undergraduate research experiences: Impacts and opportunities. Science, 347(6222). https://doi.org/10.1126/science.1261757
Lopatto, D. (2007). Undergraduate research experiences support science career decisions and active learning. CBE—Life Sciences Education, 6(4), 297–306.
MacPhee, D., Farro, S., & Canetto, S. S. (2013). Academic self-efficacy and performance of underrepresented STEM majors: Gender, ethnic, and social class patterns. Analyses of Social Issues and Public Policy, 13(1), 347–369.
Martin, D. C. (2020, May 27). How to be a successful grad student: Insider advice. CX College Express. https://www.collegexpress.com/articles-and-advice/grad-school/articles/life-grad-student/how-be-successful-grad-student-insider-tips
McClure, L. S., Combrink, T. S., Foor, C. E., Walden, S. E., & Trytten, D. A. (2006, June 18–21). I wish someone would’ve told me: Undergraduate engineering students offer advice to incoming students [Presentation]. American Society for Engineering Education Annual Conference and Exposition, Chicago, IL, United States. https://bit.ly/3IVJdLv
McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22(2), 276–282.
Morgan, D. C. (1993). Qualitative content analysis: A guide to paths not taken. Qualitative Health Research, 3(1), 112–121.
Murry, A., Yuan, N., & Atkinson, D. (2022). Relationships in science: Peer impact on research program benefits with undergraduates [Manuscript in preparation].
National Science Board. (2012). Science and technology: Public attitudes and understanding. In Science and Engineering Indicators 2012 (pp. 7-1–7-51). National Science Foundation. https://wayback.archive-it.org/5902/20170708073326/https://www.nsf.gov/statistics/seind12/pdf/c07.pdf
Osborne, J., Simon, S., & Collins, S. (2003). Attitudes towards science: A review of the literature and its implications. International Journal of Science Education, 25(9), 1049–1079.
Robinson, K. A., Perez, T., Carmel, J. H., & Linnenbrink-Garcia, L. (2019). Science identity development trajectories in a gateway college chemistry course: Predictors and relations to achievement and STEM pursuit. Contemporary Educational Psychology, 56, 180–192. https://doi.org/10.1016/j.cedpsych.2019.01.004.
Rodgers, K., Blunt, S., & Truble, L. (2014). A real PLUSS: An intrusive advising program for underprepared STEM students. NACADA Journal, 34(1), 35–42.
Ryan, G. W., & Bernard, H. R. (2003). Techniques to identify themes. Field Methods, 15(1), 85–109.
Sanders, D., & Welk, D. S. (2005). Strategies to scaffold student learning: Applying Vygotsky’s zone of proximal development. Nurse Educator, 30(5), 203–207.
Savitz-Romer, M., Jager-Hyman, J., & Coles, A. (2009). Removing roadblocks to rigor: Linking academic and social supports to ensure college readiness and success. Pathways to College Network, Institute for Higher Education Policy.
Sheu, H. B., Lent, R. W., Miller, M. J., Penn, L. T., Cusick, M. E., & Truong, N. N. (2018). Sources of self-efficacy and outcome expectations in science, technology, engineering, and mathematics domains: A meta-analysis. Journal of Vocational Behaviour, 109, 118–136. https://doi.org/10.1016/j.jvb.2018.10.003
Silvia, P. J. (2005). Deflecting reactance: The role of similarity in increasing compliance and reducing resistance. Basic and Applied Social Psychology, 27(3), 277–284.
Stake, J. E., & Mares, K. R. (2001). Science enrichment programs for gifted high school girls and boys: Predictors of program impact on science confidence and motivation. Journal of Research in Science Teaching, 38(10), 1065–1088.
Stearns, S. C. (1987). Some modest advice for graduate students. Bulletin of the Ecological Society of America, 68(2), 145–150.
Stemler, S. E. (2004). A comparison of consensus, consistency, and measurement approaches to estimating interrater reliability. Practical Assessment, Research, and Evaluation, 9(4), 1–11. https://doi.org/10.7275/96jp-xz07
Toven-Lindsey, B., Levis-Fitzgerald, M., Barber, P. H., & Hasson, T. (2015). Increasing persistence in undergraduate science majors: A model for institutional support of underrepresented students. CBE—Life Sciences Education, 14(2), ar12.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
Wang, X. (2013). Why students choose STEM majors: Motivation, high school learning, and postsecondary context of support. American Educational Research Journal, 50(5), 1081–1121.
Yaniv, I., & Milyavsky, M. (2007). Using advice from multiple sources to revise and improve judgments. Organizational Behavior and Human Decision Processes, 103(1), 104–120.