
Research & Teaching

Computing in Bioinformatics and Engaged Student Learning

Student Perspectives on Anticipatory Activities and Innovative Apps

Journal of College Science Teaching—November/December 2022 (Volume 52, Issue 2)

By Tina A. Marcroft, Chris Rasmussen, and Scott T. Kelley

The field of biology education, and other science, technology, engineering, and mathematics (STEM) disciplines more broadly, has witnessed two major shifts in the past decade: (i) the increased awareness of research-based instructional strategies (RBIS) that are beneficial for student learning, and (ii) a recognition of the importance of computing in industry and academia. However, uptake of RBIS remains low, and opportunities for students to learn computing skills are limited. In this article, we present a combination of novel bioinformatics computing apps and associated anticipatory activities embedded in context. This case study examines students’ experiences with these apps and activities as enacted in a student-centered classroom that embraced a variety of RBIS.

 

The use of computational tools has increased dramatically across all science, technology, engineering, and mathematics (STEM) disciplines, and biology is no exception. DNA sequencing is now less costly due to the advent of next-generation sequencing (Mardis, 2008), and biologists’ analyses are currently limited by data processing capacity rather than by data scarcity (Scholz et al., 2012). As a result, the software used in such processing has become increasingly sophisticated. This complexity requires that undergraduate and graduate students be trained not only in the operation of these programs but also in understanding and creating the algorithms behind the software.

In response to these needs, one of this article’s authors, Scott Kelley, has developed a series of apps and corresponding activities (available at https://www.kelleybioinfo.org/) to help students learn about the algorithms employed by software such as Basic Local Alignment Search Tool (BLAST), Clustal, and RNAfold. These programs have a variety of functions, such as DNA sequence alignment, phylogeny reconstruction, and protein structure inference. Despite their varied functions, all of these programs rely on algorithms (i.e., procedures and/or mathematical equations) that model biological processes, and these algorithms are laden with assumptions based on our hypotheses about the mechanisms of biological phenomena. Students in bioinformatics and biology courses often practice using these “authentic” research tools, but, as in cookbook labs (Osborne, 2014), their experiences often amount to “pressing buttons,” and the software’s inner workings are left a mystery. The activities that Kelley provides on his website are intended to model the process by which these algorithms are conceived; students devise their own solutions to the same problems faced by biologists when creating such software. The apps associated with these activities then help students learn the formal algorithms behind the aforementioned, oft-used software (e.g., BLAST). The learning goals of the activities and the apps are to (i) have students take first steps in the algorithm reinvention process, (ii) help students feel connected to the scientific process by which these algorithms and software were developed, and (iii) scaffold students’ understanding of the formal algorithms.

Research goals

While some researchers have attempted to support students’ understanding of concepts in bioinformatics (e.g., Jungck et al., 2010; Madlung, 2018; Wilson Sayres et al., 2018), bioinformatics education is still a developing field (Attwood et al., 2019; Porter & Smith, 2019; Williams et al., 2019). To our knowledge, no comparable apps intended to promote students’ understanding of several common bioinformatics algorithms exist in the literature. Additionally, we found no studies that examine the use of such educational bioinformatics tools in the context of a classroom that incorporates a variety of research-based instructional strategies (RBIS; e.g., think-pair-share, randomized selection). Educational tools do not exist in isolation; their usage depends on the classroom environment (Brown et al., 1989; Master et al., 2016). Thus, rather than comparing the use of these activities and apps to outcomes such as grades, we were interested in the ways that students perceived the activities and apps as they were enacted in the classroom. We wanted to examine the relationship between the classroom context and students’ usage of the apps and activities, as we posited that RBIS such as collaborative learning would improve students’ perceptions of the activities and apps and invite students to use them. We thus investigated the following research questions: (i) What were students’ experiences with the activities and apps in the context of the class? (ii) How did their experiences align with the intended learning goals of the activities and apps?

Methods

Kelley has developed 12 pairs of activities and apps. The activities are “anticipatory learning sets” based on Madeline Hunter’s Instructional Theory Into Practice model of teaching (Hunter, 1994; Goldberg, 1990). These sets help students “anticipate” formal concepts by posing authentic research problems to them. Students first completed an anticipatory set individually (Figure 1), then discussed their strategies in pairs. A randomly selected student was then invited to present his or her method to the class. The class discussed various student strategies, which were elaborated on by the instructor. A variety of RBIS were employed in the class, including think-pair-share, collaborative learning, solicitation of student-generated problem-solving strategies, and randomized selection (Khatri et al., 2017). 

Figure 1. Sample anticipatory learning set.

Note. Students were given the following prompt: “Match the DNA and protein sequence pairs. For each case, match Query (Q) to Subject (S). Note: Match may not be perfect.” This figure shows the problem associated with this prompt (A) and two examples of student work (B, C). This activity poses a problem like that addressed by the Basic Local Alignment Search Tool (BLAST), one type of program (and associated algorithm) commonly used by biologists to search for similar DNA or amino acid sequences.

Students then began to work with the corresponding app and were encouraged to continue doing so at home (Figure 2). The apps were intended to scaffold students’ understanding of the formal algorithms by showing them how to manually perform actions or calculations (e.g., Chou-Fasman method, unweighted pair group method with arithmetic mean [UPGMA]). The apps allow students to play with input parameters to see how they affect the associated formal algorithm’s output. They also feature explicit step-by-step tutorials, links to professional tools (e.g., National Center for Biotechnology Information BLAST), unlimited randomization, and error highlighting.
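To illustrate the kind of manual calculation the apps scaffold, the following is a minimal Python sketch of UPGMA clustering; it is our own illustrative rendering under simplifying assumptions (a hypothetical four-taxon distance matrix stored as a dictionary keyed by pairs), not the code behind Kelley's apps.

# Minimal sketch of UPGMA: repeatedly merge the closest pair of clusters,
# averaging distances weighted by cluster size. Illustrative only.
def upgma(labels, dist):
    # labels: list of taxon names.
    # dist: dict mapping frozenset({a, b}) to the distance between a and b.
    # Returns the tree as nested tuples, e.g. ((('A', 'B'), 'C'), 'D').
    sizes = {name: 1 for name in labels}   # cluster -> number of taxa it contains
    dist = dict(dist)                      # work on a copy
    while len(sizes) > 1:
        # Step 1: find the pair of clusters separated by the smallest distance.
        a, b = min(dist, key=dist.get)
        merged = (a, b)
        # Step 2: distances to the new cluster are size-weighted averages.
        for other in sizes:
            if other in (a, b):
                continue
            d = (dist[frozenset({a, other})] * sizes[a]
                 + dist[frozenset({b, other})] * sizes[b]) / (sizes[a] + sizes[b])
            dist[frozenset({merged, other})] = d
        # Step 3: retire the two merged clusters.
        dist = {pair: v for pair, v in dist.items() if a not in pair and b not in pair}
        sizes[merged] = sizes.pop(a) + sizes.pop(b)
    return next(iter(sizes))

# Hypothetical distances for illustration; A and B are joined first.
taxa = ["A", "B", "C", "D"]
d = {frozenset(p): v for p, v in [(("A", "B"), 2), (("A", "C"), 4), (("A", "D"), 6),
                                  (("B", "C"), 4), (("B", "D"), 6), (("C", "D"), 6)]}
print(upgma(taxa, d))  # e.g., ((('A', 'B'), 'C'), 'D')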

Figure 2. Image of Kelley’s BLAST app corresponding to the anticipatory set activity shown in Figure 1.

Note. The app instructs users about the three basic steps of the BLAST algorithm. Students manipulate a variety of parameters to see how they affect the way the BLAST “words” are matched or scored with each subject (database) sequence. Instructions for the student were as follows: “First, break the Query sequence into all overlapping sequences of length k (in this case, k = 5), known as ‘k-mers’ or ‘words.’ Next, find the first k-mer that is a perfect match to the subject sequence (indicated by the checked box: ‘GATCA’). Finally, extend the match between Query and Subject both left and right from the matching k-mer and score the pairwise alignment (+5 for an identical letter, -4 for a different letter).”
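To make the three steps in the note concrete, here is a small Python sketch of BLAST-style word matching and ungapped extension, using the same k = 5 and +5/-4 scoring; the example sequences and function names are our own illustrative choices, not part of Kelley's app or of NCBI BLAST.

# Minimal sketch of BLAST-style seeding and extension; illustrative only.
MATCH, MISMATCH = 5, -4   # scores from the figure note

def kmers(seq, k=5):
    # Step 1: break the query into all overlapping words ("k-mers") of length k.
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def first_perfect_hit(query, subject, k=5):
    # Step 2: find the first query k-mer that matches the subject exactly.
    for q_pos, word in enumerate(kmers(query, k)):
        s_pos = subject.find(word)
        if s_pos != -1:
            return q_pos, s_pos
    return None

def extend_and_score(query, subject, q_pos, s_pos, k=5):
    # Step 3: extend the seed left and right and score the pairwise alignment.
    left = min(q_pos, s_pos)
    right = min(len(query) - q_pos, len(subject) - s_pos) - k
    score = 0
    for offset in range(-left, k + right):
        score += MATCH if query[q_pos + offset] == subject[s_pos + offset] else MISMATCH
    return score

# Hypothetical sequences; the seed found here is 'GATCA', echoing the figure.
query, subject = "TTGATCAGGA", "CAGATCAGGT"
hit = first_perfect_hit(query, subject)
if hit is not None:
    print(extend_and_score(query, subject, *hit))  # 7 matches, 3 mismatches -> 23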

These activities and apps were administered in an introductory bioinformatics course at a large state university. The course was attended by 52 students (32 undergraduate and 20 graduate students) and included two components: lecture and laboratory. The anticipatory learning sets and app investigation constituted much of the lectures. Students practiced using the professional tools associated with each formal algorithm in the laboratory, but the laboratory component was not examined in this study. The first two authors (Tina Marcroft and Chris Rasmussen) attended lectures and conducted focus groups while Kelley was the course instructor.

To address our research questions, we explored students’ experiences informally and conversationally by conducting focus groups (Wilkinson, 1998). Marcroft and Rasmussen recruited students by advertising the study in the classroom. We emphasized that different opinions were valued and that the instructor would not be privy to their participation until after grades were submitted. We convened three focus groups over 2 consecutive days with a total of seven participants; six were women and two were undergraduates. Participation was voluntary, and students received a $10 gift card as compensation. A focus group protocol was prepared in advance, but deviations were anticipated because, as is typical of focus group methodology, the conversation often followed the flow of students’ ideas.

The focus groups were recorded and transcribed. Transcripts were subjected to thematic analysis (Braun & Clarke, 2012) in which we generated several primary codes and subcodes. Marcroft and Rasmussen coded all transcripts until 100% agreement was reached. We used these codes to develop three emergent themes. Our analysis and reporting of themes emphasize the primacy of participant voices and respect for their viewpoints (Robbins, 2005). We carefully considered the data and their relationship to our codes and themes to avoid misrepresenting the participants.

Results

Three themes emerged from our analysis: safe spaces, positive perspectives of the apps’ utility, and the connection of the apps to the anticipatory learning sets. All student names are pseudonyms. Evidence for each theme appeared in similar proportions among the graduate and undergraduate students who participated in the focus groups.

Anticipatory sets: Safe spaces

The first theme that emerged from the focus groups was the high level of comfort students felt in sharing their informal ideas. Students felt safe to express their strategies without fear of evaluation or criticism—we refer to this environment as a safe space. For example, one student, Mara, described students’ presentation of their solution—one of the RBIS the instructor employed—as “more of a conversation. It wasn’t just us being lectured at. He would invite people to the board and ask them how they solved it. He wouldn’t tell them if it was right or wrong. He would just ask them if anyone else did it differently.” Helen similarly remarked that “[i]f a person I think gave the right answer, or had the right way of thinking about it, I think he was more just like, ‘Okay, that’s right.’ There [were] only a handful of times where he would actually do the problem and be like, ‘You should do it this way.’ Or he’d be leading you to the correct way to do it.” Both students felt the instructor was supporting their understanding of the formal algorithm without evaluation. 

Students also felt safe in the sense that mistakes were acceptable. Jane, a student who did not have an opportunity to share her solution with the class, stated that if she were randomly selected (another RBIS employed in the classroom), she “wouldn’t feel bad because [the instructor] didn’t embarrass anybody.” Olivia stated, “It was a supportive environment to do it, too. Not just, you know, like people are judging you if you say something wrong. And neither is the professor … he just wants you to try your best.” Susan’s impression was similar: “And even when people would make mistakes up—like when they’re presenting up front—it’s always just like, ‘Oh! I did that too!’ or ‘I can see why they did that …’” Helen interrupted, adding, “It’s more relaxed, it’s not like pressured or anything like that.” Susan continued, “Especially, once the pressure was off, and it’s like you’re not going to turn it in, just like, ‘Think about it.’ Like cross some stuff out, not a big deal, and then it was kinda fun to try and do it before we got the directions.” The students’ statements suggest that the lack of evaluation by the class and instructor helped relieve pressure and made the environment feel more relaxed. The instructor thus created a classroom environment where students felt safe to share their ideas. 

The students also recognized that the instructor solicited a variety of student strategies (another RBIS). They appreciated the diverse perspectives brought to each anticipatory set activity and found that hearing other students’ opinions was useful. For example, Helen said: 

There’s definitely different ways that you can figure it out. There’s not just one way, which was interesting to see. ... I would look at this ... and be like, “What the heck? Like how do I even do this?” And then the people start generating all these different ways to attack the problem. Feels pretty cool, then you can see it from different perspectives.

In a different focus group, Ava shared similar sentiments: 

[A] lot of times, you don’t exactly do it right, but you see other people doing it, and it was helpful to talk to other people afterwards. [The instructor] would make us talk to each other, and see what each other did, and you’re like, “Oh! That makes a lot more sense than what I did.” 

Susan also commented on the utility of hearing many perspectives: “[It] also helped to have it explained in another way or have somebody—also getting to know people around me has been super helpful, because then even when I was having ... problems, it wasn’t a big deal for me to [say], ‘Hold on, will you explain this to me?’” She added that “explaining something to someone else is a good way to learn too. Or a good way to ... problem solve.”

Safety was also reflected in the students’ appreciation of hearing from people who were perceived to be of equal experience. Jane said, “We had to discuss with each other, so we could understand that everybody was on our level.” Mara echoed this sentiment: “I found it [discussing the anticipatory sets] helpful. Because it wasn’t coming from people who had a large background in what we were doing. It was coming from people who were on my same level. So any line of logic they were following, I could also follow.” Helen also shared the same opinion:

People come up with all different ways that he [the instructor] doesn’t even know. There’s been times where you’re like, “Oh, you could look at it that way …” I think it’s nice to have different ways, because one way could be confusing to another person while another way could be so much easier to understand. ... When students go up and explain their way, it’s more relatable because the professor obviously—he does this for a living—he knows the most complex ways to figure out these problems. … Students, I don’t think, can really relate to that just because we’re not at that level yet. 

Students felt that hearing from other students was more “relatable.” All students used the term level or same level to describe the relative inexperience of the students, suggesting that they perceived little difference in experience between undergraduate and graduate students.

Apps: Positive perspectives

The second theme that emerged from our analysis centered on students’ perspectives of the apps as tools that enhanced their learning. Olivia and Ava suggested that the apps helped them understand how the formal algorithms work as well as the concepts behind them. Olivia said that “the app shows us how to do BLAST or … the phylogenies, and then we use the website that does it for us, so we … understand how it worked, not just … you can put DNA sequences in and you don’t know why or how it’s like aligning them but, if we understand it [the concept behind algorithms used], it’s more powerful for us.” Knowing how algorithms work was “powerful” for Olivia, and apps apparently helped her achieve that understanding. Olivia and Ava later reinforced this sentiment:

Ava: I used BLAST a ton in my undergrad and it was just—

Olivia: I never knew how it worked. … Yeah, now I’ll do like different alignment tools, and then I can determine which is the better one, based on my knowledge that I learned in class.

Ava: Right ... yeah, it’s really powerful if you understand the algorithms. And then you can go on to developing your own, which is what I, I kinda hope to do is … build off another algorithm and get it to where I want, so this is a good starting point.

Olivia and Ava suggested that other courses did not support their understanding of BLAST, despite requiring its use. In contrast, they reported that this course enhanced their understanding of the algorithm behind BLAST, so much so that they felt competent enough to create and evaluate new algorithms.

Implicit in the above statements is also the apps’ connection to practical tools like BLAST. Some students, such as Don, made this connection explicit. Don said that the apps were “basically just a dumbed-down version of real websites and real tools that we’re gonna be using.” Ava echoed that the apps were “grossly simplified” but also added that this is “what makes it nice. That’s what makes it helpful for us. ‘Cause you can’t understand the very large concepts if you don’t understand the small ones.” The simplicity of the apps was one of several features that students highlighted as supporting their learning.

Many of the students also indicated that the apps were effortless to use. Students frequently used words and phrases like straightforward and easy to understand when describing the apps. Mara stated that “if you just go to class … do the handouts and then follow up with the app … it is a really easy learning process.” The ease of using the apps is best exemplified by Olivia, who noted that a coworker who was not taking the course was able to use the apps without assistance:

[A]nother grad student in my lab, she’s never taken this class before, but I saw her ... using this website, and I [said], “Are you in my class?” And she said, “No, I was interested in maybe doing a rotation in [instructor’s] lab,” ’cause she’s a PhD student rotating in our lab, and she just wanted to know more about algorithms, and she was just on this. And I [said], “How are you doing it?” And she just reads the little tutorial thing and kinda just clicked around and figured it out. I thought it was kinda cool that someone who’s never taken the class—I never mentioned these to her, so she just found them on her own and used them and was able to understand it.

Most of the students indicated that they used the apps to study for the exam. Mara “went back to study for the midterm” and “spent 2 hours just cycling through all” of the apps to make sure she “really got it.” Helen recommended that other students should “definitely use” the apps. “Because for the exams ... it’s a great way to practice how to attack each problem.” Helen later noted that “probably a week before the exam or two weeks before,” she was “on that religiously, going over problems [on the apps].” Students used the apps as a study aid. They found that the unlimited randomized practice problems were particularly useful. The apps gave Susan “the opportunity to look at problems and have them regenerated with different answers … which even if you’re practicing on your own, if you are doing the same problem where you’re trying to create your own, it doesn’t work that well.” Helen added, “It’s so much easier just to get a new set of numbers versus you writing out everything.”

The final feature of the apps that students cited as enhancing their learning was the immediacy and precision of the feedback. Several of the apps assessed the students’ ability to perform the steps of the formal algorithm and provided instant feedback. Mara stated, “You could see one step at a time, so if you were doing a problem and you made a mistake, you could find exactly where the mistake was.” Susan commented, “I love doing these problems on the app[s]... and then inputting it and being told right away if you got it right,” adding, “Yeah. It’s very instant. Like, you got it right or you got it wrong.”

Connecting the apps and anticipatory learning sets

The third theme that emerged from our analysis was the explicit connection that students made between the apps and the anticipatory learning sets. The instructor designed the introductory tasks to anticipate the ideas and methods that students would subsequently see in the apps; he wanted to give students an opportunity to reinvent or otherwise approximate the algorithm on their own before being shown the conventional algorithm. Students in the focus group recognized and valued this relationship. Olivia viewed the introductory tasks as “simplified version[s]” of the apps that allowed them to “at least mostly figure out how it [the algorithm] works, without even knowing that much about it.” She added that the tasks were “just a way to think about [the algorithms] before we do [them] and realize we know more than we thought we did about” the topics. Olivia’s comment that these tasks allowed them to “realize we know more than we thought we did” suggests that one benefit of the anticipatory sets is their potential to increase students’ confidence in their abilities.

Similarly, Susan remarked, “Well, we always do the exercises before we learn how to do them [the formal algorithms].” Helen added, “So, I guess it’s thinking through what we’re about to learn.” She went on to comment on how algorithms introduced later in the course built on earlier ones: “[S]ome of these build upon each other, so if ... you are able to kinda understand the concept, building off the concepts, I guess. Maybe because there’s different ways you could figure out the problems.” Others in this same focus group commented that the introductory tasks allowed them to “think on [their] own” or follow their “own methodical way to think through it.” Students explicitly recognized that there are “different ways” to “think it through” and that this process enabled them to “understand the concept,” echoing the second theme, in which students perceived that the apps enhanced their learning.

Students also valued that the tasks helped them “learn how to do an algorithm,” where algorithm refers to the formal algorithms presented later by the apps. Ava reflected that these tasks helped them “start thinking about understanding the material.” Don echoed this remark: “They actually made you think, for like, the beginning of class, because typically he gave you them before you knew how to do it exactly. And so you had to figure out some way of doing it.” These students recognized that the introductory tasks were opportunities for them to “figure out some way of doing it” on their own and they valued them.

Mara and Jane expressed similar sentiments. Mara reflected on the purpose of introductory tasks: “I think he just wanted us to be actively thinking about what we were doing before we started.” Jane added, “Yeah, create our own reasoning … or make sure we understood the reasoning behind [the algorithm].” Jane’s comment echoes the second theme, that the introductory tasks fostered an understanding of the algorithm that went beyond procedural fluency to include the algorithm’s design and invention. 

Students in the focus groups recognized the connection between the anticipatory learning sets and the apps, and they valued designing their own algorithms before being shown the conventional method. Students believed that doing so helped them understand the reasoning behind the algorithm and process the formal information provided by the instructor.

Conclusion

In this study, we investigated students’ perspectives on a series of anticipatory set activities that facilitated students’ reinvention of formal bioinformatics algorithms, their perspectives on the apps associated with those anticipatory sets, and how the RBIS employed in the classroom encouraged students to participate. While future studies will examine the relationship between app usage and achievement, this study examines the activities’ and apps’ usage in relation to the RBIS employed in a student-centered classroom. We found that students perceived the environment to be welcoming as a result of the RBIS employed and that they felt encouraged to share their self-developed algorithms in response to the anticipatory learning sets. Students also believed the associated apps were useful and enhanced their understanding of the formal algorithms. The students recognized and appreciated that the anticipatory learning sets were intended to prompt them to reinvent formal algorithms that were subsequently introduced. We present these results, however, with the recognition that our study overrepresents graduate students and women; the range of perspectives exhibited may not reflect the full range of student perspectives on the RBIS and the apps in this course or in other learning environments.

Our study suggests that the collaborative nature of the classroom environment invited the students to further engage in and interact with both the anticipatory learning sets and the apps. Critics of technology-based efforts to improve learning (such as apps) claim that classroom interaction patterns and classroom social norms have not been taken into account (Bagozzi, 2007). Although this study does not directly track classroom interaction patterns and social norms, our findings provide some evidence that the positive social environment allowed for deeper engagement to occur. Indeed, a sense of belonging is often associated with general engagement or interest in STEM (Thoman et al., 2014) and changing the environment to be more student centered can increase this sense of belonging (Master et al., 2016). 

This case study also suggests that this instructor’s use of RBIS and innovative apps may help women associate their work in bioinformatics with computer science. All participants except for one were women; all expressed positive opinions about the course, RBIS, activities, and apps. Upon graduating from high school, women have reported that their top reason for potentially pursuing computer science is its applicability to other fields (Carter, 2007). As biology currently exhibits greater gender parity than computer science (Snyder et al., 2016) and computing is used increasingly often in biology, bioinformatics could be an avenue to increase women’s interest in identifying as computer scientists. Future studies, however, are needed to pursue this conjecture.

In summary, our study provides a kind of existence proof of an approach in which student-centered teaching practices facilitated the use of innovative computing tools, all of which were perceived positively by students. Future work is necessary to investigate the pedagogical training required for bioinformatics faculty to create similarly positive and productive learning environments in their own classrooms.

Acknowledgment

This work was supported by the National Science Foundation under grant number 1612576. The opinions expressed do not necessarily reflect the views of the Foundation.


Tina A. Marcroft (tmarcrof@ucsd.edu) is a doctoral student in the mathematics and science education doctoral program at the University of California, San Diego, and San Diego State University and is part of the Center for Research in Mathematics and Science Education in San Diego, California. Chris Rasmussen is a professor in the Department of Mathematics and Statistics, and Scott T. Kelley is a professor in the Department of Biology, both at San Diego State University in San Diego, California.

References

American Association for the Advancement of Science (AAAS). (2011). Vision and change in undergraduate biology education: A call to action. AAAS. https://visionandchange.org/wp-content/uploads/2013/11/aaas-VISchange-web1113.pdf

Attwood, T. K., Blackford, S., Brazas, M. D., Davies, A., & Schneider, M. V. (2019). A global perspective on evolving bioinformatics and data science training needs. Briefings in Bioinformatics, 20(2), 398–404. https://doi.org/10.1093/bib/bbx100

Bagozzi, R. P. (2007). The legacy of the technology acceptance model and a proposal for a paradigm shift. Journal of the Association for Information Systems, 8(4), 244–254.  https://doi.org/10.17705/1jais.00122

Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology, vol 2: Research designs: Quantitative, qualitative, neuropsychological, and biological (pp. 57–71). American Psychological Association. https://psycnet.apa.org/doi/10.1037/13620-004

Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42. https://doi.org/10.3102/0013189X018001032

Carter, L. (2007). Why students with an apparent aptitude for computer science don’t choose to major in computer science. Proceedings of the Thirty-Seventh SIGCSE Technical Symposium on Computer Science Education Bulletin, 38(1), 27–31. https://doi.org/10.1145/1124706.1121352

Eagan, M. K. (2016). Becoming more student-centered? An examination of faculty teaching practices across STEM and non-STEM disciplines between 2004 and 2014. Higher Education Research Institute, University of California, Los Angeles.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111

Goldberg, M. F. (1990). Portrait of Madeline Hunter. Educational Leadership, 47(5), 141–143.

Hunter, M. (1994). Mastery teaching: Increasing instructional effectiveness in elementary and secondary schools, colleges, and universities. Corwin Press. 

Jungck, J. R., Donovan, S. S., Weisstein, A. E., Khiripet, N., & Everse, S. J. (2010). Bioinformatics education dissemination with an evolutionary problem solving perspective. Briefings in Bioinformatics, 11(6), 570–581. https://doi.org/10.1093/bib/bbq028

Khatri, R., Henderson, C., Cole, R., Froyd, J. E., Friedrichsen, D., & Stanford, C. (2017). Characteristics of well-propagated teaching innovations in undergraduate STEM. International Journal of STEM Education, 4(1), Article 2. 

Madlung, A. (2018). Assessing an effective undergraduate module teaching applied bioinformatics to biology students. PLOS Computational Biology, 14(1), e1005872. https://doi.org/10.1371/journal.pcbi.1005872

Mardis, E. R. (2008). The impact of next-generation sequencing technology on genetics. Trends in Genetics, 24(3), 133–141. https://doi.org/10.1016/j.tig.2007.12.007

Master, A., Cheryan, S., & Meltzoff, A. N. (2016). Computing whether she belongs: Stereotypes undermine girls’ interest and sense of belonging in computer science. Journal of Educational Psychology, 108(3), 424–437. https://doi.org/10.1037/edu0000061

National Research Council. (2003). BIO2010: Transforming undergraduate education for future research biologists. The National Academies Press. 

Osborne, J. (2014). Teaching scientific practices: Meeting the challenge of change. Journal of Science Teacher Education, 25(2), 177–196. https://doi.org/10.1007/s10972-014-9384-1

Porter, S. G., & Smith, T. M. (2019). Bioinformatics for the masses: The need for practical data science in undergraduate biology. Omics: A Journal of Integrative Biology, 23(6), 297–299. https://doi.org/10.1089/omi.2019.0080

President’s Council of Advisors on Science and Technology. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. U.S. Office of Science and Technology Policy. https://bit.ly/3DO390b

Robbins, J. (2005). “Brown paper packages”? A sociocultural perspective on young children’s ideas in science. Research in Science Education, 35(2–3), 151–172. https://doi.org/10.1007/s11165-005-0092-x

Scholz, M. B., Lo, C., & Chain, P. S. G. (2012). Next generation sequencing and bioinformatic bottlenecks: The current state of metagenomic data analysis. Current Opinion in Biotechnology, 23(1), 9–15. https://doi.org/10.1016/j.copbio.2011.11.013

Snyder, T. D., de Brey, C., & Dillow, S. A. (2016). Digest of education statistics 2015 (51st ed.). National Center for Education Statistics. https://nces.ed.gov/pubs2016/2016014.pdf

Stains, M., Harshman, J., Barker, M. K., Chasteen, S. V., Cole, R., DeChenne-Peters, S. E., Eagan Jr., M. K., Esson, J. M., Knight, J. K., Laski, F. A., Levis-Fitzgerald, M., Lee, C. J., Lo, S. M., McDonnell, L. M., McKay, T. A., Michelotti, N., Musgrove, A., Palmer, M. S., … Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470. https://doi.org/10.1126/science.aap8892

Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., Chambwe, N., Cintrón, D. L., Cooper, J. D., Dunster, G., Grummer, J. A., Hennessey, K., Hsiao, J., Iranon, N., Jones II, L., Jordt, H., Keller, M., Lacey, M. E., Littlefield, C. E., … Freeman, S. A. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences, 117(12), 6476–6483. https://doi.org/10.1073/pnas.1916903117

Thoman, D. B., Arizaga, J. A., Smith, J. L., Story, T. S., & Soncuya, G. (2014). The grass is greener in non-science, technology, engineering, and math classes: Examining the role of competing belonging to undergraduate women’s vulnerability to being pulled away from science. Psychology of Women Quarterly, 38(2), 246–258. https://doi.org/10.1177/0361684313499899

Wilkinson, S. (1998). Focus group methodology: A review. International Journal of Social Research Methodology, 1(3), 181–203. https://doi.org/10.1080/13645579.1998.10846874

Williams, J. J., Drew, J. C., Galindo-Gonzalez, S., Robic, S., Dinsdale, E., Morgan, W. R., Triplett, E. W., Burnette III, J. M., Donovan, S. S., Fowlks, E. R., Goodman, A. L., Grandgenett, N. F., Goller, C. C., Hauser, C., Jungck, J. R., Newman, J. D., Pearson, W. R., Ryder, E. F., Sier, M., … Pauley, M. A. (2019). Barriers to integration of bioinformatics into undergraduate life sciences education: A national study of US life sciences faculty uncovers significant barriers to integrating bioinformatics into undergraduate instruction. PLOS ONE, 14(11), 1–19. https://doi.org/10.1371/journal.pone.0224288

Wilson Sayres, M. A., Hauser, C., Sierk, M., Robic, S., Rosenwald, A. G., Smith, T. M., Triplett, E. W., Williams, J. J., Dinsdale, E., Morgan, W. R., Burnette III, J. M., Donovan, S. S., Drew, J. C., Elgin, S. C. R., Fowlks, E. R., Galindo-Gonzalez, S., Goodman, A. L., Grandgenett, N. F., Goller, C. C., … Pauley, M. A. (2018). Bioinformatics core competencies for undergraduate life sciences education. PLOS ONE, 13(6), e0196878. https://doi.org/10.1371/journal.pone.0196878

