By S. Elizabeth Alter and Leo Gafney
Effective dissemination is a crucial yet often overlooked aspect of developing biology curricula for undergraduate students. Part of the issue lies with the way the term “dissemination” has traditionally been used in science education. For example, a National Science Foundation (NSF) sample dissemination plan lists “the university’s website … presentations at conferences … and through articles” as appropriate methods for disseminating new curricula (NSF, 2018). While in the past simply distributing information about new programs might have been enough to attract the attention of potential adopters, educators now have a much wider array of STEM education programs and learning innovations to select from. Dissemination is therefore likely to be most effective when program developers can anticipate concerns and obstacles and have answers ready for commonly asked questions. Like tending a garden, effective dissemination requires good seed (an effective program) and fertile soil (a receptive pool of potential adopters).
As Henderson et al. (2010) point out, in order to effect curricular change, new educational programs and materials must be accompanied by specific strategies, including efforts to create more reflective teachers, progressive pedagogical policies, and a shared vision among cooperating faculty. Bringing potential adopters into the conversation is critical in order to determine areas that might require additional facilitation or attention. In this paper, we survey the literature on dissemination and review specific strategies employed by projects that seek to develop new curricula across several STEM fields, examining the issues from the perspectives of both potential disseminators and adopters. We propose a framework for effective dissemination of research-based STEM curricula, highlighting factors that we find to be of particular importance to potential adopters. Our recommendations draw from a growing body of literature emphasizing that efforts at dissemination must consider the viewpoint of potential adopters regarding new programs and curricula.
A number of studies have examined institutional and cultural factors that can impede or encourage the dissemination of STEM curricula, but few have examined the relative importance of these factors quantitatively. Work by Rogers (2010) and others suggests four key factors that influence potential adopters: relative advantage, compatibility, complexity, and the potential to be tested or tried before a full commitment (referred to as “trialability”). Rogers’s (2010) generalized theory of the diffusion of innovations holds that the innovations that diffuse most successfully are those perceived to be more effective than current practice (relative advantage), compatible with the system in place (compatibility), easy to learn to use (complexity), and able to be tried or tested (trialability), along with those that can be used in new and perhaps unintended ways (reinvention). Importantly, potential adopters integrate these factors and may weigh them differently, so that, for example, an innovation that cannot easily be tried may still be attractive if it is deemed to have a very high relative advantage over the tool currently in use.
Hazen et al. (2012), working more specifically in the area of STEM education, assessed the importance of a set of factors related to dissemination success in engineering education, and arrived at many of the same conclusions. Particular factors were identified a priori, based on innovation diffusion theory, the technology acceptance model, and other related theories, and then assessed using a Delphi survey (a structured, systematic method for soliciting the judgments of a panel of experts) and content analysis of research posters on undergraduate STEM education. Hazen et al. found the top factor in dissemination success, based on the Delphi results, to be relative advantage, followed by other factors intrinsic to the project: usability, adaptability, and complexity. The same set of factors was also uncovered by Bourrie et al. (2014), again using a Delphi study, this time applied to NSF principal investigators and co-principal investigators involved in the Course, Curriculum, and Laboratory Improvement (CCLI) and Transforming Undergraduate Education in Science (TUES) programs.
Potential adopters are increasingly interested in evidence of student success. Owen and Stupans (2009) found strong support among practitioners for competency-based curriculum changes and dissemination. In addition, successful implementation is affected by appropriate support and interventions to ease or promote curricular change. Toward this end, the Concerns-Based Adoption Model (CBAM) (e.g., Hall, 1974; Anderson, 1997; Tunks & Weller, 2009; Roach et al., 2009) focuses on measuring and describing the responses of educators to new programs. One of the more significant contributions of the CBAM literature is the recognition that those interested in professional development can no longer be considered simply trainers; they must take on the roles of facilitators, mediators of learning, designers, and more. Loucks-Horsley (1996) and Roach et al. (2009) pursue this theme in their discussion of school-based consultants. Others have stressed the need for supportive administration and management (Ghosh, 2000; Owen & Stupans, 2009; Weerts, 2007). Dancy and Henderson (2008) point out that a dissemination model should anticipate that the adopter will be heavily involved in the development and adaptation of a practice to the local setting.
In considering best practices and constraints on dissemination, it is useful to review particular examples of innovative STEM strategies and programs, and to examine efforts toward dissemination with regard to the four key elements of the model under discussion (i.e., relative advantage, compatibility, complexity, and trialability). While these case studies in no way represent a comprehensive accounting of the hundreds of programs developed for undergraduate STEM learning around the United States and the world, our goal is rather to explore how some successful programs have envisioned and implemented dissemination.
Several decades ago, Eric Mazur (1997) introduced brief breaks into his large lecture classes at Harvard, during which he posed a physics problem along with multiple-choice responses. After attempting to answer individually, students discussed the problem in pairs and then responded a second time. The method is now widely used in STEM fields due to its demonstrated relative advantages, high compatibility, low complexity, and ease of trialability. Careful evaluations using comparison groups have shown the method to be more effective than traditional lectures for student learning (Zhang et al., 2017), and peer instruction appears to have particular benefits for students from underrepresented groups (Steer et al., 2009). The approach is compatible with any lecture course. While it requires some additional time and effort from the instructor, adopters have generally found that this time commitment pays for itself in increased learning. For example, Watkins and Mazur (2013) found that twice as many students switched from physics to non-STEM majors after taking a traditional introductory lecture course as after taking a course that permitted response and interaction.
Peer-Led Team Learning (PLTL) was developed in the field of chemistry, starting at City College of CUNY (e.g., Gosser & Roth, 1998; Gosser et al., 2001; Gafney & Varma-Nelson, 2008). This program integrates small, peer-led workshops into STEM courses. A careful effectiveness evaluation demonstrated relative advantage: The method leads to better student learning of basic skills and content, as well as better complex problem solving. The evaluation also uncovered a model comprising six components essential to productive implementation of the method: leader selection and training, appropriate group size and location, integration with the course, involvement of the course instructor, targeted materials, and administrative support. A follow-up, NSF-funded project provided mini-grants to adopters, adding further relative advantage and compatibility. For example, the funding for peer leaders was important and often not available without the grant. More significant was the fact that “the support became a two-way street with new strategies flowing from as well as to the new adopters” (Gafney & Varma-Nelson, 2008, p. 46). The adopters became part of the PLTL community with a sense of ownership. More recently, a literature review analyzed 67 studies and identified five evaluation themes of use to potential adopters of PLTL: student success measures, student perceptions, critical thinking, peer leaders, and variations on the model (Wilson & Varma-Nelson, 2016).
Many recent research-based programs developed in the sciences can be classified as either course-based or classroom research experiences (CUREs, CREs), focused at individual institutions, or inclusive Research Education Community (iREC) programs, in which a common scientific question and/or curriculum is pursued across multiple institutions. The latter approach is clearly much more labor-intensive to set up, but once it is up and running, many of the most difficult aspects of dissemination and encouraging adoption are centralized and made easier. In particular, faculty training and the creation of a community of users are critical pieces of such efforts, and these factors should strongly encourage additional adopters.
Science Education Alliance-Phage Hunters Advancing Genomic and Evolutionary Science (SEA-PHAGES) is an iREC that provides support for a research-based course for undergraduate science majors, focused on the isolation and characterization of bacteriophages (Hanauer et al., 2017). More than 104 institutions and 4,000 students have participated in SEA-PHAGES, making it one of the most widespread programs of its kind. SEA-PHAGES has demonstrated the relative advantage of its program in increasing student interest in STEM and retention, based on student-survey measurements of aspects such as project ownership and science identity (Hanauer et al., 2017). The program is likely to be most compatible with institutions seeking to revamp the undergraduate science curriculum and build in substantially more authentic research, as it involves a two-semester lab course that might not fit easily into preexisting departmental requirements. For this reason, it would be a challenge to conduct a short trial of SEA-PHAGES. Expense could be a factor in adoption, as the material and instructional costs are higher than for most traditional lab courses. The program is likewise complex in that faculty must be able to assist students with both laboratory and bioinformatics techniques. However, because it is an iREC, it is able to mitigate some of these issues by providing extensive programmatic support through a centralized administration, including experimental protocols, teacher workshops, an online network of practitioners, a biennial retreat for faculty, and an annual conference. Perhaps most usefully, SEA-PHAGES has developed an explicit model that links together program elements and outcomes, integrating information about relative advantage, compatibility, complexity, and trialability in one place.
In the Genomics Education Partnership (GEP), an iREC, students learn the basics of genome annotation and editing while contributing to the improvement of online genomic resources (Lopatto et al., 2008; Elgin et al., 2017). Centralized project management is necessary to ensure that groups do not replicate one another’s efforts. The relative advantage of the GEP is demonstrated through its broadening of participation in research experiences in a classroom setting, and through pre- and postcourse quizzes showing gains in student understanding of genomics and in general appreciation for the scientific process. Compatibility will vary by institution, as some computational resources are necessary and the project demands a relatively small class size; however, it does not involve wet-lab work. Training workshops and extensive web-based resources aid trialability and faculty comfort with curricular complexity.
Authentic Research Experience in Microbiology (AREM) provides a modular approach for integrating genomics research into the general biology curriculum by investigating the microbiome of New York City (Muth & McEntee, 2014). The “scalable module” was initiated at Brooklyn College, City University of New York, more than 10 years ago and has been adopted at more than 10 other CUNY sites, as well as a number of institutions around the country. The program guides students through the same kinds of research required in any introductory course in microbiology. It is therefore compatible with other curricula. Its focus on the urban biome and the authentic research element introduces a clear relative advantage over courses limited to labs and more traditional topics. An array of student and instructor materials offers assistance to implementers. Dr. Muth (professor of biology, Brooklyn College) has noted, however, that instructors require guidance as they implement the program (T. R. Muth, personal communication). A steep learning curve for the computational analysis needed as part of the project is currently mitigated by data processing done by AREM program staff, but computational needs (including a high-performance computing cluster) might be daunting to adopters in the absence of this assistance.
The current challenge is to describe the key elements of a dissemination model that would make a given program more likely to be implemented by potential adopters. Promotion of programs through websites, journals, conferences, and the like should ideally come after the developers have devoted attention to the primary concerns of potential adopters: relative advantage, compatibility, complexity, and trialability. We recommend that developers of innovative STEM learning programs consider the following elements and strategies when planning dissemination:

1. Show the program’s relative advantage over current curricula. Adopters must be convinced that a change will bring improvements in some important areas of teaching and learning. Before disseminating their program, developers should be able to show that their model has been evaluated and is more effective than the one in use by potential adopters (in practice, for large STEM courses this means lecture-based courses or “cookbook” laboratories). Metrics used to show effectiveness have included: (1) improved student learning as measured on standardized quizzes and exams, (2) improved grades, (3) better retention rates in the major or overall, and (4) immediate learning (as in Peer Instruction). Developers should use these types of metrics to create a concise document outlining the relative advantages compared with, for example, large lectures.

2. Demonstrate program compatibility with current coursework and requirements, as well as with class size and budget restrictions. Potential adopters will want to see that a new approach is consistent and compatible with the overall curriculum, as well as with the goals and end results regarding skills, knowledge, and expectations for courses.
If a department is making efforts to engage more minority students or to attract and retain more STEM majors, and is open to some experimentation with introductory courses, then its members will feel encouraged to consider the module. Gafney and Varma-Nelson (2008, p. 31) found that having someone from a similar institution successfully implement a new pedagogy was important to faculty members considering adoption. In the case of large public institutions, compatibility is often aided by shared curricula across campuses and a system for determining course equivalencies. Compatibility also entails the ability to scale the program to courses of different sizes. Borda et al. (2017) find the greatest challenge in scaling up to larger class sizes to be providing tailored, individual feedback. In addition, programs with hands-on laboratory and field components obviously become more challenging (though not impossible) with larger class sizes. Program developers should provide clear examples from a variety of departments and institutions demonstrating how the program has been, or could be, easily integrated into the curriculum. Finally, an important aspect of compatibility is overall cost. Some programs require, in addition to time costs, additional supplies, tools, or services, and some require directly paying students who act as peer mentors. A program’s ability to provide an accurate accounting and budget will make buy-in at the department and university level more likely. Developers should provide examples of how the innovation could be scaled up or down to fit particular budget constraints.

3. Provide a clear indication of the complexity of the program. Potential adopters are generally wary of programs that are too complicated and require extensive preparation, fearing that the benefits will not be worth the effort and time. Rogers (2010) hypothesizes that success will be negatively correlated with complexity.
Active learning, in particular, is often more complex, with a greater need for advance preparation, than it may first appear. D’Avanzo (2013) notes, “active teaching for many instructors is a pedagogy that requires repeated practice, feedback about efficacy, and ongoing discussion about theoretical underpinnings in the context of a faculty member’s own experiences.” The more complex the program, the greater the need for easily accessible materials that provide step-by-step instructions to STEM teachers. In addition, the creation of communities of practitioners, through means such as conferences, workshops, and listservs, ensures that implementers who run into roadblocks feel they have a place to turn for answers. Developers should create a flow chart or diagram that gives potential instructors a clear picture of what they would need to do to implement the program, with basic estimates of time, personnel, and supplies.

4. Demonstrate the trialability of the program. Adopters must be able to experience the advantages within the normal parameters of time and the customary investment of energy for planning and teaching. Programs become more attractive when various aspects can be broken out into sub-modules or experiences that can be tried with modest investment before faculty commit to revamping an entire semester’s worth of material. It will therefore be advantageous for program developers to explicitly design mini-experiences or modules that allow potential adopters to evaluate and experiment with the program, accompanied by an explicit description of how to implement the module and by evaluation tools. Partial investment might mean taking on a project in stages or starting with just one class or section.
Those presenting what is new and different must be aware of what instructors are willing and able to use. Regarding potential adopters, the product must be: demonstrably better than what is currently in use, compatible with course needs and expectations, not overly complex, and able to be tried on a small scale. These benefits can be—and in the best cases, are—demonstrated by program creators as part of the more traditional approaches to dissemination (e.g., via websites) to provide an overview of program benefits. However, the focus on the adopter must be more conscious and explicit. Effective program dissemination demands that we listen to and learn from the practitioners and students involved in undergraduate STEM courses so that we can best speak to the needs of the community.
We are grateful for helpful conversations with faculty in the Biology and Chemistry Departments at York College. SEA gratefully acknowledges support from NSF- 1433014.
S. Elizabeth Alter (firstname.lastname@example.org) is an associate professor in the Department of Biology at York College in Jamaica, New York. Leo Gafney is an independent education consultant.
Anderson S. E. (1997). Understanding teacher change: Revisiting the concerns based adoption model. Curriculum Inquiry, 27(3), 331–367.
Borda E., Boudreaux A., Fackler-Adams B., Frazey P., Julin S., Pennington G., & Ogle J. (2017). Adapting a student-centered chemistry curriculum to a large-enrollment context: Successes and challenges. Journal of College Science Teaching, 46(5), 8–13.
Bourrie D. M., Cegielski C. G., Jones-Farmer L. A., & Sankar C. S. (2014). Identifying characteristics of dissemination success using an expert panel. Decision Sciences Journal of Innovative Education, 12(4), 357–380.
Dancy M., & Henderson C. (2008, October). Barriers and promises in STEM reform. National Academies of Science Promising Practices Workshop.
D’Avanzo C. (2013). Post-vision and change: Do we know how to change? CBE—Life Sciences Education, 12(3), 373–382.
Elgin S. C., Hauser C., Holzen T. M., Jones C., Kleinschmit A., Leatherman J., & The Genomics Education Partnership. (2017). The GEP: Crowd-sourcing big data analysis with undergraduates. Trends in Genetics, 33(2), 81–85.
Gafney L., & Varma-Nelson P. (2008). Peer-led team learning: evaluation, dissemination, and institutionalization of a college level initiative. Springer.
Ghosh S. (2000). Integrating design into undergraduate honors theses in a computer engineering program: An experiment. IEEE Transactions on Education, 43(2), 203–210.
Gosser D. K., Cracolice M. S., Kampmeier J. A., Roth V., Strozak V. S., & Varma-Nelson P. (2010). Peer-led team learning: A guidebook (2nd ed). Prentice Hall.
Gosser D. K., & Roth V. (1998). The workshop chemistry project: Peer-led team learning. Journal of Chemical Education, 75(2), 185–187.
Hall G. E. (1974, April). The concerns-based adoption model: A developmental conceptualization of the adoption process within educational institutions [Paper presentation]. Annual Meeting of the American Educational Research Association, Chicago, IL.
Hanauer D. I., Graham M. J., SEA-PHAGES, Betancur L., Bobrownicki A., Cresawn S. G., Garlena R. A., Jacobs-Sera D., Kaufmann N., Pope W.H., Russell D. A., Jacobs W. R.Jr., Sivanathan V., Asai D. J., & Hatfull G. F. (2017). An inclusive Research Education Community (iREC): Impact of the SEA-PHAGES program on research outcomes and student learning. Proceedings of the National Academy of Sciences, 114(51), 13531–13536.
Hazen B. T., Wu Y., & Sankar C. S. (2012). Factors that influence dissemination in engineering education. IEEE Transactions on Education, 55(3), 384–393.
Henderson C., Finkelstein N., & Beach A. (2010). Beyond dissemination in college science teaching: An introduction to four core change strategies. Journal of College Science Teaching, 39(5), 18–25.
Lopatto D., Alvarez C., Barnard D., Chandrasekaran C., Chung H.-M., Du C., Eckdahl T., Goodman A. L., Hauser C., Jones C. J., Kopp O. R., Kuleck G. A., McNeil G., Morris R., Myka J. L., Nagengast A., Overvoorde P. J., Poet J. L., Reed K.,…Elgin S. C. R. (2008). Genomics education partnership. Science, 322(5902), 684.
Loucks-Horsley S. (1996). Principles of effective professional development for mathematics and science education: A synthesis of standards. NISE Brief, 1(1), 1.
Mazur E. (1997). Peer instruction: A user’s manual. Prentice Hall.
Muth T. R., & McEntee C. M. (2014). Undergraduate urban metagenomics research module. Journal of Microbiology & Biology Education, 15(1), 38.
National Science Foundation (NSF). (2018). NSF grant proposal guide 04-23 (document nsf19001).
Owen S., & Stupans I. (2009). Experiential placements: Dissemination and stakeholder engagement for curriculum planning action to prepare future pharmacy professionals. Journal of Learning Design, 3(1), 1–10.
Roach A. T., Kratochwill T. R., & Frank J. L. (2009). School-based consultants as change facilitators: Adaptation of the concerns-based adoption model (CBAM) to support the implementation of research-based practices. Journal of Educational and Psychological Consultation, 19(4), 300–320.
Rogers E. M. (2010). Diffusion of innovations. Simon and Schuster.
Steer D., McConnell D., Gray K., Kortz K., & Liang X. (2009). Analysis of student responses to peer-instruction conceptual questions answered using an electronic response system: Trends by gender and ethnicity. Science Educator, 18(2), 30–38.
Tunks J., & Weller K. (2009). Changing practice, changing minds, from arithmetical to algebraic thinking: an application of the concerns-based adoption model (CBAM). Educational Studies in Mathematics, 72(2), 161.
Watkins J., & Mazur E. (2013). Retaining students in science, technology, engineering, and mathematics (STEM) majors. Journal of College Science Teaching, 42(5), 36–41.
Weerts D. J. (2007). Toward an engagement model of institutional advancement at public colleges and universities. International Journal of Educational Advancement, 7(2), 79–103.
Wilson S. B., & Varma-Nelson P. (2016). Small groups, significant impact: A review of peer-led team learning research with implications for STEM education researchers and faculty. Journal of Chemical Education, 93(10), 1686–1702.
Zhang P., Ding L., & Mazur E. (2017). Peer instruction in introductory physics: A method to bring about positive changes in students’ attitudes and beliefs. Physical Review Physics Education Research, 13(1), 010104-1–010104-9.