Meeting the Time Demands of Highly Structured Courses
By Scott Freeman, Pamela Pape-Lindstrom, Anne Casper, and Sarah Eddy
Since their founding in 1901, community colleges have focused on a single mission: improving access to higher education. Thanks to their open admission policies and affordability, they have been spectacularly successful. Community colleges currently enroll about 41% of all undergraduates in the United States and, relative to bachelor’s-degree-granting institutions, educate a disproportionately large percentage of students from demographic groups that are underrepresented in STEM fields (American Association of Community Colleges, 2017; Community College Research Center, 2017; Ginder et al., 2017; Labov, 2012; Snyder et al., 2016).
Unfortunately, community college students are also subject to disproportionately high rates of attrition: over 70% leave school without completing an associate’s or bachelor’s degree, compared with the 40% of students at four-year schools who fail to complete a bachelor’s degree (Juszkiewicz, 2015; National Center for Education Statistics, 2017). Students who are returning to school or who are of nontraditional age are at particularly high risk of withdrawing, as they are often working to support themselves and juggling family responsibilities (Johnson et al., 2016). Other barriers to success for community college students may include speaking English as a second language and entering with poor academic preparation (Johnson et al., 2016).
Under-preparation can also place students at risk of failure at comprehensive universities. Many of these four-year, master’s-degree-granting institutions, also called regional public universities, resemble community colleges in maintaining nearly open enrollment policies.
Retention in college is important not only for students themselves, but for the United States as a nation. For example, Carnevale and Rose (2011) estimated that the U.S. economy would benefit from an additional 15 million graduates with four-year degrees by 2025, and the President’s Council of Advisors on Science and Technology (2012) called for an additional one million graduates with STEM degrees per year. Meeting these goals will be difficult or impossible unless success rates at community colleges increase sharply.
For STEM-interested community college students of all ages and backgrounds, one of the most important predictors for retention is level of performance in their initial STEM courses (Bahr et al., 2017; Chen & Soldner, 2013). Poor grades trigger withdrawals. What innovations can help students overcome barriers to success and achieve the types of grades that encourage persistence in STEM?
Recent work has shown that for all undergraduates in STEM, increased active learning during class time—defined as problem-solving and discussions that engage students, often in groups, in the process of learning—can improve course performance (Freeman et al., 2014; Freeman et al., 2011). In addition, achievement gaps that impact underrepresented students can be reduced or eliminated by high-structure, flipped, or inverted courses—where intensive active learning is combined with required preclass preparation activities and postclass exam preparation exercises (Eddy & Hogan, 2014; Haak et al., 2011). Based on these data, policy makers and opinion leaders advocate for high-structure course designs that emphasize active learning during class as a solution to historical problems with retention in the STEM disciplines (President’s Council of Advisors on Science and Technology, 2012; Wieman, 2014).
There may be a catch, however. Most research on innovative course designs has taken place at research-intensive universities, not at community colleges (Schinske et al., 2017). This observation is important because the efficacy of high-structure courses depends in part on assigning more work outside of class than is typical of traditional lecture courses (Ruiz-Gallardo et al., 2011). Thus, it is legitimate to question whether course designs with high demands on “the other 23 hours” can be successful at community colleges, given that students at those institution types may be more time-constrained than students at research-intensive institutions. For example, Clement (2016) found that almost 55% of community college students surveyed had a job, and that 35% were working more than 16 hours per week.
Do community college students actually have more out-of-school commitments than their peers at other institution types, and if so, do they have enough study time available to meet the demands of high-structure courses? If the answers to these two questions are yes and no, respectively, then a healthcare analogy is relevant: a beneficial drug, developed by education specialists at research-intensive universities, may be priced out of reach, in terms of the time commitment required for “treatment,” for the student populations who would benefit the most.
Although study time was the most salient question motivating this study, we also wanted to explore how many hours per week students at different institution types spent working or volunteering in a field related to their course of study. This query was inspired by the literature on the positive impacts of undergraduate research experiences and internships in promoting retention in STEM (e.g., Fakayode et al., 2014). Do students at community colleges have time to participate in noncourse activities that can play an important role in professional development?
To address these questions, we designed an online survey to assess how students partition their time among competing activities. We administered the instrument in introductory biology courses for majors across three institution types in each of two states in the United States. We studied these three institution types because they host the clear majority of undergraduates in the United States; we studied two states to furnish a replicate and contrast quarter versus semester systems. Our goal was to generate data that will help instructors and administrators understand the trade-offs that students at different institution types are making in response to course demands.
We surveyed students at two community colleges in the state of Washington and two community colleges in the state of Michigan, along with the comprehensive university and the research-intensive university in each state where most of the community college students in the sample intended to transfer (Table 1). At all eight institutions, we administered the survey in the first course in the introductory sequence required of biology majors. The four schools in the state of Washington were on the quarter system, while the four schools in the state of Michigan were on the semester system. Students completed the survey in approximately the seventh week of a 10-week quarter or the seventh week or later of a 14-week semester. In every case, we were careful to schedule the survey during a week when no high-stakes exams were scheduled in the focal course.
|Table 1. Courses sampled.|
To determine how highly structured each course in the sample was, three of the authors independently inspected syllabi from each course and determined whether instructors tasked students with completing assignments outside of class rarely, weekly, or daily. The three researchers then met to reach the consensus determination reported in Table 1. Following Freeman et al. (2011) and Haak et al. (2011), we interpreted courses with daily assignments as highly structured. Although the nature of these assignments differed among instructors, information from the instructors involved confirmed that all attempted to design assignments that directed students’ time and attention to content that had challenged their students in the past.
Because no standardized instrument exists to gather data on student time allocation, we designed survey questions with consultation from experts in survey design and time-management studies (see Acknowledgments). The full survey is available in Appendix 1; in essence, we asked students to estimate how many hours they spent on various activities in a typical week that did not include a biology exam. After stating how many credits and courses they were taking in the current term, students answered a series of questions asking about the time they allocated to each activity. Each question appeared as a separate page in the survey; responses (including 0) were required for each. We did not ask students to estimate how many hours they spent eating, in personal care, or in scheduled classes, laboratories, or discussion sections. Most instructors gave a small number of course points to motivate student participation, and all students were told that their instructors would not see their data during the term in question.
In at least some contexts, self-report data need to be interpreted cautiously. Self-report data on time allocation can be subject to inter-individual variation due to different definitions or interpretations of tasks (Burke et al., 2000). In addition, over- or under-estimation of time spent can occur in low- and high-time-investment individuals, respectively (Collopy, 1996). Our use of self-report data is based on the premise that students we surveyed at different institutions are comparable in terms of the frequency and direction of these biases.
We analyzed all data in R (R Core Team, 2013). Preliminary analyses indicated regional differences in the amount of time spent studying for the focal biology course. We attributed this to differences between the quarter and semester systems: students on the quarter system typically take three courses at a time, versus four for students on the semester system. Accordingly, we added state as a predictor variable to control for this difference in subsequent analyses.
To test the hypothesis that nonacademic obligations differ significantly among students from different institution types, we modeled the time students reported spending on nonacademic activities, both those related and those unrelated to a student’s intended field. These obligations included work for pay and volunteering inside or outside of the student’s intended field, commuting, family care, and participating in sports or student organizations, all reported in units of half hours. The survey questions in Appendix 1 give a full listing of the possible activities that students considered (Table 2). Students reported a median of 20 hours spent on these activities per week, with a range from zero to 143 hours. Because this variable was highly right-skewed, it was log10 transformed for this analysis. In addition, given the large range in student responses, some of the reported times (such as 143 hours per week) seemed unreliable. Lacking an objective way to identify and delete unreliable estimates of time, we used Tukey’s outer fence (three times the interquartile range beyond the quartiles) to identify outliers.
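As an illustration of this screening step, the sketch below applies Tukey’s outer-fence rule and the log10 transform to a small set of hypothetical weekly reports. This is a minimal Python sketch for readers; the analysis itself was done in R, and the +1 offset used to keep zero-hour reports defined is our assumption, not something the paper specifies.

```python
import math
from statistics import quantiles

def tukey_outer_fence_outliers(hours):
    """Return values beyond Tukey's outer fences: more than 3 * IQR
    below the first quartile or above the third quartile."""
    q1, _, q3 = quantiles(hours, n=4)  # default "exclusive" quartile method
    iqr = q3 - q1
    lower, upper = q1 - 3 * iqr, q3 + 3 * iqr
    return [h for h in hours if h < lower or h > upper]

def log10_transform(hours):
    """Log10-transform right-skewed hour counts; the +1 offset keeps
    zero-hour reports defined (our assumption -- the paper does not say
    how zeros were handled before the transform)."""
    return [math.log10(h + 1) for h in hours]

# Hypothetical weekly nonacademic-obligation reports, including the kind
# of implausible value (143 hours) that the screen is meant to flag.
reports = [0, 5, 10, 15, 18, 20, 20, 22, 25, 28, 30, 35, 143]
print(tukey_outer_fence_outliers(reports))  # -> [143]
```

Note that the outer fence (3 × IQR) is deliberately conservative: only extreme values such as the 143-hour report are flagged, while merely large but plausible reports are retained.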
|Table 2. How the undergraduates surveyed report spending their time, by institution type.|
In addition to testing the explanatory importance of institution type, our model accounted for differences between the two regions and included a random effect for instructor. We used this random effect, symbolized as 1|Instructor, to control for course-, section-, or term-specific variation that might have affected time allocation, such as course difficulty or instructor enthusiasm. The following linear mixed model was fit by restricted maximum likelihood (REML), using t-tests based on the Satterthwaite approximation: Log10(nonacademic obligation hours) ~ Institution type × Region + 1|Instructor.
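For readers less familiar with R’s model syntax, the formula above corresponds to a random-intercept model. Written out (our notation, with subscripts added for clarity), it is:

```latex
\log_{10}(y_{ijk}) = \beta_0 + \alpha_i + \gamma_j + (\alpha\gamma)_{ij} + u_k + \varepsilon_{ijk},
\qquad u_k \sim \mathcal{N}(0,\sigma_u^2), \quad \varepsilon_{ijk} \sim \mathcal{N}(0,\sigma^2)
```

where y_ijk is the nonacademic obligation hours reported by a student at institution type i in region j taught by instructor k; α and γ are the fixed effects of institution type and region, (αγ) is their interaction, and u_k is the instructor-level random intercept.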
Based on initial testing of the higher-order term, we found no support for including the interaction between institution type and region (χ-value = 4.3, p = 0.16). Thus, the final model includes these factors only as main effects. This final model was then run with and without outliers to test the sensitivity of the results to extreme reported values.
The linear mixed-effects model that we fit to answer this question is summarized in Table 3; it indicates that institution type is the only significant predictor of time devoted to nonacademic obligations. Community college students devoted almost twice as much time to noncourse-related commitments as students at the research-intensive universities (R1s) (Figure 1; p = 0.001). There was no difference in this aspect of time allocation between students at comprehensive universities and R1s. Running the model on the dataset without outliers did not change these results.
|Table 3. Nonacademic obligations vary by institution type.|
If community college students have more nonacademic obligations than students at other types of institutions, do they have less time to study biology than their peers at regional comprehensive universities and R1s? If so, it would suggest that they may not be able to meet the time demands of a high-structure course—one that includes daily required assignments outside of class. To address this issue, we constructed a model to predict study time that accounted for the differences in geographic region, time spent on nonacademic obligations, and included a random effect for instructor. In this model, the degree of structure was indexed by the number of outside-of-class assignments, binned as rare, weekly, or daily (Table 1); institution type was binned as community college, comprehensive university, or research-intensive university.
In this sample, students reported a median of eight hours per week studying for their introductory biology class in a typical nonexam week, but reports ranged from 0 to 70 hours. Because this variable was also highly right-skewed, it was log10 transformed for this analysis. As with obligation time, we suspected that the time estimates in the tail of this distribution were unreliable.
Our initial full model was: Biology study hours ~ Institution type + Nonacademic obligations + Outside-of-class assignments + Region + Institution type × Region + Nonacademic obligations × Outside-of-class assignments + 1|Instructor.
Initial examination of the higher-order terms suggested that “Nonacademic obligations × outside of class assignments” was not necessary to retain in the model (χ-value = 2.05, p = 0.35), but that “Institution × Region” should be retained (χ-value = 18.4, p = 0.001). This final model was run with and without outliers to test the sensitivity of the results to extreme reported values.
Table 4 summarizes the results for the fixed effects in the final model. The frequency of out-of-class assignments did not significantly influence the hours students reported studying per week (Figure 2a). There were also no significant main effects of institution type or region on reported study time. However, model selection supported retaining the interaction term, which suggests that the relationship between institution type and reported study time varied by region (Figure 2b). The results also show that nonacademic obligations did not explain a significant amount of the observed variation in study time. Although the estimate did not change substantially when we ran the model without outliers, hours spent on nonacademic obligations approached significance (β = 0.0004 ± 0.0002, p = 0.055).
|Table 4. Predicting how much time students spend studying biology: model results.|
There is an important difference between time that students spend working for pay or volunteering in activities relevant to their intended career path and nonstudying hours spent on paid or volunteer activities unrelated to their intended field. We designated the former In-Field and the latter Not-In-Field. The majority of students (69%) did not report any In-Field hours. For those who did, the median was six hours, with a range of 0.5 to 57 hours and a large positive skew.
We chose to model this outcome variable as a binary (1 = In-Field experience, 0 = no In-Field experience) rather than as hours reported, because so few students reported any such hours. Accordingly, we used a generalized linear mixed-effects model with a binomial distribution to model the data. The initial model was: In-Field ~ Institution type + Region + (Institution type × Region) + 1|Instructor.
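The outcome coding and the binomial (logit) link can be sketched as follows. This is a Python illustration only; the model itself was fit in R, and the function names are ours.

```python
import math

def code_in_field(in_field_hours):
    """Code the outcome as binary: 1 if a student reported any paid or
    volunteer hours in their intended field, 0 otherwise."""
    return [1 if h > 0 else 0 for h in in_field_hours]

def inv_logit(eta):
    """Inverse logit link used by a binomial GLMM: maps the linear
    predictor (log-odds) to a probability of In-Field experience."""
    return 1.0 / (1.0 + math.exp(-eta))

# Hypothetical reports: most students report zero in-field hours,
# which motivates the binary coding over modeling raw hours.
print(code_in_field([0, 6, 0, 0.5, 0]))  # -> [0, 1, 0, 1, 0]
print(round(inv_logit(0.0), 2))          # -> 0.5
```

Modeling presence/absence rather than hours avoids fitting a continuous model to a distribution in which most observations are exactly zero.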
An initial test for the inclusion of higher-order terms found no support for retaining the interaction term (χ-value = 1.65, p = 0.44).
The results reported in Table 5 indicate that students at comprehensive universities spend significantly less time on experiences in their chosen field—via either work for pay or volunteering—than students at either community colleges or research-intensive universities.
|Table 5. Predicting whether students engage in nonacademic work—paid or volunteer—in their field.|
Our data indicate that students at community colleges spend just as much time studying introductory biology as students at R1s, and the same or more time studying than students at comprehensive universities. This result holds even though community college students are partitioning other aspects of their time differently than students at comprehensive universities and R1s. Consistent with earlier work, the community college students we surveyed have many more hours of nonacademic commitments than students at comprehensive universities and R1s, but spend more of this nonacademic time pursuing paid or volunteer opportunities related to their field of study than students at comprehensives, though less than students at R1s.
Taken together, these results suggest that community college students are putting a high priority on their academics, and making different trade-offs than their peers at four-year institutions in terms of time allocation. Compared to students at comprehensive universities and R1s, the data in Table 2 suggest that community college students devote more time to family and job responsibilities and less time to student groups and clubs or recreational activities. Being more likely to attend school part-time may also play a role in this trade-off, with reduced credits possibly allowing part-time community college students to invest large amounts of time in each course they take while fulfilling their work and family obligations—at the cost of extending their time to degree.
Our data support the hypothesis that increased course structure—in the form of additional outside-of-class assignments—could be a viable way to promote the success of students at community colleges and comprehensive universities (Pape-Lindstrom et al., 2018). In the state of Washington, community college students studied the same amount as students at an R1 with a high-structure course design. In the Michigan sample, we found no difference in average time spent studying for classes that had rare, weekly, or daily assignments outside of class.
Further analyses are warranted, however, on the relationship between course structure, study time, and study techniques. The question is important because study time in and of itself is not a strong predictor of achievement in many courses (Clement, 2016; DesJardins et al., 2010; Nandagopol & Ericsson, 2012; Plant et al., 2005). Instead of simply needing more time, underprepared students may need more structure—especially in introductory courses (Bahr et al., 2017)—to acquire study skills that are effective at the college level (Clement, 2016; Masui et al., 2014). For example, Nandagopol and Ericsson (2012) found that variation in aspects of self-regulated learning—such as seeking assistance from peers and pursuing additional information—not only explained 46% of observed variation in final grades, but was the key factor distinguishing low-, medium-, and high-performing students. As these authors pointed out, self-regulated learning is closely related to deliberate practice. Deliberate practice, in turn, is hypothesized to be a causative agent in course designs that employ high structure and find disproportionate benefits for underprepared students (Eddy & Hogan, 2014; Haak et al., 2011). Taken together, these observations suggest that how students study may be just as important as, or more important than, how much they study.
In conclusion, our data point to two significant take-home messages. First, the community college students we surveyed devoted as much time to studying introductory biology as their peers at four-year institutions, despite carrying heavier nonacademic obligations. Second, the time demands of high-structure course designs appear to be within reach for community college students, supporting increased course structure as a strategy for improving retention in STEM.
We thank Catharine H. Beyer and the consulting group at the University of Washington Office of Educational Assessment for help designing the time-on-task survey and Allen Farrand, Diane Forson, Jenny McFarland, Carrie Schwarz, Linda Brandt, David Wooten, Matthew Chapman, and Gyorgyi Csankovszki for help with data collection. Comments from the University of Washington Biology Education Research Group improved the manuscript. This work was supported in part by NSF DUE grant 1118890, and was conducted with oversight from the Institutional Review Boards of Everett Community College, Eastern Michigan University UHSRC #131110M, Western Washington University # EX14-015, the Institutional Research Office at Henry Ford Community College, and the Office of the Vice President for Instruction at Washtenaw Community College.
Scott Freeman (firstname.lastname@example.org) is lecturer emeritus in the Department of Biology at the University of Washington in Seattle, Washington. Pamela Pape-Lindstrom is dean of science, technology, engineering, and mathematics at Harford Community College in Bel Air, Maryland. Anne Casper is professor in the Department of Biology at Eastern Michigan University in Ypsilanti, Michigan. Sarah Eddy is assistant professor in biology and the STEM Transformation Institute at Florida International University in Miami, Florida.
This survey asks you to report the average number of hours you spend per week on various activities during the current term.
For each question in the series below, please respond to the following prompt:
On average, over the course of this current academic term, how many hours a week have you spent on the following:
1. Studying outside class for this biology course (including time spent studying alone, with others, and at study or tutoring centers):
2. Studying outside class for all your other courses (including time spent studying alone, with others, and at study or tutoring centers):
3. Getting to and from campus:
4. Of the hours you spend working for pay, how many of these hours are spent at a job directly related to your intended career (i.e., a job where you are developing skills or knowledge you may use as a professional in your intended field):
5. Volunteering (average hours per week):
6. Of the hours you spend volunteering, how many of these hours are spent at a job directly related to your intended career (i.e., a volunteer position where you are developing skills or knowledge you may use as a professional in your intended field):
7. Taking care of your children (e.g., preparing meals, taking them to daycare, playing with them, etc.):
8. Taking care of elderly family members (e.g., preparing meals, helping them with daily chores, etc.):
9. Taking care of siblings, helping family members navigate organizations or demands, or assisting family members in other ways:
10. Household maintenance and chores (e.g., laundry, cleaning, lawn mowing, etc.):
11. Practicing, training, and engaging in other activities directly related to your performance in a sport if you are a scholarship-supported college athlete:
12. Practicing, training, and engaging in other activities directly related to your performance in a sport if you are a nonscholarship-supported college athlete:
13. Exercising, including intramural sport activities (but not related to college-level athletics):
14. Participating in clubs or other college-sponsored activities:
15. Watching TV, playing video games, socializing, or staying in touch with friends:
17. Are there any activities that take a significant amount of your time that are not listed in this survey? Please describe these activities and how much time on average you spend on them a week.
Thank you so much! Your answers will help us design courses that will help students perform better, given the other demands on their time.
American Association of Community Colleges. (2017). Fast facts sheet.
Bahr P. R., Jackson G., McNaughton J., Oster M., & Gross J. (2017). Unrealized potential: Community college pathways to STEM baccalaureate degrees. The Journal of Higher Education, 88(3), 430–478.
Burke T. A., McKee J. R., Wilson H. C., Donahue R., & Batenhorst A. S. (2000). A comparison of time-and-motion and self-reporting methods of work measurement. Journal of Nursing Administration, 30(3), 118–125.
Carnevale A. P, & Rose S. J. (2011). The undereducated American. Georgetown University Center on Education and the Workforce.
Chen X., & Soldner M. (2013). STEM attrition: College students’ paths into and out of STEM fields. U.S. Department of Education, Institute of Educational Sciences.
Clement L. (2016). External and internal barriers to studying can affect student success and retention in a diverse classroom. Journal of Microbiology & Biology Education, 17(3), 351–359.
Collopy F. (1996). Biases in retrospective self-reports of time use: An empirical study of computer users. Management Science, 42(5), 758–767.
Community College Research Center. (2017). Frequently asked questions.
DesJardins S. L, McCall B. P., Ott M., & Kim J. (2010). A quasi-experimental investigation of how the Gates Millenium Scholars program is related to college students’ use of time and activities. Educational Evaluation and Policy Analysis, 32(4), 456–475.
Eddy S. L., & Hogan K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, 13(4), 453–468.
Fakayode S. O., Yakubu M., Adeyeye O. M., Pollard D. A., & Mohammed A. K. (2014). Promoting undergraduate STEM education at a Historically Black College and University through research experience. Journal of Chemical Education, 91, 662–665.
Freeman S., Haak D., & Wenderoth M. P. (2011). Increased course structure improves performance in introductory biology. CBE—Life Sciences Education, 10(2), 175–186.
Freeman S., Eddy S. L., McDonough M., Smith M. K., Okoroafor N., Jordt H., & Wenderoth M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the USA, 111(23), 8410–8415.
Ginder S. A., Kelly-Reid J. E., & Mann F. B. (2017). Enrollment and employees in postsecondary institutions, fall 2015, and financial statistics and academic libraries, fiscal year 2015. National Center for Education Statistics.
Haak D. C., HilleRisLambers J., Pitre E., & Freeman S. (2011). Increased course structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216.
Johnson C. W., Johnson R., Steigman R., Odo C., Vijayan S., & Tata D. V. (2016). Appropriately targeting group interventions for academic success adopting the clinical model and PAR profiles. Educational Researcher, 45(5), 312–323.
Juszkiewicz J. (2015). Trends in community college enrollment and completion data, 2015. American Association of Community Colleges.
Kahlenberg R. D. (2019). The real college scandal. The New York Times.
Labov J. (2012). Changing and evolving relationships between two- and four-year colleges and universities: They’re not your parents’ community colleges anymore. CBE—Life Sciences Education, 11(3), 121–128.
Masui C., Broeckmans J., Doumen S., Groenen A., & Molenberghs G. (2014). Do diligent students perform better? Complex relationships between student and course characteristics, study time, and academic performance in higher education. Studies in Higher Education, 39(4), 621–643.
Nandagopol K., & Ericsson K. A. (2012). An expert performance approach to the study of individual differences in self-regulated learning activities in upper-level college students. Learning and Individual Differences, 22, 597–609.
National Center for Education Statistics. (2017). The condition of education 2017 at a glance. NCES.
Pape-Lindstrom P., Eddy S. L., & Freeman S. (2018). Reading quizzes improve exam scores for community college students. CBE—Life Sciences Education, 17(2), 1–8.
Plant E. A., Ericsson K. A., Hill L., & Asberg A. (2005). Why study time does not predict grade point average across college students: Implications of deliberate practice for academic performance. Contemporary Educational Psychology, 30(6), 96–116.
President’s Council of Advisors on Science and Technology. (2012). Engaged to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Executive Office of the President.
R Core Team. (2013). R: A language and environment for statistical computing. R Foundation for Statistical Computing.
Ruiz-Gallardo J.-A., Castaño S., Gómez-Alday J. J., & Valdés A. (2011). Assessing student workload in Problem-Based Learning: Relationships among teaching method, student workload and achievement. A case study in natural sciences. Teaching and Teacher Education, 27, 619–627.
Schinske J. N., Balke V. L., Bangera M. G., Bonney D. M., Brownell S. E., Carter R. S., Curran-Everett D., Dolan E. L., Elliott S. L., Fletcher L., Gonzalez B., Gorga J. J., Hewlett J. A., Kiser S. L., McFarland J. L., Misra A., Nenortas A., Ngeve S. M., Pape-Lindstrom P. A.,…& Corwin L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE—Life Sciences Education, 16(2), mr1.
Snyder T. D., de Brey C., & Dillow S. A. (2016). Digest of education statistics 2014. National Center for Education Statistics.
Talamantes E., Mangione C. M., Gonzalez K., Jimenez A., Gonzalez F., & Moreno G. (2014). Community college pathways: Improving the U.S. physician workforce pipeline. Academic Medicine, 89(12), 1649–1656.
Wieman C. E. (2014). Large-scale comparison of science teaching methods sends a clear message. Proceedings of the National Academy of Sciences, 111(23), 8319–8320.