
Beyond the Traditional Classroom: Increased Course Structure and Cooperative Learning Remove Differences in Achievement between Students in an In-Person versus Hybrid Microbiology Course

    Published Online: https://doi.org/10.1187/cbe.21-01-0007

    Abstract

    The increase in online learning brought on by the COVID-19 pandemic will likely result in a greater availability of online and hybrid course offerings. In this study, students enrolled in parallel sections of a microbiology lab course with in-person labs and either face-to-face (F2F) or all-online lectures (hybrid, H). Course material and method of assessment in the two sections were identical; student demographics were similar. In the first year, F2F students scored significantly higher on two out of four exams. In the second year, two interventions were introduced: team-building activities (in both sections) and online group discussions (H only). Students in both the F2F and H sections reported similar positive teamwork reviews based on Comprehensive Assessment of Team Member Effectiveness (catme.org) and survey data. Although the COVID-19 pandemic eventually forced all learning online, exam scores from the two sections in the first half of the semester were similar, suggesting that the interventions were effective. In both sections, exam scores were positively correlated with entering grade point averages. This study adds to the body of literature supporting the effectiveness of hybrid learning.

    INTRODUCTION

    Enrollment in online classes has seen a continuous steady increase (Woodyard and Larson, 2017; Seaman et al., 2018; Garrett, 2019; Gallagher and Palmer, 2020). Online education offers many advantages, including increased student flexibility and compatibility with work schedules (U.S. Department of Education, 2010). However, online classes may also limit student–teacher and student–student interaction (Bambara et al., 2009). The COVID-19 pandemic has rapidly accelerated the transition from face-to-face to online education (Davis et al., 2020; Noel et al., 2020). It seems highly likely that this transition will continue, because online courses provide increased options for students as well as an additional revenue stream for colleges that lack the resources, personnel, and space to teach additional classes (Gould, 2003).

    The Benefits of Hybrid Classes

    Hybrid classes, which combine online learning with face-to-face instruction, are becoming increasingly popular (U.S. Department of Education, 2010). A hybrid or blended course is defined as having between 30% and 79% of the course content delivered online (Allen et al., 2007). The goal of a hybrid class is to promote active learning with increased flexibility by combining the best features of in-person and online learning (Vaughan, 2007). When done properly, a hybrid class can make better use of available resources; however, this requires a well-functioning and continuously monitored infrastructure to support both students and staff (Twigg, 2003; Garrison, 2008; Moskal et al., 2013; Varty, 2016). A successful hybrid class shifts focus from information-delivery formats like lecturing to more active learning such as discussions and debates (Vaughan, 2007). Hybrid classes have previously been used to successfully teach microbiology lab (Sancho et al., 2006) or lecture classes (Krawiec et al., 2005; Adams et al., 2015).

    One issue with any online format is that faculty need training and support to develop a quality class. Online classes should be relevant and collaborative and should provide students some control over their learning (Kim and Bonk, 2006). Classes need to encourage inquiry and critical reflection by their students. Development of such quality online classes requires substantial institutional support and faculty commitment (Keeton, 2019).

    What Can We Do to Improve Outcomes from Online Teaching?

    Many studies have demonstrated that increasing course structure with required active-learning exercises raises the achievement of all students, particularly those from disadvantaged educational backgrounds (Freeman et al., 2011; Haak et al., 2011; Eddy and Hogan, 2014; Gavassa et al., 2019). Haak and colleagues (2011) refer to this as the “Carnegie Hall hypothesis”: students who have an opportunity to practice higher-order thinking skills perform better. Courses incorporating more active learning are associated with increased student motivation and efficacy (Wyk, 2012). One way to achieve this is through the inclusion of active-learning activities, group work, weekly quizzes, and discussion sections. These types of course structural interventions have been demonstrated to improve learning, reduce achievement gaps, and improve student performance in science, technology, engineering, and mathematics (STEM) classes (Gibson and Chase, 2002; Jensen and Lawson, 2011; Freeman et al., 2011, 2014; Haak et al., 2011).

    Collaborative learning improves student learning and achievement (Anderson et al., 2005; Armstrong et al., 2007; Doymus, 2008; Preszler, 2009; Johnson et al., 2000). Collaborative learning is derived from constructivist theory (Piaget, 1926; Vygotsky et al., 1978), which states that previously held incorrect conceptions must be challenged in order to build new knowledge; active learning is required to change ideas. The work of Piaget and related theorists is based on the premise that, when individuals cooperate in the environment, sociocognitive conflict occurs that creates cognitive disequilibrium, which in turn stimulates perspective-taking ability and cognitive development.

    Do Different Subpopulations of Students Perform Better in a Traditional or Hybrid Setting?

    The need to provide high-quality science education to minority students is well recognized (Malcom, 1996; Hrabowski, 2011). In general, students at minority-serving institutions have used distance education less extensively than students at other schools (Ashby, 2002). Before the COVID-19 pandemic, Blacks and Hispanics in STEM majors were found to take significantly fewer classes online (Wladis et al., 2015). Also, some disadvantaged students, especially Native Americans and rural Americans, may lack high-speed Internet access necessary to participate in an online learning environment (Rosenboom and Blagg, 2008; Banerjee, 2020).

    A hybrid format with high structure has been shown to improve exam performance for traditionally underrepresented minority (URM) students (Gavassa et al., 2019). In that study, the authors compared student outcomes in an introductory biology class taught in three different formats: fully online, hybrid, and face-to-face. Whereas white students had the highest performance in the online format, Hispanic and Black students had higher scores in the hybrid format. In a second study of a large introductory biology class (Haak et al., 2011), educationally or economically disadvantaged students were shown to benefit most from a structured active-learning environment.

    The Current Study

    In our current study, we aimed to identify factors that may contribute to successful teaching in a hybrid format. For this purpose, we compared learning outcomes for two sections of a large undergraduate microbiology class with in-person labs: a hybrid section (H) with lectures delivered entirely online and a section with lectures delivered mostly face to face (F2F). Both sections had very similar demographic distributions. In an initial comparison, we observed a slight but significant reduction in exam scores in the H section relative to the F2F section. We then tested the effects of interventions, including structured online group discussions and other group activities, designed to improve team satisfaction and student engagement in the H section. We hypothesized that increased course structure and cooperative learning would produce similar student exam scores in courses with the two types of instruction. We used disaggregated student response data to address the question: What factors are associated with academic performance in F2F versus H sections? In the end, students in the H section had similar levels of achievement (exam scores) and team satisfaction to those in the more traditional F2F section. These results are consistent with and expand upon the work of others showing that online or hybrid formats can be improved with increased structure (Haak et al., 2011; Biel and Brame, 2016).

    METHODS

    The University of Minnesota Institutional Review Board determined that the proposed activity, STUDY00009819, is not research involving human subjects as defined by U.S. Department of Health and Human Services and U.S. Food and Drug Administration regulations.

    We designed and taught two sections of an introductory microbiology course with a laboratory at a large comprehensive public university: the “traditional” section had one online and two F2F lectures each week, while the H section had three online lectures. This course had historically used only the traditional format, but increased enrollment and a lack of lecture space in the Spring of 2019 led us to add the H section. Both the F2F and H sections had two laboratory sessions per week designed to complement lecture topics. The lecture and lab materials, quizzes, lab reports, presentations, and exams were identical for both sections.

    The F2F sections contained both nursing and non-nursing students, while there were no nursing students enrolled in the H section. We focused our analysis on non-nursing students in both sections; they were primarily juniors and seniors with similar majors but came from several different colleges in the university (Figure 1). In contrast, nursing students belong to a close-knit cohort and are younger (primarily sophomores); these students traditionally score higher on exams than their non-nursing classmates and enter the course with higher grade point averages (GPAs; unpublished data). In 2019, non-nursing students in the hybrid section scored slightly lower on two exams compared with those in the F2F section, prompting us to add interventions in the Spring of 2020.


    FIGURE 1. Non-nursing student demographics in the 2019 and 2020 F2F and H sections. Data are from the registrar’s office. (A) Distribution of students’ declared majors; (B) percentage of students with various class standings.

    Course Design

    General Microbiology is a large (200+ students) 2000-level mixed-major, lab/lecture, allied health introductory course with prerequisites of one semester of college biology and chemistry. Although some students take the course to complete a major requirement, most students take the course as a requirement for entrance into health professional graduate programs. The course was designed using evidence-based practices including backward design (Wiggins and McTighe, 2005) and universal design for learning (www.cast.org), and entailed a highly structured format (Supplemental Table 1), which is associated with better student outcomes (Haak et al., 2011; Eddy and Hogan, 2014; Gavassa et al., 2019). All course materials were accessible through the online course management systems Moodle (2019) or Canvas (2020). Lectures consisted of a summary of material from the textbook supplemented with case studies from news stories and references to recent publications. In the F2F section, we used the Top Hat platform (https://tophat.com) as a student response system. In-class lectures included three to four questions designed to assess student understanding and address common misconceptions. Students earned participation points by answering assigned questions. Online lectures covered the same material with a voice-over slide presentation annotated with hand-drawn notes using Doceri desktop; lectures were recorded in 10- to 30-minute segments. Before or shortly after lecture, students in both sections completed online adaptive-learning quizzes provided by the publisher (Nester's Microbiology: A Human Perspective); students who answered a question correctly moved on to the next subject, while those who answered incorrectly received additional similar questions to try after reading the textbook. Students received credit for the assignment after all questions were answered correctly. We posted learning guides for each lecture that contained critical-thinking questions related to the learning objectives and sample test questions. The F2F section had one online lecture each week, while the H section had three. Two of the authors (D.F.-H. and P.G.-M.) delivered the same in-person or online lectures in both sections and were the only lecture instructors for the course.
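    The adaptive-quiz behavior described above can be summarized as a simple loop: a correct answer advances the student to the next topic, while an incorrect answer queues additional similar questions until all have been answered correctly. The publisher's platform is proprietary, so the following Python sketch illustrates only that logic; the Question class, question bank, and console interaction shown are hypothetical.

```python
# Minimal sketch of the adaptive-quiz logic (illustrative only; the publisher's
# actual implementation is not public). Question objects and prompts are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Question:
    prompt: str
    answer: str
    topic: str
    followups: list = field(default_factory=list)  # similar questions on the same topic

def run_adaptive_quiz(questions):
    """Advance on a correct answer; a miss queues similar follow-up questions."""
    queue = list(questions)
    while queue:
        q = queue.pop(0)
        response = input(q.prompt + " ")
        if response.strip().lower() == q.answer.lower():
            continue                              # correct: move on to the next subject
        print(f"Not quite -- review the textbook section on {q.topic}.")
        queue.extend(q.followups)                 # incorrect: try additional similar questions
    print("All questions answered correctly: full credit recorded.")
```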

    All students participated in 2-hour lab sessions held twice a week, but students in the F2F and H sections were taught separately. D.F.-H. and P.G.-M. led lab sessions that were facilitated by a lab coordinator (L.B.), an instructor (G.M.), and undergraduate peer TAs who had taken the class previously. Groups of four students completed the lab experiments and lab reports together. Before each lab, students were expected to watch a video and read the lab manual. During each lab session, we administered an online lab quiz worth 4 points (group or individual) that covered lab and lecture material. During the lab, students also worked together on critical-thinking questions, case studies, concept maps, and a group presentation.

    Assessments

    A wide variety of assessments, both individual and group, contributed to each student’s final grade (Supplemental Table 1). As described earlier, students took 26 lab quizzes and were allowed to drop three. In 2019, students in both sections took in-person exams with bubble sheets that contained 50 multiple-choice and true/false questions. In 2020, students in both sections took the same type of in-person exams on their laptops using the LockDown browser (https://web.respondus.com/he/lockdownbrowser); these exams also included multiple-answer questions. Each section had four exams; the last exam contained approximately 25% comprehensive material. Groups worked together to produce three sets of lab reports worth a total of 75 points. The same groups produced recorded (2019) or written (2020) presentations (40 points) on an infectious disease topic: Ebola in 2019 and measles in 2020. Lab groups also completed a set of immunology case studies together. In 2020, students in the hybrid section participated in online nested discussions (4 points/week, described later).

    Survey

    In 2020, students were invited to participate in a survey posted on Canvas that included multiple-choice or Likert-style questions and two open-ended questions regarding outside commitments, college science courses taken, course modality preference, the number of credits they were taking, and previous experience with online learning (Supplemental Table 2). Answering the survey questions allowed students to drop an additional quiz score. Students were informed that their answers would be de-identified before analysis. Ninety-eight percent of F2F section students and 87% of the H section students responded.

    Student Population

    Students from many different majors self-selected into either the F2F or H section based on preference and schedule availability. One exception was nursing students, who were required to enroll in the F2F section. However, as noted earlier, nursing students were excluded from this analysis. Of the 222 students enrolled in the 2019 F2F section, 77 were non-nursing students (35%); there were 68 students in the H section. Approximately 75% of the students in both sections were enrolled in biology- or chemistry-related majors (Figure 1A). The percentages of juniors and seniors (87% and 84% for the F2F and H sections, respectively) were also similar (Figure 1B). The 2020 student demographics were comparable to those in 2019. In the F2F section, 32% of students were non-nursing students. In both sections, 80% of non-nursing students were enrolled in biology- or chemistry-related majors (Figure 1A). Most students were juniors or seniors (93% F2F, 85% H; Figure 1B).

    In theory, having asynchronous online lectures might give students in the H section additional time to take on work/family commitments and/or extra credit hours. These extra time constraints could contribute to lower exam scores. To address these variables, students in 2020 were given surveys (described earlier). The distribution of credit loads was similar for students in the F2F and H sections, with students in the F2F section taking slightly higher loads (Figure 2A). The numbers of students with low (≤5 hours/week) and high (>20 hours/week) levels of work/family commitments outside school were also similar for both groups (Figure 2B). Students answered demographic questions on their entering GPAs and race when enrolling in the Comprehensive Assessment of Team Member Effectiveness (CATME) program at the beginning of the semester (described later); 87% of students in each section participated. Students in both sections reported similar levels of racial diversity (Figure 2C). More than 80% of students in both groups had taken five or more science courses before enrolling in our course (Figure 2D). Most students were female: 82% in the F2F and 87% in the H sections (unpublished data). The average incoming GPA was 3.51 and 3.46 for the F2F and H sections, respectively (unpublished data).


    FIGURE 2. Non-nursing student demographics in the 2020 F2F and H sections assessed by survey. (A) Number of credit hours taken in the semester analyzed; (B) number of hours/week committed to work, family, or volunteer obligations outside school; (C) primary racial identity; (D) number of university-level science courses taken before enrolling in the course. Students answered CATME questions at the beginning of the term (C) or optional course survey questions at the end of the term (A, B, D).

    Course Preference and Withdrawal Rates

    The majority of students were able to enroll in the format of their choice, although this proportion was smaller for the H section. In the F2F section, 70% of students preferred the traditional format; 20% preferred an all in-person format (no online lectures, not offered), and 9.5% would have enrolled in the H section if their schedules had allowed. In the H section, 54% of students reported a preference for an all-online lecture format; 29% preferred the hybrid format, and 17% preferred an all in-person format (not offered). No students withdrew from either section in 2020. In 2019, one student withdrew from the F2F section, and none withdrew from the H section.

    Interventions: Metacognition, Team-Building Activities, Assessment of Group Effectiveness (CATME), and Online Discussions

    After exam 2, students in the 2019 H section were given the opportunity to complete an exam correction exercise designed to increase metacognition and familiarity with the layout of the course (Figure 3). Students could choose up to three questions missed on the exam and, by answering the following prompts for each question, could earn up to 1 point (50%) back per question.

    1. Copy the question.

    2. State the correct answer and why this is better than the one you chose.

    3. Write where you found this information in the class materials.


    FIGURE 3. Timeline for course interventions in the 2019 and 2020 H courses. Red arrows indicate team-building activities, the gray arrow denotes a metacognition activity, blue arrows indicate weekly discussions, and green arrows indicate CATME Peer Evaluation surveys. A CATME Team Maker survey was also used at the beginning of the 2020 semester (unpublished data). The shaded area of 2020 represents the period when both lecture and lab for the hybrid (and the F2F) section were conducted online; data from this period were not included in the present analysis.

    The average number of points earned was 3; 81% of students completed the exercise. Students in both H and F2F 2020 sections completed a similar exercise after exam 1.

    For the entire semester (2019 and 2020), students worked together in groups of four (or occasionally three) to which they were randomly assigned. Groups completed all 27 labs, wrote lab reports, produced a presentation, and took lab quizzes together. To promote group cohesion, we introduced team-building activities in the first half of the semester. On the first day of the lab in both 2019 and 2020, students introduced themselves using the “Discover Your Microbe Personality: Which Microbe Are You?” online activity developed at the Center for Microbial Oceanography: Research and Education (CMORE; http://cmore.soest.hawaii.edu/education/kidskorner/eng_6up_quiz/ur_q1.htm).

    In 2020, students participated in four additional team-building activities (Figure 3). First, we (D.F.-H. and P.G.-M.) discussed our roles and responsibilities as a team teaching several courses together. We offered personality insights, provided examples of ways our strengths are complementary, and discussed how we resolve conflict. Afterward, students were given time to reflect on their own strengths and to share these with one another. In another activity, as instructors we shared Two Truths and a Lie, an activity where students had to guess which of three statements about us was untrue; students continued this activity as a group. Before the first group lab report, students prepared a team contract (described later) with questions and language inspired by undergraduate student TAs. After students turned in the first group lab reports, they filled out a CATME peer evaluation (described later). In the last activity before the shutdown, TAs described their experiences working in groups, offered suggestions for resolving conflict, and gave students 10 minutes to reflect on their experiences.

    In 2020, we used the CATME Web-based program (https://info.catme.org) that evaluates teamwork by recording both students’ peer evaluations of individual group members and their perception of the team experience as a whole (Figure 3). In the week preceding the first day of class, students completed the Team Maker survey, which included questions on sex, race, schedule, incoming GPA, year (academic standing), and major; 87% of students in each section completed the survey. Students completed three CATME peer evaluations: one after the first group lab report (pre-shutdown), one after the second lab report, and one at the end of the semester (both post-shutdown). Only data from the pre-shutdown survey were included in this analysis.

    For online group discussions, each week, we posted two questions similar to those in the Learning Guide. Students responded to the questions midweek but were able to see other group members’ answers only after they posted; by the end of the week, they were expected to either revise their answers based on information shared by the group or comment on another post, adding to the content or pointing out a potential misconception.

    Data Analysis

    Mean exam scores for the two sections were compared using GraphPad Prism (www.graphpad.com/scientific-software/prism) and pairwise independent two-tailed t tests with unequal variance. The effect of the metacognition exercise on exam scores was assessed using a two-way repeated-measures analysis of variance (ANOVA) in GraphPad Prism. IBM SPSS software was used to analyze the factors associated with students’ academic performance: we performed multiple linear regressions using the stepwise method and chose the regression model with the highest adjusted R2. We used the regression coefficients to measure the degree of relationship between the exam scores and 1) the self-reported entry GPA, 2) the number of previous university-level online courses taken, 3) the number of previous university-level science courses taken, 4) the time after lecture at which students reviewed the lecture learning guide, 5) the number of credits taken in that semester, and 6) the number of hours per week that students spent working, volunteering, or caring for family members. Residual plots showed no outliers, and the assumptions of regression were all satisfied. Survey questions can be found in Supplemental Table 2.
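    For readers who wish to reproduce this type of analysis with open-source tools, the sketch below shows the equivalent steps (a two-tailed Welch t test and a multiple linear regression) in Python. It is illustrative only: the published analysis was run in GraphPad Prism and SPSS, SPSS's stepwise selection is not reproduced, and the data file and column names shown are hypothetical.

```python
# Illustrative sketch only: the published analysis used GraphPad Prism and IBM SPSS.
# The CSV file and column names below are hypothetical placeholders for a
# de-identified data set with one row per student.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("exam_scores.csv")

# Pairwise independent two-tailed t test with unequal variance (Welch's t test)
f2f = df.loc[df["section"] == "F2F", "exam2"]
hyb = df.loc[df["section"] == "H", "exam2"]
t, p = stats.ttest_ind(f2f, hyb, equal_var=False)
print(f"Exam 2, F2F vs. H: t = {t:.2f}, p = {p:.3f}")

# Multiple linear regression of mean exam score on the six survey predictors;
# this fits the full model rather than performing stepwise selection.
model = smf.ols(
    "exam_mean ~ entry_gpa + n_online_courses + n_science_courses"
    " + guide_review_timing + credit_load + outside_hours",
    data=df,
).fit()
print(model.summary())
```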

    RESULTS AND DISCUSSION

    Spring 2019 Hybrid Student Exam Scores Were Slightly Lower Despite Metacognition Intervention

    Exam 1 scores for students in the F2F and H sections were similar overall (Figure 4A). However, the mean exam 2 score for H students (72.8%) was 4.5 percentage points lower than the mean score for F2F students (77.3%, p < 0.05; Figure 4A). We hypothesized that students who did not have in-class instruction were less familiar with the layout and emphasis of the course. Students in the H section completed a metacognition test correction exercise (described in Methods; Figure 3) to earn up to 3 points back on exam 2. We compared the average increase in exam scores (exam 3 minus the average of exams 1 and 2, and exam 4 minus the average of exams 1 and 2) for students within the H section: those who completed the metacognition exercise, H(+M), and those who did not, H(−M). H(+M) students actually started with median exam 1 scores that were lower than those of H(−M) students (Supplemental Figure 1A). H(+M) students improved their exam 3 scores by an average of 5.4 percentage points compared with 2.2 percentage points for H(−M) students. The difference between the two groups increased to 5 percentage points by exam 4 (Supplemental Figure 1B).


    FIGURE 4. Exam scores for students in parallel F2F and H sections from two terms: (A) 2019 and (B) 2020. Lines mark median scores. Asterisks indicate pairwise comparisons with a significant (p < 0.05) difference between the means.

    Metacognition is an important element in the process of learning (Wang et al., 1990; Tanner, 2012), and exercises designed to increase student metacognition can result in higher quiz scores and greater understanding of biological concepts (Mynlieff et al., 2014; Siegesmund, 2016; Sabel et al., 2017). We observed a trend toward higher test scores after students completed a post-exam metacognition assignment. Learning where to find the material on slides and learning guides may have helped some students prepare for subsequent exams. On the other hand, students who chose to complete the exercise may have been more motivated in general to improve their grades. In either case, the number of students in each group, 55 (+M) and 12 (−M), was too small to conclude that the exercise made a significant difference. Although we did not find a significant difference between the two groups using a repeated-measures ANOVA test, we observed a trend in the data. In 2020, we chose to retain this exercise and to increase the focus on cooperative learning and course structure.

    Spring 2020 Hybrid and F2F Student Scores Were Similar and Were Associated with Positive Group Dynamics and Increased Course Structure

    2020 Student Demographics, Interventions, and Exam Scores.

    In the second iteration of this two-section course (Spring 2020), we attempted to account for confounding variables by collecting additional demographic data from two sources: a brief survey completed at the beginning of the term when students enrolled in CATME and an additional survey administered at the end through the learning management system (Canvas), both described in Methods. In terms of course load, hours of outside work or family commitment, major, ethnicity, the level of science background, and class standing, the cohorts of non-nursing students in the F2F and H sections were very similar (Figures 1 and 2); the average entering GPA was also quite similar (unpublished data). For the first half of the semester, sections were taught as planned and described in Methods. However, because the COVID-19 pandemic forced the entire university to make a sudden switch to all-online learning, our study was confined to the first half of the semester, which included two exams.

    Study 1: Will an Increase in Course Structure and Emphasis on Positive Group Dynamics Decrease the Achievement Gap between Students in the 2020 F2F and H Sections?

    Role assignment, group contracts, peer evaluations, and peer assessments are useful tools to promote effective group collaboration (Messersmith, 2015; Chang and Kang, 2016; Chang and Brickman, 2018). In a previous study (unpublished), we introduced each of these in a section of microbiology very similar to the two in this analysis. TAs led three team-building activities during lab sessions. We assessed team satisfaction using the CATME online system (https://info.catme.org; described in Methods). At the end of the semester, CATME ratings for team satisfaction were high, with only a small proportion of students (7%) reporting a neutral or negative level of satisfaction (unpublished data). In online surveys, the majority of students found the team-building activities (69%) and group contracts (69%) to be helpful (unpublished data). Sixty-three percent of students rated group work in VBS 2032 as “more satisfying” than group work in other courses they had taken (unpublished data). These results influenced our design for the 2020 study, which included three types of interventions (Figure 3, described below).

    Students in the 2020 F2F section answered questions collaboratively during class; however, students in the corresponding H section lacked this opportunity, because all lectures were online (asynchronous). We hypothesized that an increase in course structure and additional focus on collaborative learning would result in higher exam scores in the H section. Three types of interventions for the H section are depicted in the timeline (Figure 3) and described in Methods: team-building activities, CATME assessment of team satisfaction, and online group discussions. Because the COVID-19 pandemic forced most colleges and universities to make a sudden transition to online learning, only data from the first half of the 2020 semester were used in this study.

    Before the shutdown, students engaged in four team-building activities and prepared a group contract. They completed a CATME Peer Evaluation Survey (www.CATME.org) and participated in five online group discussions. We used the same team-building activities in the F2F section; these students also completed the CATME surveys, but they did not participate in the online discussions.

    In response to three Likert-style questions administered at midterm through CATME before the shutdown, students in both sections reported high levels of team satisfaction (4.6–4.8 out of 5; Table 1). Even after the shutdown, CATME team satisfaction scores in both groups remained high (unpublished data). When asked to respond to the statement “This class improved my group work skills,” 75% of F2F and 73% of H students responded with “strongly agree” or “agree.” When asked “How would you rate your group work experience in this course compared with group work in other university courses you have taken?,” 57% of students in each section reported that it was “more satisfying,” while only 5% (F2F) and 8% (H) reported that it was “less satisfying” (Supplemental Figure 2). CATME also assesses other parameters of team effectiveness, such as conflict and psychological safety. Students in the F2F and H sections reported similarly low levels of conflict related to tasks, relationships, and process (Supplemental Table 3); students in both groups also reported similarly high levels of psychological safety, as measured by answers to six questions (Supplemental Table 3).

    TABLE 1. CATME peer evaluation ratings for students in the two sections^a

    CATME team satisfaction^b                                     | F2F (n = 44) | H (n = 61)
    I am satisfied with my present teammates.                     | 4.7 (0.08)   | 4.7 (0.09)
    I am pleased with the way my teammates and I work together.   | 4.7 (0.08)   | 4.7 (0.06)
    I am very satisfied with working in this team.                | 4.6 (0.10)   | 4.7 (0.09)

    ^a Standard errors given in parentheses.

    ^b Scored 1 to 5, with 1 = low, 5 = high.

    In 2020, students in the H section scored as high on exams 1 and 2 as those in the F2F section (Figure 4B). In contrast with the 2019 (pre-intervention) results, mean exam 2 scores in the H section were actually slightly higher, although the difference was not statistically significant (Figure 4B). These results add to a growing number of studies that document similar learning outcomes in hybrid versus face-to-face learning environments (Twigg, 2003; Vaughan, 2007; Garrison, 2008; Means et al., 2013; Moskal et al., 2013; Dziuban et al., 2018). Having the added structure of a weekly online discussion of critical-thinking questions may have helped students in the H section stay actively engaged while also examining their own understanding of the material in relation to others in the group. Spending dedicated time in the lab on team-building activities and group contracts may also have contributed to team satisfaction and a better cooperative learning environment.

    Many factors influence the ability of students to learn in an online environment (Schrum and Hong, 2002). Students’ level of experience in a university environment (class standing) was correlated with success in another hybrid (non-lab) microbiology course, as was the method of note-taking (Adams et al., 2015). Factors that may contribute to the success of online learning also include demographic characteristics, individual preferences and technical skills, student self-efficacy and motivation, and social interactions with other students and teachers (Baturay and Yukselturk, 2015). We asked students to respond to questions regarding their mode of course preference (F2F or online), semester credit load, number of hours spent on family or work commitments, level of experience with online courses and university-level science, and the timing of learning guide completion. (Questions are available in Supplemental Table 2.)

    Study 2: What Factors Are Associated with Academic Performance in F2F versus H Sections?

    A multiple linear regression was calculated to analyze exam scores based on 1) the self-reported entry GPA; 2) the number of previous university-level online courses taken; 3) the number of previous university-level science courses taken; 4) the time after lecture at which students reviewed the lecture learning guide; 5) the number of credits taken in that semester; and 6) the number of hours per week that students spent working, volunteering, or caring for family members (extracurricular activities).

    For the F2F section, the significant predictors were entry GPA and taking 13–14 course credits in the semester (Table 2). The regression equation is y = 36.94 + 12.68x1 − 6.408x2, where x1 represents the entry GPA and x2 indicates whether a student was taking 13–14 credits in the semester, with an adjusted R2 of 0.365. In other words, the predicted exam score equals 36.94 + 12.68(entry GPA) − 6.408 for a student taking 13–14 credits per semester. The p values are 0.004 for entry GPA and 0.049 for taking 13–14 credits per semester.
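    As a concrete illustration of this fitted model (for a hypothetical student, not one drawn from our data set), an F2F student entering with a GPA of 3.5 and taking 13–14 credits (x2 = 1) would have a predicted exam score of approximately 36.94 + 12.68(3.5) − 6.408(1) ≈ 74.9%.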

    TABLE 2. Student factors associated with 2020 exam scores^a

    Factor                                       | F2F (n = 36) | H (n = 54)
    Pre-entry GPA                                | p = 0.004    | p < 0.001
    Credit workload                              | p = 0.049    | ns
    Lecture review timing                        | ns           | ns
    Number of previous online courses            | ns           | ns
    Course delivery preference                   | ns           | ns
    Number of previous online science courses    | ns           | ns
    Hours spent in work/family commitments       | ns           | p = 0.014

    ^a ns, not statistically significant.

    For the H section, the significant predictors were entry GPA and spending 5 or fewer hours per week on outside commitments (factor 6 above, “extracurricular activities”). The regression equation is y = 14.87 + 19.48x1 − 5.23x2, where x1 is the entry GPA and x2 is the time spent in extracurricular activities, with an adjusted R2 of 0.608. In other words, the predicted exam score equals 14.87 + 19.48(entry GPA) − 5.23(time spent in extracurricular activities, measured in hours per week). The p values are < 0.001 for entry GPA and 0.014 for the group that spent 5 or fewer hours per week in extracurricular activities (Table 2).

    In this study, three variables were associated with higher exam scores: incoming GPA, the number of credits taken, and the hours per week spent working/volunteering/caring for family members (Table 2). There was a strong association between student GPA and exam scores in both the F2F and the H sections, with p values of 0.004 and < 0.001, respectively (Table 2). This held true even when all students were analyzed together regardless of section (unpublished data). Many previous studies have shown GPA to be a good indicator of academic performance in undergraduate biology classrooms (e.g., Freeman et al., 2007, 2011; Adams et al., 2015). The data from our analysis of the F2F and H students are consistent with these results. Students were asked whether their semester course credit load was <12, 13–14, 15–16, 17–18, or >20 credits. They were also asked whether they spent <5, 6–10, 11–15, or 16 or more hours per week working/volunteering/caring for family members. Students who had a course load of 13–14 credits scored slightly higher in the F2F section (p = 0.049). Most universities advise students to take 12–15 credits/semester, so it may not be surprising that 13–14 credit hours allowed students enough time for study. Students in this group (n = 12) spent more time working/volunteering/caring for family members (>5 hours/week) than students with higher credit loads (n = 27). In the H group, students who spent ≤5 hours/week on these activities (n = 15) had higher scores (p = 0.014). This result is consistent with the study by Wenz and Yu (2010), who found that choosing to work was associated with higher GPAs but that adding additional work hours had a modest negative effect on grades. Our conclusions with respect to course loads and extracurricular responsibilities are limited by the small sample size.

    Three additional variables were not significantly associated with exam scores in either section: course preference, previous experience with science/online course work, and the timing of lecture review. Most F2F students (70%) enrolled in that section because they preferred its mode of delivery, whereas only 53% of H students enrolled in the H section because it was the mode they preferred. However, students who had to enroll in the H section despite their preference scored the same as those who preferred online lectures. A large proportion of students (80% F2F and 83% H) had previously taken more than five college science courses, so this factor had no statistical significance. The level of previous experience with online learning was not associated with higher exam scores in either section; however, all students had taken at least one to two courses online, so this type of learning was not entirely new. All lectures were accompanied by a learning guide containing review and critical-thinking questions. Asynchronous lectures allow some students the flexibility to procrastinate. We asked students whether they completed the learning guides within a day, 1 week, or 2 weeks of lecture or only before exams. The differences were not statistically significant, but we noticed a positive correlation between exam scores and learning guide completion within a day of lecture in both groups.

    Increasing the participation of URM individuals in the scientific workforce is a high priority (Malcom, 1996; Hrabowski, 2011). Adding alternative formats with additional flexibility, such as hybrid classes, could increase enrollment of nontraditional and URM students. Is the achievement gap influenced by the course format? Comparing three sections of introductory biology (face to face, hybrid, and online), Gavassa et al. (2019) found that Black and Hispanic students scored the highest in the hybrid format and lowest in the F2F format, although when the authors controlled for prior performance (measured by ACT math subscore), students in all ethnic/racial groups scored highest in the hybrid section. In that study, the hybrid class had the highest level of course structure as defined by Haak et al. (2011). URM students in an introductory microbiology course scored lower in both formats studied, F2F and hybrid (Adams et al., 2015). The number of URM students in our study (2020; F2F = 4, H = 10) was too small to draw firm conclusions. URM students scored 10.5% and 14.4% lower than majority students in the H section on exams 1 and 2, respectively, but the average incoming GPA was also 0.2 points lower for the URM students. Many factors ultimately contribute to students’ success in any course format; finding ways to engage all students equitably in multiple modalities is critically important.

    Limitations of the Study

    One limitation of this study is that the sudden changes in instruction caused by the pandemic did not allow us to compare the F2F and H sections in the second half of the 2020 semester. However, we were able to use a full semester of data from 2019, and together these data show that students in both instructional modes performed at fairly similar levels on exams overall. Exams themselves do not fully capture students’ learning. Quizzes, lab reports, and presentations are alternative assessments in which some students excel; in our study, these were group exercises in which individual learning gains could not be measured. The F2F class included nursing students, and while this cohort was excluded from the study, the group learning dynamics as a whole may have been somewhat different, as these students typically bring more clinical and less basic science knowledge to their groups.

    CONCLUSION

    With a high level of course structure and cooperative learning, students in a hybrid lecture/lab course had similar levels of achievement and team satisfaction to those in a more traditional F2F lecture/lab course. CATME was an easy and powerful tool for assessing team effectiveness. Because the demographics of both groups were very similar, the primary difference between the sections was the mode of instruction. (The F2F section also contained sophomore nursing students, but these students were not included in the study.) The similar outcomes in 2020 contrasted somewhat with the difference between the F2F and H sections in 2019, when there were fewer team-building activities and no online group discussions to help students avoid procrastination and stay accountable to teammates. The 2020 results add to the body of literature supporting the success of hybrid courses (Garrison, 2008; Means et al., 2013; Moskal, 2017). This format gives students more flexibility in their schedules and frees up limited classroom space on many campuses (Vaughan, 2007; Moskal et al., 2013). Not all students are comfortable with online lectures, and some will have difficulty with time management and/or procrastination. Most of our students were juniors and seniors, and this may have influenced their success; a previous study reported that, when sophomores were excluded, students in a hybrid microbiology course performed as well as those in an F2F section (Adams et al., 2015). Entry GPA was a strong predictor of exam scores in both sections.

    Even as students return to campuses around the world, the COVID-19 pandemic will change the nature of higher education in the future. Some courses will continue online, and many more may adopt a hybrid format, giving students greater flexibility with scheduling and transportation. By engaging students through a high level of structure with active and cooperative learning, instructors can achieve student learning outcomes and achievement that rival those in a traditional classroom.

    ACKNOWLEDGMENTS

    The authors would like to thank Drs. Aaron Rendahl and Wei Wei for expert assistance with statistical analysis, Greta Henry for her excellent work as a TA and colleague, and Dr. Cheryl Dvorak for critical reading of the article.

    REFERENCES

  • Adams, A. E. M., Randall, S., & Traustadóttir, T. (2015). A tale of two sections: An experiment to compare the effectiveness of a hybrid versus a traditional lecture format in introductory microbiology. CBE—Life Sciences Education, 14(1), ar6. doi: 10.1187/cbe.14-08-0118
  • Allen, I. E., Seaman, J., & Garrett, R. (2007). Blending in: The extent and promise of blended education in the United States. Retrieved January 12, 2021, from https://eric.ed.gov/?id=ED529930
  • Anderson, D., Salm, S., & Allen, D. (2018). Nester’s Microbiology: A Human Perspective (9th ed.). New York, NY: McGraw-Hill.
  • Anderson, W. L., Mitchell, S. M., & Osgood, M. P. (2005). Comparison of student performance in cooperative learning and traditional lecture-based biochemistry classes. Biochemistry and Molecular Biology Education, 33(6), 387–393. doi: 10.1002/bmb.2005.49403306387
  • Armstrong, N., Chang, S.-M., & Brickman, M. (2007). Cooperative learning in industrial-sized biology classes. CBE—Life Sciences Education, 6(2), 163–171. doi: 10.1187/cbe.06-11-0200
  • Ashby, C. M. (2002). Distance education: Growth in distance education programs and implications for federal education policy. Retrieved January 12, 2021, from www.gao.gov/products/GAO-02-1125T
  • Bambara, C. S., Harbour, C. P., Davies, T. G., & Athey, S. (2009). Delicate engagement: The lived experience of community college students enrolled in high-risk online courses. Community College Review, 36(3), 219–238. doi: 10.1177/0091552108327187
  • Banerjee, M. (2020). An exploratory study of online equity: Differential levels of technological access and technological efficacy among underserved and underrepresented student populations in higher education. Interdisciplinary Journal of e-Skills and Lifelong Learning, 16, 93–121. doi: 10.28945/4664
  • Baturay, M. H., & Yukselturk, E. (2015). The role of online education preferences on student’s achievement. Turkish Online Journal of Distance Education, 16(3), 3–12. doi: 10.17718/tojde.47810
  • Biel, R., & Brame, C. J. (2016). Traditional versus online biology courses: Connecting course design and student learning in an online setting. Journal of Microbiology & Biology Education, 17(3), 417–422. doi: 10.1128/jmbe.v17i3.115
  • Chang, B., & Kang, H. (2016). Challenges facing group work online. Distance Education, 37(1), 73–88. doi: 10.1080/01587919.2016.1154781
  • Chang, Y., & Brickman, P. (2018). When group work doesn’t work: Insights from students. CBE—Life Sciences Education, 17(3), ar52. doi: 10.1187/cbe.17-09-0
  • Davis, M. C., Libertucci, J., Acebo Guerrero, Y., Dietz, H., Noel, T. C., Rubin, J. E., & Sukdeo, N. (2020). Finding the silver lining during a global pandemic: Opportunities for curriculum innovation in microbiology education. Canadian Journal of Microbiology, 66(10), 600–602. doi: 10.1139/cjm-2020-0374
  • Doymus, K. (2008). Teaching chemical equilibrium with the jigsaw technique. Research in Science Education, 38(2), 249–260. doi: 10.1007/s11165-007-9047-8
  • Dziuban, C., Graham, C. R., Moskal, P. D., Norberg, A., & Sicilia, N. (2018). Blended learning: The new normal and emerging technologies. International Journal of Educational Technology in Higher Education, 15(1), ar3. doi: 10.1186/s41239-017-0087-5
  • Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, 13(3), 453–468. doi: 10.1187/cbe.14-03-0050
  • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA, 111(23), 8410–8415. doi: 10.1073/pnas.1319030111
  • Freeman, S., Haak, D., & Wenderoth, M. P. (2011). Increased course structure improves performance in introductory biology. CBE—Life Sciences Education, 10(2), 175–186. doi: 10.1187/cbe.10-08-0105
  • Freeman, S., O’Connor, E., Parks, J. W., Cunningham, M., Hurley, D., Haak, D., ... & Wenderoth, M. P. (2007). Prescribed active learning increases performance in introductory biology. CBE—Life Sciences Education, 6(2), 132–139. doi: 10.1187/cbe.06-09-0194
  • Gallagher, S., & Palmer, J. (2020, September 30). The pandemic pushed many universities online. The change was long overdue. Harvard Business Publishing: Education. Retrieved June 30, 2021, from https://hbsp.harvard.edu/inspiring-minds/the-pandemic-pushed-universities-online-the-change-was-long-overdue
  • Garrett, R. (2019). Expanding demand for online higher education: Surveying prospective students. Online Learning, 11(1). doi: 10.24059/olj.v11i1.1735
  • Garrison, D. R. (2008). Blended learning in higher education: Framework, principles, and guidelines (1st ed.). San Francisco, CA: Jossey-Bass.
  • Gavassa, S., Benabentos, R., Kravec, M., Collins, T., & Eddy, S. (2019). Closing the achievement gap in a large introductory course by balancing reduced in-person contact with increased course structure. CBE—Life Sciences Education, 18(1), ar8. doi: 10.1187/cbe.18-08-0153
  • Gibson, H. L., & Chase, C. (2002). Longitudinal impact of an inquiry-based science program on middle school students’ attitudes toward science. Science Education, 86(5), 693–705. doi: 10.1002/sce.10039
  • Gould, T. (2003). Hybrid classes: Maximizing institutional resources and student learning. In Proceedings of the 2003 ASCUE Conference (pp. 54–59). ASCUE.
  • Haak, D. C., HilleRisLambers, J., Pitre, E., & Freeman, S. (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216. doi: 10.1126/science.1204820
  • Hrabowski, F. A. (2011). Boosting minorities in science. Science, 331(6014), 125.
  • Jensen, J. L., & Lawson, A. (2011). Effects of collaborative group composition and inquiry instruction on reasoning gains and achievement in undergraduate biology. CBE—Life Sciences Education, 10(1), 64–73. doi: 10.1187/cbe.10-07-0089
  • Johnson, S. D., Aragon, S. R., & Shaik, N. (2000). Comparative analysis of learner satisfaction and learning outcomes in online and face-to-face learning environments. Journal of Interactive Learning Research, 11(1), 29–49.
  • Keeton, M. T. (2019). Best online instructional practices: Report of phase I of an ongoing study. Online Learning, 8(2). doi: 10.24059/olj.v8i2.1829
  • Kim, K.-J., & Bonk, C. J. (2006). The future of online teaching and learning in higher education: The survey says… a survey substantiates some ideas about online learning and refutes others. Educause Quarterly, 29(4), 22–30.
  • Krawiec, S., Salter, D., & Kay, E. J. (2005). A “hybrid” bacteriology course: The professor’s design and expectations; the students’ performance and assessment. Microbiology Education, 6, 8–13. doi: 10.1128/jmbe.v6.78
  • Malcom, S. M. (1996). Science and diversity: A compelling national interest. Science, 271(5257), 1817–1819. doi: 10.1126/science.271.5257.1817
  • Means, B., Toyama, Y., Murphy, R., & Baki, M. (2013). The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record, 115(3), 1–47.
  • Messersmith, A. S. (2015). Preparing students for 21st century teamwork: Effective collaboration in the online group communication course. Communication Teacher, 29(4), 219–226. doi: 10.1080/17404622.2015.1046188
  • Moskal, P., Dziuban, C., & Hartman, J. (2013). Blended learning: A dangerous idea? The Internet and Higher Education, 18, 15–23. doi: 10.1016/j.iheduc.2012.12.001
  • Moskal, P. D. (2017). Evaluating the outcomes and impact of hybrid courses. New Directions for Teaching and Learning, 2017(149), 19–26. doi: 10.1002/tl.20223
  • Mynlieff, M., Manogaran, A. L., St Maurice, M., & Eddinger, T. J. (2014). Writing assignments with a metacognitive component enhance learning in a large introductory biology course. CBE—Life Sciences Education, 13(2), 311–321. doi: 10.1187/cbe.13-05-0097
  • Noel, T. C., Rubin, J. E., Acebo Guerrero, Y., Davis, M. C., Dietz, H., Libertucci, J., & Sukdeo, N. (2020). Keeping the microbiology lab alive: Essential microbiology lab skill development in the wake of COVID-19. Canadian Journal of Microbiology, 66(10), 603–604. doi: 10.1139/cjm-2020-0373
  • Piaget, J. (1926). The language and thought of the child. San Diego, CA: Harcourt, Brace.
  • Preszler, R. W. (2009). Replacing lecture with peer-led workshops improves student learning. CBE—Life Sciences Education, 8(3), 182–192. doi: 10.1187/cbe.09-01-0002
  • Rosenboom, V., & Blagg, K. (2008). Disconnected from higher education: How geography and Internet speed limit access to higher education (p. 9). Washington, DC: Urban Institute.
  • Sabel, J. L., Dauer, J. T., & Forbes, C. T. (2017). Introductory biology students’ use of enhanced answer keys and reflection questions to engage in metacognition and enhance understanding. CBE—Life Sciences Education, 16(3), ar40. doi: 10.1187/cbe.16-10-0298
  • Sancho, P., Corral, R., Rivas, T., González, M. J., Chordi, A., & Tejedor, C. (2006). A blended learning experience for teaching microbiology. American Journal of Pharmaceutical Education, 70(5), 120. doi: 10.5688/aj7005120
  • Schrum, L., & Hong, S. (2002). From the field: Characteristics of successful tertiary online students and strategies of experienced online educators. Education and Information Technologies, 7(1), 5–16. doi: 10.1023/A:1015354423055
  • Seaman, J. E., Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in the United States. Babson Survey Research Group. Retrieved June 1, 2021, from https://eric.ed.gov/?id=ED580852
  • Siegesmund, A. (2016). Increasing student metacognition and learning through classroom-based learning communities and self-assessment. Journal of Microbiology & Biology Education, 17(2), 204–214. doi: 10.1128/jmbe.v17i2.954
  • Tanner, K. D. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11(2), 113–120. doi: 10.1187/cbe.12-03-0033
  • Twigg, C. A. (2003). Improving learning and reducing costs: New models for online learning. Educause Review, 38(5), 28–38.
  • U.S. Department of Education. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC.
  • Varty, A. K. (2016). Options for online undergraduate courses in biology at American colleges and universities. CBE—Life Sciences Education, 15(4), ar58. doi: 10.1187/cbe.16-01-0075
  • Vaughan, N. (2007). Perspectives on blended learning in higher education. International Journal on e-Learning, 6(1), 81–94.
  • Vygotsky, L. S., Cole, M., John-Steiner, V., Scribner, S., & Souberman, E. (1978). Mind in society: Development of higher psychological processes. Cambridge, MA: Harvard University Press.
  • Wang, M. C., Haertel, G. D., & Walberg, H. J. (1990). What influences learning? A content analysis of review literature. Journal of Educational Research, 84(1), 30–43. doi: 10.1080/00220671.1990.10885988
  • Wenz, M., & Yu, W.-C. (2010). Term-time employment and the academic performance of undergraduates. Journal of Education Finance, 35(4), 358–373.
  • Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Alexandria, VA: ASCD.
  • Wladis, C., Hachey, A. C., & Conway, K. (2015). Which STEM majors enroll in online courses, and why should we care? The impact of ethnicity, gender, and non-traditional student characteristics. Computers & Education, 87, 285–308.
  • Woodyard, L., & Larson, E. (2017). Distance education report. Retrieved April 18, 2022, from https://www.cccco.edu/-/media/CCCCO-Website/About-Us/Reports/Files/2017-DE-Report-Final-ADA.pdf
  • Wyk, M. M. V. (2012). The effects of the STAD-cooperative learning method on student achievement, attitude and motivation in economics education. Journal of Social Sciences, 33(2), 261–270. doi: 10.1080/09718923.2012.11893104