
Reading Quizzes Improve Exam Scores for Community College Students

    Published Online: https://doi.org/10.1187/cbe.17-08-0160

    Abstract

    To test the hypothesis that added course structure encourages self-regulated learning and thereby increases student exam performance in the community college setting, we added daily preclass online, open-book reading quizzes to an introductory biology course. We compared three control terms without reading quizzes and three experimental terms with online, open-book reading quizzes; the instructor of record, class size, and instructional time did not vary. Analyzing the Bloom’s taxonomy level of a random sample of exam questions indicated a similar cognitive level of high-stakes assessments across all six terms in the study. To control for possible changes in student preparation or ability over time, we calculated each student’s grade point average in courses other than biology during the term under study and included it as a predictor variable in our regression models. Our final model showed that students in the experimental terms had significantly higher exam scores than students in the control terms. This result shows that online reading quizzes can boost achievement for community college students. We also comment on the importance of discipline-based education research in community college settings and on the structure of our community college/4-year institution collaboration.

    INTRODUCTION

    Community colleges educate 41% of all undergraduates in the United States, and 50% of all students earning bachelor’s degrees in biology have been enrolled at community colleges (National Science Foundation [NSF], 2010; American Association of Community Colleges, 2017). These numbers include a disproportionately large percentage of students who are either low-income or underrepresented in science, technology, engineering, and mathematics (STEM) majors and professions (Labov, 2012; Community College Research Center [CCRC], 2017). Unfortunately, rates of bachelor’s degree completion by community college students are low; only 15% of students who started at community colleges in 2009 had completed a 4-year degree 6 years later (Shapiro et al., 2015). The problem is of national importance, as policy makers project that the academy will fail to produce enough newly trained STEM professionals to meet demand in coming decades unless success rates of all students, including those at community colleges, increase dramatically (National Academy of Sciences, 2011; President’s Council of Advisors on Science and Technology, 2012).

    A large body of work has shown that underrepresented or underprepared students can complete rigorous 4-year STEM degrees if they are given sufficient support outside class in the form of financial aid, supplementary instruction, access to undergraduate research experiences, and intrusive advising (Summers and Hrabowski, 2006; Selingo et al., 2013; Renick, 2016). But a growing body of work also supports the hypothesis that course-based interventions have a crucial role to play in improving outcomes for at-risk students. Specifically, reformed courses that stress collaboration, active learning, and flipped or high-structure approaches at the introductory level can have disproportionately large benefits for underrepresented students (Haak et al., 2011; Hrabowski, 2011; Eddy and Hogan, 2014).

    To date, however, most studies of interventions that benefit underrepresented students have taken place at research-intensive or comprehensive universities—not at community colleges. This is a general deficit in the literature: despite scattered examples of high-quality work in introductory biology (Kenyon et al., 2016; Schinske et al., 2016), chemistry (Mohamed, 2008; Walker et al., 2016), and physics (Hake, 1998; Lasry et al., 2008), students at 2-year schools remain highly underrepresented in discipline-based education research, or DBER (Fletcher and Carter, 2010; Singer et al., 2012; Schinske et al., 2017).

    The reasons for the dearth of DBER at 2-year schools range from systemic to logistical (Schinske et al., 2017). Community college instructors are hired and promoted based on teaching experience and expertise—not research productivity. Further, they usually lack access to the graduate students and postdoctoral fellows who often implement DBER work, the statistical expertise required to analyze complex data sets, and the institutional support required for grant writing and administration, including an institutional review board. In addition, most community college classes are small, meaning that course-based studies may not have the statistical power required to discern the 3–6% improvements in exam scores that are typical of successful interventions (Freeman et al., 2014), much less examine their impact on subpopulations of interest.

    Two concerns are relevant here: national policy goals in STEM education cannot be met unless community college students and the communities they represent are included in educational reform efforts, and progress in DBER may stall without an influx of creativity, energy, and insights from community college faculty. Recognizing this, the NSF and other funding agencies are supporting partnerships between DBER specialists at 4-year schools and their colleagues at 2-year institutions (Fletcher and Carter, 2010).

    The work reported here is the result of one such collaboration. The specific question we asked was, Can community college students who are enrolled in an introductory biology course with moderate structure (Freeman et al., 2011) benefit from the addition of preclass, online, open-book reading quizzes?

    Reading quizzes or other techniques for assessing preclass preparation are being widely implemented in reformed STEM courses that follow the flipped or just-in-time or high-structure model (Mazur, 1997; Novak et al., 1999; Crouch et al., 2007; Freeman et al., 2011; Heiner et al., 2014; Hodges et al., 2015). Data from 4-year schools suggest that reading quizzes can be effective in improving exam scores—presumably because students get more out of class sessions (Narlock et al., 2006; Dobson, 2008; Johnson and Kiviniemi, 2009; Smith et al., 2010). But a recent study questioned whether reading quizzes were responsible for the success of flipped classrooms (Jensen et al., 2015), and to our knowledge, the efficacy of online, open-book reading quizzes has not been assessed in the community college setting. This last point is important, as Clement (2016) recently documented the constraints that community college students face in terms of time and study skills, and suggested that increasing course structure might be an effective strategy to encourage self-regulated learning skills in community college students. In a course with increased structure, the instructor creates graded (required) assignments that help students do the deliberate practice required to perform better on exams—instead of leaving students on their own to figure out how to study. If Clement’s hypothesis is correct, then online reading quizzes and other elements of high structure may be particularly beneficial in the community college setting.

    This paper has two goals: 1) to enrich the literature on the efficacy of reading quizzes as a course intervention and 2) to furnish an example of how collaborative DBER projects can be carried out in community colleges.

    METHODS

    Course Background and Design

    We implemented this experiment at Everett Community College in an introductory biology course called Biology 221. The course introduces evolution, diversity of life, and ecology and is the first in a three-quarter sequence designed for students who intend to transfer to a 4-year school and major in the life sciences. Biology 221 includes a 3-hour laboratory, two 50-minute class sessions, and one 110-minute class session per week.

    Table 1 summarizes the structure of the course during the six quarters (hereafter called terms) in the study. Each was taught in fall. The years 2009, 2010, and 2011 were control terms, when the course included occasional worksheets on the assigned reading but no formal preclass assessments with course points at stake. Experimental terms occurred in 2012, 2013, and 2015 and included daily reading quizzes that were graded right/wrong for a small number of course points. The reading quiz questions were multiple choice and administered online, and thus open book and open notes. The questions (generated by S.F. and P.P.-L.) were primarily designed to cover basic vocabulary and concepts, although some did refer students to their text to interpret data from specific graphs or figures (see Supplemental Material, Appendix 1). Reading quiz questions did not appear directly on exams, although other questions requiring understanding of the same content assessed by reading quizzes did appear on exams. Although there were slight variations in the types of graded assignments and the exam schedule among years in the study, the only systematic difference between the three control years and the three experimental years was the introduction of online, open-book reading quizzes (Table 1).

    TABLE 1. Course structure in control vs. experimental years^a

    ^a The n values are the number of students who took all exams and completed the course. The percentages indicate the proportion of total course points assigned to each major class component. Worksheets or class prep exercises were assigned one to three times per week but were seldom collected (three to four times per term) and graded for participation. See text for notes on clicker use and grading. BIOMATH homework covered basic skills; the video assignment was an in-class video on evolution followed by a 10-point worksheet; online assignments were mastery-based self-quizzes available on textbook publisher websites. SimBio exercises were computer simulations with associated workbooks. All midterms had 100 points possible; finals had either 150 or 175 points possible.

    ^b Reading quizzes were administered in a learning management system and were due before class; they had an open-book, open-note format and consisted primarily of multiple-choice questions. See Supplemental Material, Appendix 1, for an example.

    ^c Includes online quizzing via publishers’ system.

    ^d Quizzes were administered during the first 10 minutes of class, consisted of multiple-choice questions, and had a closed-book, closed-note format.

    ^e Sum of percentages does not equal 100 due to rounding.

    In all six terms, the class sessions consisted of Socratic lecturing—meaning, occasional instructor-posed questions with responses elicited from student volunteers—and clicker questions. The instructor posed an average of four clicker questions per class, and asked for peer discussion and reanswering if individual responses were less than 80% correct. Clickers were graded right/wrong. Because all terms in the study included clickers and a homework assignment every 1 to 2 weeks, we considered the course to have a moderate structure design (Freeman et al., 2011).

    Student Demographics

    At the time of the study, there were no residence halls on the Everett Community College campus; all students commuted, and most lived at home. Average age in the general student population is 28; 35% of all students work, and 21% have dependents. In addition, 52% of all students are considered first generation, meaning that they would be the first in their family to complete a 4-year college degree. As the online reading quiz intervention occurred only in the first author’s (P.P.-L.) classroom, the maximum possible sample (class) size was 48 (n values given in Table 1).

    Table 2 describes student characteristics in terms of sex and ethnicity/nationality for individuals who participated in the study. Table 1 shows the number of students in each term (range = 34–74, with a total n = 251 students).

    TABLE 2. Study participant ethnicity and gender

    Ethnicity/nationality    Percent of participants
    African American          1.4%
    Asian American           13.9%
    International             5.8%
    Latinx                    5.8%
    Mixed/multiple            5.3%
    Native American           1.4%
    White                    60.6%
    Not stated                6%

    Gender                   Percent of participants
    Female                   67%
    Male                     33%

    Experimental Conditions and Controls

    To minimize instructor effects on student performance in the control and experimental conditions, P.P.-L. was the sole instructor in all six terms analyzed here. Contact time with the instructor and total number of credit hours did not vary over the course of the study, and class size varied within a small range and in an unsystematic way (see Table 1).

    To test the hypothesis that student exam performance changed during the study because of changes in the cognitive level of exams rather than the addition of reading quizzes, we evaluated exams from all six terms on three criteria: Bloom’s level, question type, and content area. We began by preparing a document containing a random sample of 25% of the questions from each exam given during the six terms. We then recruited two raters who were experienced researchers and blind to the source of the exam questions; the raters independently assigned each question to one of the six levels in Bloom’s taxonomy of learning and later met to discuss their ratings and reach a consensus (Crowe et al., 2008). We combined this consensus value with the points per question, expressed as a percent of the total points possible on each exam, to compute a weighted Bloom’s index that estimated the overall cognitive level of each exam (Freeman et al., 2011). A weighted Bloom’s index of 33.3 represents an exam consisting entirely of questions at the conceptual level; an index of 50.0 indicates an exam consisting entirely of questions at the application level. Using the same list of randomly selected questions, a researcher categorized each question by type and by the content area assessed.
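
    As an illustration of this calculation, the short sketch below computes a weighted Bloom’s index as the points-weighted mean Bloom’s level expressed as a percentage of the maximum level (6). This is an assumption-based sketch, not the authors’ script; the formula is inferred from the reference values quoted above, assuming the conceptual and application levels correspond to levels 2 and 3 of 6.

```python
# Sketch of a weighted Bloom's index: the points-weighted mean Bloom's level,
# scaled to a 100-point index. Under this assumption, an exam entirely at
# level 2 scores 33.3 and one entirely at level 3 scores 50.0, matching the
# reference values given in the text.

def weighted_blooms_index(questions):
    """questions: list of (consensus_bloom_level, points_possible) pairs."""
    total_points = sum(points for _, points in questions)
    weighted = sum(level * points for level, points in questions)
    return 100 * weighted / (6 * total_points)

# Hypothetical exam: 80 points of level-2 questions plus 20 points of level-3
# questions yields an index slightly above the conceptual level.
print(round(weighted_blooms_index([(2, 80), (3, 20)]), 1))  # 36.7
```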

    To control for the hypothesis that student exam performance changed due to changes in student ability or preparation and not due to the addition of reading quizzes, the multiple regression model that we used to analyze course performance included each student’s average grade in the term under study in courses other than biology. In this way, we were able to analyze exam scores while controlling for the possibility of changes in academic preparation and ability across terms in the study. We were unable to use overall college grade point average (GPA) as an index of prior student achievement, because many study participants took the focal course in their first term in college. We were also unable to include standardized admission exam scores as an index of student preparation and ability, because Everett Community College is an open-admission institution; administrators use a variety of placement exams solely for the purpose of advising students into appropriate courses—not to set a minimum standard for admission.

    Data Analysis: Impact of Reading Quizzes on Exam Performance

    The outcome variable in our analysis was earned exam points, expressed as a proportion of possible exam points to account for variation in the total exam points per term over the course of the study. We focused on earned exam points, because the number and nature of non-exam points was not consistent in the control versus experimental terms, and exams are a measure of student learning.

    We used backward-stepwise model selection to determine the final model reported in the Results section. We started with a full model that allowed for the possibility that the impact of the treatment would be moderated by student demographics (gender or ethnicity/nationality). We tested first whether each interaction increased the fit of the model to the data and then each main effect. The initial full model was

    Exam scores (% correct) ∼ GPA in semester student took introductory biology (not including biology grade) + Treatment × Gender + Treatment × Ethnicity/Nationality

    Once we established the final model (see Supplemental Table S2 for more details on the results of model selection), an examination of residual plots indicated that the model conformed to the assumption of normally distributed residuals.
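
    For readers who want to reproduce this kind of analysis, the sketch below shows one way to fit the full model and carry out a backward-stepwise comparison of nested models in Python with statsmodels. It is a minimal illustration rather than the authors’ code; the data file and column names (exam_prop, nonbio_gpa, treatment, gender, ethnicity) are hypothetical.

```python
# Minimal sketch of the model-selection approach described above, using
# ordinary least squares with R-style formulas.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("student_exam_data.csv")  # one row per student (hypothetical file)

# Full model: the treatment effect is allowed to vary by gender and by
# ethnicity/nationality, with non-biology GPA as a control.
full = smf.ols(
    "exam_prop ~ nonbio_gpa + treatment * gender + treatment * ethnicity", data=df
).fit()

# Backward step: drop one interaction and test whether fit worsens (F-test on
# nested models); repeat for each interaction, then for each main effect.
no_gender_int = smf.ols(
    "exam_prop ~ nonbio_gpa + treatment * ethnicity + gender", data=df
).fit()
print(anova_lm(no_gender_int, full))

# Final model retained in the Results: non-biology GPA, ethnicity/nationality,
# and treatment, with no interactions.
final = smf.ols("exam_prop ~ nonbio_gpa + ethnicity + treatment", data=df).fit()
print(final.summary())
```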

    Structure of the Research Collaboration

    We initiated the collaboration while preparing a grant proposal focused on testing how well elements of the high-structure course design developed at the University of Washington (Freeman et al., 2011) transferred to other institutions. Although the faculty who participated in this study had interacted extensively before in other contexts, the community college partner (P.P.-L.) was new to DBER at the time this work started, while the research university partners (S.E. and S.F.) had published previously.

    Our goal in structuring the collaboration was to achieve equity and create synergies based on complementary strengths, in keeping with the framework for community-based participatory research (Hacker, 2013). This framework aspires to collaborations that build on the strengths and resources of each partner; facilitate equity in expectations, roles, and benefits; and involve all partners in the dissemination of results (Schinske et al., 2017).

    Human Subjects Oversight

    This work was conducted with review and approval by the Institutional Review Board of Everett Community College and the University of Washington Human Subjects Division, application 38945.

    RESULTS

    Table 3 reports the average weighted Bloom’s index for exam questions in each term of the study. The average values varied over the six terms within a range of 5.7 units on the index’s 100-point scale, and in each term analyzed here, exams averaged close to or slightly above the conceptual level. The average Bloom’s level of exams during the experimental terms was slightly higher than during the control terms, and the distribution of individual question values (points possible × Bloom’s level) was also very similar between control and experimental terms. Regarding question type, the exams were 94% multiple choice in the control terms and 93% multiple choice in the experimental terms; the remaining questions required students to draw a graph or phylogenetic tree, fill in a blank, or construct an open response (Supplemental Table S1a). Content coverage was also similar: we identified a total of 12 topic categories, and in nine of these 12, the difference in the number of questions asked in control versus experimental terms was less than 3% (Supplemental Table S1b).

    TABLE 3. Weighted Bloom’s index, by year

    Treatment          Year    Average weighted Bloom’s index of exams
    Control            2009    32.1
    Control            2010    33.1
    Control            2011    33.5
    Reading quizzes    2012    37.8
    Reading quizzes    2013    34.6
    Reading quizzes    2015    33.5

    Figure 1 provides box plots of the raw exam scores for each term in the study. Supplemental Table S2 outlines the model-selection process that resulted in the final model predicting the percent of exam points a student earned: Percent correct ∼ GPA in term student took biology + ethnicity/nationality + treatment. Table 4 provides the summary statistics from the regression analysis with the proportion of total exam points correct as the outcome variable. On the basis of the model, a student in the treatment group earned, on average, 4.9% more of the total exam points than a student with the same non-biology GPA in the control group. This means that the average student in our sample (average GPA and average race/ethnicity/nationality) in the control condition was predicted to earn 69.1% of the exam points possible, while the equivalent student in the treatment was predicted to earn 74.0% of the possible points (Figure 2).
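
    The model-predicted means quoted above (and plotted in Figure 2) can be generated, in outline, by predicting from the final model for a hypothetical student with the sample-average non-biology GPA under each condition. The sketch below reuses the hypothetical data file and column names from the Methods sketch and shows a single ethnicity group for simplicity; it is not the authors’ code.

```python
# Sketch of how model-predicted means and 95% confidence intervals (as in
# Figure 2) could be produced from the final model. File name, column names,
# and category labels are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_exam_data.csv")  # hypothetical data file
final = smf.ols("exam_prop ~ nonbio_gpa + ethnicity + treatment", data=df).fit()

new = pd.DataFrame({
    "nonbio_gpa": [df["nonbio_gpa"].mean()] * 2,
    "ethnicity": ["White"] * 2,  # one group shown; the paper averages across groups
    "treatment": ["control", "reading quizzes"],
})
pred = final.get_prediction(new).summary_frame(alpha=0.05)
print(pred[["mean", "mean_ci_lower", "mean_ci_upper"]])  # predicted proportion, 95% CI
```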

    FIGURE 1.

    FIGURE 1. Raw changes in exam performance. Proportion of exam points earned, by treatment and year. Raw data are uncorrected by the control variables used in the regression model. The boxes indicate the first and third quartiles; the whiskers extend to 1.5 times the interquartile range; the dots are data points outside this range; the bold line within each box is the median proportion of exam points earned in that year.

    FIGURE 2.

    FIGURE 2. All students perform better on exams with reading quizzes than in the control. Model-predicted average percent of exam points correct in the treatment vs. control terms, controlling for cumulative GPA earned in the term a student took biology (not including the biology grade) and ethnicity/nationality. Bars represent 95% confidence intervals around the predicted mean. Sample sizes (n) by treatment, with control listed first: Asian American: 16, 9; international: 4, 8; multiracial: 7, 4; underrepresented minority (URM): 10, 6; and white: 54, 59. Not all students reported their ethnicity.

    TABLE 4. Regression statistics^a

                                  Estimate    SE        t value    p value
    (Intercept)                    0.54       0.0207     26.0      <0.0001
    Non-Bio GPA                    0.062      0.0068      9.5      <0.0001
    Treatment                      0.049      0.0127      3.84      0.0002
    Underrepresented minority     −0.019      0.0223     −0.84      0.40
    Asian                          0.032      0.0186     −1.73      0.086
    International                 −0.152      0.0253     −6.00     <0.0001
    Multiracial                   −0.005      0.0263     −0.196     0.84

    ^a Residual SE: 0.083 on 170 degrees of freedom. Multiple R²: 0.46; adjusted R²: 0.44. F-statistic: 24.05 on 6 and 170 DF; p value < 2.2e-16.

    To evaluate how well the collaboration conformed to the goals of community-based participatory research, we recorded the roles played by each researcher. The community college partner took on the tasks of defining the question and experimental intervention, designing all assessments, serving as instructor of record in all terms studied, obtaining permission from the institutional review board, and organizing the raw data. The colleagues at a 4-year school advised on all aspects of experimental design and implementation and were primarily responsible for obtaining grant funding and performing data analysis and preparation of figures and tables. All three members of the research team contributed to manuscript development.

    DISCUSSION

    The reported average 4.9% gain in student exam score is consistent with the 2–5% increases observed in similar experiments at 4-year institutions (Narlock et al., 2006; Dobson, 2008; Johnson and Kiviniemi, 2009), suggesting that reading quizzes can be just as effective in the community college setting as they have been shown to be at institutions with selective admissions and thus generally better-prepared student populations. Given that reading quizzes are considered a low-cost intervention in terms of the instructor time required to create and implement them, our data indicate that they could be a highly attractive intervention for community college instructors with large teaching loads.

    In addition to extending the literature on reading quizzes to the community college setting, our data support the hypothesis that increasing the level of course structure and the amount of active learning in courses that have already introduced some elements of evidence-based teaching can increase student performance (Freeman et al., 2011; Connell et al., 2016; Elliot et al., 2016). Stated another way, dosage may matter, if instructors are using evidence-based prescriptions. But as noted in a recent meta-analysis on the efficacy of active learning, much more work remains to be done on this question (Freeman et al., 2014).

    It is unlikely that the increase in exam scores during this study’s experimental terms resulted from a reduction in the rigor of exam questions, as the average cognitive level of exams in the experimental terms appeared to be slightly higher than in the control terms. Summative assessments were very similar between control and experimental terms and drew on similar questions. In addition, question types and content coverage did not differ in a meaningful way between control and experimental terms.

    Even though it is reasonable to rule out changes in the cognitive level, question type, and content coverage of exams to explain our data, we do not know the mechanism responsible for the benefit conferred by reading quizzes. We suggest that one or more of the following non–mutually exclusive hypotheses are important: 1) effective preclass preparation allows students to achieve better mastery of the material during class sessions (e.g., Gross et al., 2015); 2) a transfer of course points from high-stakes exams to low-stakes reading quizzes—10% in this case—might reduce anxiety enough to benefit test-anxious students (Cassady and Johnson, 2002); and/or 3) the frequent practice provided by reading quizzes contributes to a testing effect that improves information recall (Roediger and Butler, 2011). The question of causation deserves further study, because flipping and other course designs that include structured preclass preparation are increasing in popularity.

    Study Limitations

    It is important to note that the successful implementation of online, open-book reading quizzes reported here occurred in a course that already included elements of moderate course structure, including evidence-based use of clickers in each class and weekly homework outside class. It is not clear whether reading quizzes can be effective in courses that do not have some element of active learning in class. We also have no data to evaluate the relative strengths of the different types of preclass assessments currently being widely used: 1) open-book and “open-Web” online reading quizzes that are not timed, are administered in a Learning Management System, and are due before class starts; 2) traditional reading quizzes with a closed-book, closed-note, timed format, given at the start of class, which take up instructional time but may require deeper student effort and understanding; and 3) open-response, guided-reading questions, which may train better study skills but are more difficult to grade. We also did not investigate the trade-off between standard textbook reading assignments and preclass modules dominated by videos or animations—an alternative that may benefit students who are still building their reading comprehension skills.

    Owing to its quasi-random and longitudinal design, this study also cannot address the hypothesis that unmeasured changes in the student population are responsible for the difference in performance, or the hypothesis that the instructor of record simply became a better teacher over time. We view both explanations as unlikely, however. Our analyses used college GPA in nonfocal courses as a control for changes in student ability or preparation, which other studies have shown is a strong predictor of student course performance in introductory biology (e.g., Freeman et al., 2007). In addition, we have no reason to suspect that significant changes in student motivation, self-efficacy, or other aspects of psychological or emotional state occurred over the study interval. In terms of instructor effects, P.P.-L. began teaching majors’ biology courses at Everett Community College in 2000, had previously taught at other 2-year and 4-year institutions, and as such was an experienced midcareer instructor during the first control term of 2009. Although the impact of teaching experience on student outcomes has not been studied extensively at the undergraduate level, recent work in K–12 education suggests that the most rapid improvement in student outcomes occurs during an instructor’s first 1–2 years of teaching (reviewed in Kini and Podolsky, 2016). Further, the instructor in the study reported here had already experimented with an array of innovations (Table 1), suggesting that if better teaching occurred, online reading quizzes were the primary cause.

    DBER in the Community College Environment

    Ongoing research is not an expectation at Everett Community College. Instead, motivation for this participant sprang from a desire to better understand how to facilitate student learning and increase performance, and thereby make a larger number of interested students eligible to transfer to a 4-year institution. Even with intrinsic motivation, however, several important constraints on biology education research at community colleges have been identified (Schinske et al., 2017). The time constraint was reduced by holding the initial project design meetings during the summer, when P.P.-L. was not engaged in teaching. These sessions were facilitated by a modest amount of summer salary support from the grant that supported this work. Subsequent work on the project was conducted outside P.P.-L.’s typical work schedule. The existence of an Institutional Review Board at Everett Community College facilitated this research project, but lack of other infrastructure was a hindrance. For example, access to grant funds, statistical expertise, and most of the relevant research journals was provided through the 4-year partners. Other than IRB oversight, P.P.-L. did not seek buy-in from administrators or peers, as the course interventions occurred only in this author’s classroom and are consistent with the general effort to implement evidence-based pedagogies at community colleges. The community college did, however, provide travel funds to support a poster presentation about a different project at a national conference. This was important, as the professional development provided by participating in national meetings, along with subsequent selection as a Partnership for Undergraduate Life Sciences Education Leadership Fellow in 2012, augmented P.P.-L.’s intrinsic motivation.

    CONCLUSION

    Despite its limitations, our work suggests that reading quizzes can have important benefits for community college students and that collaborations between community college instructors and discipline-based researchers at large research-intensive universities can be extraordinarily fruitful. Based on the division of roles achieved in the project, the collaboration appeared to be consistent with the framework for community-based participatory research (Schinske et al., 2017). Although we do not have direct measures of the collaboration’s success beyond the publication of this work, it is notable that subsequent to this project’s start, the community college partner independently became an active participant in several other DBER initiatives (Aguirre et al., 2013; Schinske et al., 2017).

    When designing collaborations like this one, however, it is important to recognize the need for multiple years of study. In our case, it took six terms of data collection to be confident about the pattern reported here, even though the class size of ∼50 is large by community college standards. One solution may be to collect data from multiple sections taught by the same instructor within a term, or from multiple instructors carrying out the same intervention in different sections of the same course. We argue that the extra effort involved in collecting sufficient data is more than worthwhile, given the impact that community colleges have on the total student population and especially on outcomes for at-risk students. Most low-income and underrepresented students begin their college careers at community colleges (CCRC, 2017). It is time for DBER to reach those classrooms.

    ACKNOWLEDGMENTS

    This work was supported by a grant from the NSF (DUE 1118890). We are grateful to co–principal investigator Mary Pat Wenderoth for advice and support during the design of the study and to the members of the University of Washington Biology Education Research Group for helpful comments on an early draft of the article.

    REFERENCES

  • Aguirre, K. M., Balser, T. C., Jack, T., Marley, K. E., Miller, K. G., Osgood, M. P., … Romano, S. L. (2013). PULSE Vision & Change Rubrics. CBE—Life Sciences Education, 12(4), 579–581. 10.1187/cbe.13-09-0183
  • American Association of Community Colleges. (2017). Fast Facts. Retrieved September 16, 2017, from www.aacc.nche.edu/AboutCC/Documents/AACCFactSheet2017.pdf
  • Cassady, J. C., & Johnson, R. E. (2002). Cognitive test anxiety and academic performance. Contemporary Educational Psychology, 27(2), 270–295. 10.1006/ceps.2001.1094
  • Clement, L. (2016). External and internal barriers to studying can affect student success and retention in a diverse classroom. Journal of Microbiology and Biology Education, 17(3), 351–359. 10.1128/jmbe.v17i3.1077
  • Community College Research Center (CCRC). (2017). Frequently Asked Questions. Retrieved March 15, 2017, from https://ccrc.tc.columbia.edu/Community-College-FAQs.html
  • Connell, G. L., Donovan, D. A., & Chambers, T. G. (2016). Increasing the use of student-centered pedagogies from moderate to high improves student learning and attitudes about biology. CBE—Life Sciences Education, 15(1), ar3. 10.1187/cbe.15-03-0062
  • Crouch, C. H., Watkins, J., Fagen, A. P., & Mazur, E. (2007). Peer instruction: Engaging students one-on-one, all at once. In Redish, E. F., & Cooney, P. J. (Eds.), Reviews in physics education research, Vol. 1: Research-based reform of university physics. College Park, MD: American Association of Physics Teachers. Retrieved March 15, 2017, from www.per-central.org/items/detail.cfm?ID=4990
  • Crowe, A., Dirks, C., & Wenderoth, M. P. (2008). Biology in Bloom: Implementing Bloom’s taxonomy to enhance student learning in biology. CBE—Life Sciences Education, 7(3), 368–381. 10.1187/cbe.08-05-0024
  • Dobson, J. L. (2008). The use of formative online quizzes to enhance class preparation and scores on summative exams. Advances in Physiology Education, 32(4), 297–302. 10.1152/advan.90162.2008
  • Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, 13(4), 453–468. 10.1187/cbe.14-03-0050
  • Elliott, E. R., Reason, R. D., Coffman, C. R., Gangloff, E. J., Raker, J. R., Powell-Coffman, J. A., & Ogilvie, C. A. (2016). Improved student learning through a faculty learning community: How faculty collaboration transformed a large-enrollment course from lecture to student-centered. CBE—Life Sciences Education, 15(2), ar22. 10.1187/cbe.14-07-0112
  • Fletcher, L. A., & Carter, V. C. (2010). The important role of community colleges in undergraduate biology education. CBE—Life Sciences Education, 9(4), 382–383. 10.1187/cbe.10-09-0112
  • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA, 111(23), 8410–8415. 10.1073/pnas.1319030111
  • Freeman, S., Haak, D., & Wenderoth, M. P. (2011). Increased course structure improves performance in introductory biology. CBE—Life Sciences Education, 10(2), 175–186. 10.1187/cbe.10-08-0105
  • Freeman, S., O’Connor, E., Parks, J. W., Cunningham, M., Hurley, D., Haak, D., & Wenderoth, M. P. (2007). Prescribed active learning increases performance in introductory biology. CBE—Life Sciences Education, 6(2), 132–139. 10.1187/cbe.06-09-0194
  • Gross, D., Pietri, E. S., Anderson, G., Moyano-Camihort, K., & Graham, M. J. (2015). Increased pre-class preparation underlies student outcome improvement in the flipped classroom. CBE—Life Sciences Education, 14(4), ar36. 10.1187/cbe.15-02-0040
  • Haak, D. C., HilleRisLambers, J., Pitre, E., & Freeman, S. (2011). Increased course structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216. 10.1126/science.1204820
  • Hacker, K. (2013). Community-based participatory research. Thousand Oaks, CA: Sage.
  • Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64–74. 10.1119/1.18809
  • Heiner, C. E., Banet, A. I., & Wieman, C. (2014). Preparing students for class: How to get 80% of students reading the textbook before class. American Journal of Physics, 82(10), 989–996. 10.1119/1.4895008
  • Hodges, L. C., Anderson, E. C., Carpenter, T. S., Cui, L., Malinky Gierasch, T., Leupen, S., … Wagner, C. R. (2015). Using reading quizzes in STEM classes—the what, why, and how. Journal of College Science Teaching, 45(1), 49–55.
  • Hrabowski, F. A., III. (2011). Boosting minorities in science. Science, 331(6014), 125. 10.1126/science.1202388
  • Jensen, J. L., Kummer, T. A., & Godoy, P. D. D. M. (2015). Improvements from a flipped classroom may simply be the fruits of active learning. CBE—Life Sciences Education, 14(1), ar5. 10.1187/cbe.14-08-0129
  • Johnson, B. C., & Kiviniemi, M. T. (2009). The effect of online chapter quizzes on exam performance in an undergraduate social psychology course. Teaching of Psychology, 36(1), 33–37. 10.1080/00986280802528972
  • Kenyon, K. L., Onorato, M. E., Gottesman, A. J., Hoque, J., & Hoskins, S. G. (2016). Testing CREATE at community colleges: An examination of faculty perspectives and diverse student gains. CBE—Life Sciences Education, 15(1), ar8. 10.1187/cbe.15-07-0146
  • Kini, T., & Podolsky, A. (2016). Does teaching experience increase teacher effectiveness? A review of the research. Palo Alto, CA: Learning Policy Institute.
  • Labov, J. B. (2012). Changing and evolving relationships between two- and four-year colleges and universities: They’re not your parents’ community colleges anymore. CBE—Life Sciences Education, 11, 121–128. 10.1187/cbe.12-03-0031
  • Lasry, N., Mazur, E., & Watkins, J. (2008). Peer instruction: From Harvard to the two-year college. American Journal of Physics, 76(11), 1066–1069. 10.1119/1.2978182
  • Mazur, E. (1997). Peer instruction. Upper Saddle River, NJ: Prentice Hall.
  • Mohamed, A.-R. (2008). Effects of active learning variants on student performance and learning perceptions. International Journal of Scholarship in Teaching and Learning, 2(2), 1–15. 10.20429/ijsotl.2008.020211
  • Narlock, R., Garbin, C. P., & Turnage, K. D. (2006). Benefits of prelecture quizzes. Teaching of Psychology, 33(2), 109–112. 10.1207/s15328023top3302_6
  • National Academy of Sciences. (2011). Expanding underrepresented minority participation: America’s science and technology talent at the crossroads. Washington, DC: National Academies Press.
  • National Science Foundation. (2010). National Survey of Recent College Graduates, Survey Year 2010. Retrieved September 16, 2017, from https://ncsesdata.nsf.gov/datatables/recentgrads/2010/
  • Novak, G. M., Patterson, E. T., Gavrin, A. D., & Christian, W. (1999). Just-in-time teaching: Blending active learning with Web technology. Upper Saddle River, NJ: Prentice Hall.
  • President’s Council of Advisors on Science and Technology. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC.
  • Renick, T. (2016). Complete college Georgia 2016 status report. Atlanta: Georgia State University.
  • Roediger, H. L., III, & Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends in Cognitive Sciences, 15(1), 20–27. 10.1016/j.tics.2010.09.003
  • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, D. M., Brownell, S. E., Carter, R. S., … Corwin, L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE—Life Sciences Education, 16(2), mr1. 10.1187/cbe.16-10-0289
  • Schinske, J. N., Perkins, H., Snyder, H., & Wyer, M. (2016). Scientist spotlight homework assignments shift students’ stereotypes of scientists and enhance science identity in a diverse introductory science class. CBE—Life Sciences Education, 15(3), ar47. 10.1187/cbe.16-01-0002
  • Selingo, J., Carey, K., Pennington, H., Fishman, R., & Palmer, I. (2013). The next generation university. Washington, DC: New America Foundation.
  • Shapiro, D., Dundar, A., Wakhungu, P. K., Yuan, X., Nathan, A., & Hwang, Y. (2015). Completing college: A national view of student attainment rates—Fall 2009 cohort. Herndon, VA: National Student Clearinghouse Research Center.
  • Singer, S. R., Nielsen, N. R., & Schweingruber, H. A. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. Washington, DC: National Academies Press.
  • Smith, B. L., Holliday, W. G., & Austin, H. W. (2010). Students’ comprehension of science textbooks using a question-based reading strategy. Journal of Research in Science Teaching, 47(4), 363–379. 10.1002/tea.20378
  • Summers, M. F., & Hrabowski, F. A., III. (2006). Preparing minority scientists and engineers. Science, 311(5769), 1870–1871. 10.1126/science.1125257
  • Walker, J. P., Sampson, V., Southerland, S., & Enderle, P. J. (2016). Using the laboratory to engage all students in science practices. Chemistry Education Research and Practice, 17(4), 1098–1113. 10.1039/C6RP00093B