
Implementation of Open Textbooks in Community and Technical College Biology Courses: The Good, the Bad, and the Data

    Published Online: https://doi.org/10.1187/cbe.19-01-0022

    Abstract

    One challenge facing students today is high textbook costs, which pose a particularly difficult obstacle at community and technical colleges, where students typically have lower incomes and textbooks constitute a larger proportion of the overall cost of education. To address this, many advocate for using open-source textbooks, which are free in a digital format. However, concerns have been raised about the quality and efficacy of open textbooks. We investigated these concerns by collecting data from general biology classes at four community and technical colleges implementing traditionally published (non-open) and open textbooks. We compared student outcomes, textbook utilization methods, and perceptions of textbooks in these courses. In generalized linear statistical models, book type (open vs. non-open) did not significantly influence measured student outcomes. Additionally, survey results found that students and faculty perceived the open textbook as equal in quality to other textbooks. However, results also suggested that student textbook use did not always align with faculty expectations. For example, 30% of students reported reading their textbooks compared with 85% of faculty expecting students to read the textbook. Finally, faculty who implemented open textbooks expected the textbook to be used more often for reference and review compared with faculty who use traditional textbooks.

    INTRODUCTION

    One of the challenges facing students today is high textbook costs. In 2017, the average cost of textbooks per student at public community and technical colleges reached $1440 per year—approximately 40% of the cost of tuition (Ma et al., 2018). As textbook costs climb, students are beginning to rely more on financial aid to cover the cost. At community colleges, 50% of students are using an average of $300 per semester of financial aid to cover the cost of textbooks (Senack and Donaghue, 2016).

    Beyond the direct financial impact, the high cost of textbooks influences many decisions made by students. A 2016 report by the Florida Virtual Campus found that 66.6% of college and university students are forgoing the purchase of expensive books, and at the associate’s-degree level, students report taking fewer courses due to textbook costs (Fischer et al., 2015; Donaldson and Shen, 2016). Students who are economically disadvantaged and/or part-time, common student attributes for community and technical college students, also have lower persistence rates (Paulsen and St. John, 2002; National Student Clearinghouse Research Center, 2018). These students may be particularly aided by reducing textbook costs, as it would make a larger impact on the total cost of their education.

    To ameliorate this expense, many have advocated for using open educational resources (OER). OER are defined as educational materials under an open license or in the public domain that permit “no-cost access, use, adaptation and redistribution by others with no or limited restrictions” (Hewlett Foundation, 2018). OER are now widely available; in fact, many open textbooks (a subset of OER) have been prepared that can directly replace expensive commercial textbooks. These textbooks are not only available for free online, they also use Creative Commons licenses so that teachers and others can revise them to better meet the needs of a specific student body (Johnstone, 2005; Bissell, 2009; D’Antoni, 2009). One commonly used set of open textbooks is produced by Rice University’s OpenStax (2013) (www.openstax.org). These textbooks are closer to traditional textbooks (also referred to as “published,” “commercial,” or “non-open” textbooks) than many other OER, as printed copies are available for purchase and they are written and peer reviewed by experts (Boyd, 2016). Notably, the OpenStax Biology text has been adopted and reviewed by major universities such as the University of Georgia (Watson et al., 2017).

    Value perceptions may lead faculty or students to question the quality of a free, open textbook. A study by Piehl (1977) found that the perceived value of a textbook was related to the price of the textbook. Further studies have noted that perceived value is multidimensional and can be influenced by subject and object interaction, situational variables, and personal perceptions (Sanchez-Fernandez and Iniesta-Bonillo, 2007). Student engagement with course materials is important, as positive student perceptions of course materials have a positive impact on student performance (Struyven et al., 2008). Therefore, if students perceive a textbook to be valuable, it may make a difference to their performance.

    Fortunately, the literature on OER reviewed in the next few paragraphs suggests that, once implemented, these materials are perceived as high quality by faculty and students and are not harmful to student learning. In two survey-based studies by Bliss et al. (2013a,b), ∼600 students and 70 instructors from 15 colleges and universities were surveyed on their perceptions of OER. The majority of instructors found the open textbooks to be of equal or better quality than traditional textbooks. Also, around 50% of the surveyed students felt the open text was equal to a typical text, and 40% felt it was an improvement over a traditional textbook (Bliss et al., 2013a,b). Two specific benefits identified by students were the technical advantages of a digital text and the fact that it was free. In their study of OpenStax Biology, Watson et al. (2017) found that the majority of the ∼1000 students surveyed rated the textbook as being the same quality or better than other texts. These studies did not include specific student outcome data.

    Hilton (2016) performed an analysis of 16 studies that focused on OER efficacy in addition to perceptions from survey data. He found that, across nine efficacy studies (involving around 45,000 students), 93% of students achieved the same or better outcomes when OER were implemented. In addition, nine perception studies surveyed a combined 4500 students and faculty to determine that more than half the students and two-thirds of faculty found OER to be comparable or superior to traditional resources, while a small minority found them to be inferior (Hilton, 2016).

    The studies described earlier pool results across many disciplines and colleges, which may obscure patterns that could more easily be interpreted at the level of a single course or one college. A few studies have addressed OER within a single discipline. First, Hilton et al. (2013) found that, after OER adoption by a math department at a community college, student scores on department exams were essentially unchanged compared with previous semesters when traditional textbooks had been used. Moreover, they found that ∼80% of students reported that they would recommend OER to their classmates. Second, Bowen et al. (2012) compared students in a statistics class at a large university who took either the face-to-face class with a traditional textbook or a hybrid class that used OER. Students performed equally well on the final examination, regardless of which version of the class they took. Third, Pawlyshyn et al. (2013) reported the use of OER in a basic math course at a small, private, 4-year college. They compared the results of students enrolled in the course and found that pass rates increased slightly once all courses used OER. Finally, Grisset and Huffman (2018) compared an open and a published textbook in a psychology class at a small public university. Similarly, they found no significant differences in exam scores or course grades. Interestingly, they also found that students preferred the textbook they used over the option they did not use.

    There are a few studies in a growing field that focus on OER efficacy and perceptions in postsecondary science classrooms. One study, done in a chemistry course at University of California, Davis (Allen et al., 2015), compared an experimental class of around 500 students who used the OER ChemWiki with a control class of around 500 who used a commercial textbook. The sections were taught in the same semester by the same instructors. Researchers found no significant differences between the two groups on overall exam results (both groups were given the same exams) or individual learning gains (determined by comparing pretests to final exam scores). It is important to remind ourselves that a lack of a difference between OER and non-OER student groups is a desired outcome, because it was originally predicted that these resources would be lower quality. In addition, if there is no difference in learning outcomes, the cost savings alone is an argument for use of OER. A second study, Colvard et al. (2018), examined the use of OER materials in several courses, including four undergraduate biology courses at the University of Georgia. They did find significant learning gains across all the courses they studied (with around 10,000 students in the OER group and 11,000 in the non-OER group). However, they did not report results specific to the biology courses.

    Neither of these studies focused on biology specifically, and both were done at large, 4-year universities. In general, ∼97% of the biology education research data comes from 4-year institutions (Schinske et al., 2017). Considering that 50% of STEM undergraduates begin their scientific studies at community and technical schools (National Science Foundation, National Center for Science and Engineering Statistics, 2010) and considering the high diversity present in these student populations, it is imperative that we broaden our understanding about the efficacy and perceptions of biology-specific OER in community and technical colleges. One study was performed by Fisher (2018), who surveyed a group of community college environmental science students on their perceptions of a newly created environmental science open textbook. He found that the majority of students surveyed rated the quality of this open textbook as the same or better than a traditional textbook (Fisher, 2018). Increased utilization of OER can also make higher education more inclusive, as it has the potential to considerably lessen cost and increase access to learning materials for many community and technical college students who cannot afford to purchase textbooks.

    The present study serves to address the gap in the literature by examining implementation of an open textbook in general biology courses at four community and technical colleges. We investigate the following questions: 1) Do we see any differences in student outcomes in courses using an open textbook compared with courses using a published textbook? 2) What are perceptions of students and faculty regarding the quality of their textbooks? 3) What are the faculty expectations and student reports on the utilization of textbooks in their biology courses?

    METHODS

    Context

    This study took place at four community and technical colleges that are all part of the Minnesota State Colleges and Universities system (MinnState). This is a large public system of higher education that serves nearly 400,000 students each year. The system serves many students from groups underrepresented in higher education, which include students of color, first-generation students, and low-income students (using Pell-grant eligibility as a proxy). The individual college demographics are summarized in Table 1. Anoka Technical College is a rural–suburban college that serves ∼2000 students. Century College is a suburban college that serves 12,500 students. Saint Paul College is an urban college and serves ∼9500 students. Saint Cloud Technical and Community College is a regional college that serves ∼6500 students.

    TABLE 1. Summary of demographics from the four participating community and technical colleges

    Demographic | Anoka Technical College | Century College | Saint Cloud Technical and Community College | Saint Paul College
    Male | 45% | 45% | 46% | 46%
    Female | 55% | 55% | 54% | 54%
    First-generation by federal guidelines | 73% | 46% | 69% | 68%
    Pell grant eligible | 41% | 24% | 43% | 53%
    Minority groups | 21% | 39% | 19% | 61%
    Full-time students | 46% | 41% | 48% | 39%
    Average student age | 27 | 25 | 24 | 29

    Student Outcome Data Collection

    To investigate our research questions, we collected student outcome data from four instructors, one at each college. For consistency, we used an equivalent, majors-level General Biology 1 course at all four colleges. Each professor first taught the course using a traditionally published, non-open textbook (either Campbell Biology [Reece et al., 2014], published by Pearson, or Biology by Mader and Windelspecht [2015], published by McGraw-Hill). Then, in a subsequent semester, the instructors adopted the open textbook Biology, distributed by OpenStax (2013) (https://openstax.org/details/books/biology). We selected this particular textbook because it is readily available and fairly similar to an average majors-level biology text. Also, anecdotal evidence suggests that it is a commonly used textbook, and at least one large university, the University of Georgia, has adopted it (Watson et al., 2017). No other major changes to pedagogy, content, or the course were made when the textbook was switched (i.e., highly similar tests, assignments, lecture slides, activities, etc.).

    Student outcome data were collected from Fall semester 2012 through Spring semester 2017, with most data from 2015 through 2017 (we needed to collect more distant data for one professor who switched to the open textbook earlier than the other professors). A summary of the data collected for each student is shown in Supplemental Table 1. We assessed 33 courses and 1082 students across four colleges. The courses spanned a range of modalities (face-to-face “seated,” blended online and face-to-face “hybrid,” and fully “online”), class sizes (large and small), textbook type (traditional and open), and semesters (Fall, Spring, Summer). The median course size was 29 students, with courses of 29–48 considered “large,” and courses with 18–28 considered “small.” We collected six student outcomes for each student: course percentage, mean test percentage, pass/fail grade on first exam (yes/no), class withdrawal (yes/no), received an “A” (yes/no), and received a “C,” “D,” or “F” (yes/no).

    Statistical Analysis on Student Outcomes

    Data from all colleges were aggregated into a single data set for statistical analysis. Dependent variables with binomial outcomes (yes/no; see Student Outcome Data Collection section) were analyzed using mixed-effect generalized linear models (GLMs) with logit link functions (i.e., logistic regression). Continuous variables, including course percentage (student level), mean test percentage (student level), and first exam percentage (course level), were analyzed with GLMs with a logit link function and a beta-distributed response to account for the dependent variable being bounded between 0 and 1. Course percentage and first exam percentage both included values that exceeded 1.0, due to extra credit that some instructors gave with either textbook type; these values were transformed to fall within 0 and 1 by subtracting 0.1 from all values. In all regressions, course ID nested within instructor was included as a random effect to account for clustering of students within courses (Zuur, 2009): students within a course are not independent of one another, because differences in teaching styles, grading, and other aspects of the learning environment create systematic variation by course and instructor that must be accounted for in the analysis. Institution may also influence student outcomes, due to differences in the student body and academic culture; however, because this study included only one instructor per institution, the effects of instructor and institution cannot be distinguished.
Logistic and beta GLMs were performed using the lme4 and glmmTMB packages (Bates et al., 2015; Magnusson et al., 2018), respectively, in R statistical software (v. 3.2.4).
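For readers less familiar with beta regression, the logit link and the 0.1 shift described above can be illustrated briefly. The sketch below is plain Python rather than the R/glmmTMB workflow actually used, and the grade values are hypothetical:

```python
from math import log, exp

def logit(p):
    """Logit link: map a proportion in (0, 1) onto the real line."""
    return log(p / (1.0 - p))

def inv_logit(x):
    """Inverse logit (logistic function): map back into (0, 1)."""
    return 1.0 / (1.0 + exp(-x))

def shift_into_unit_interval(values, shift=0.1):
    """The transformation described in the text: subtract 0.1 so that
    course percentages inflated above 1.0 by extra credit fall in (0, 1)."""
    return [v - shift for v in values]

# Hypothetical course percentages; the first exceeds 1.0 due to extra credit.
raw = [1.04, 0.92, 0.75]
bounded = shift_into_unit_interval(raw)
```

After the shift, every value lies strictly between 0 and 1, so the logit link (and hence a beta-distributed response) is well defined for each observation.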

    Independent variables examined for their effects on student outcomes were semester (Fall, Spring, Summer), course format (seated, online, hybrid), book type (open access vs. published), and class size category (small vs. large), with the break between small and large classes set at the median class size in the data set (29 students). Class size, semester, and format were included in the analysis to address potential confounding of the book-type effect. Variables were first screened in univariable analyses with either logistic or beta GLMs; the univariable analysis for book type is shown in Table 2. Multivariable regression is a standard way to evaluate the effect of potential predictors while simultaneously accounting for the effects of other variables (Zuur, 2009). For multivariable regression, all variables were included in a full model, which was then backward selected until a minimal model was reached. Multi- and univariable models were compared using the Akaike information criterion corrected for small sample size (AICc), and models with AICc values within 2.0 of the model with the lowest AICc were considered equally good fits for the data (Burnham and Anderson, 2002). The purpose of model selection was to find the most parsimonious model that best explained the data; variables dropped from the best-fit model can be considered to have a statistically insignificant influence on the outcome of interest, as they do not explain a significant proportion of the variation in the outcome and their inclusion does not improve the model's ability to explain the data. Results are shown in Tables 3 and 4.

    TABLE 2. Univariable models of the effect of textbook on each student outcomea

    Outcome | Variable | Estimate | SE | p value
    Likelihood of getting an “A” (n = 894) | Book type | −0.005 | 0.194 | 0.981
    Likelihood of getting a “C,” “D,” or “F” (n = 894) | Book type | 0.150 | 0.181 | 0.407
    Likelihood of withdrawal (n = 1082) | Book type | −0.227 | 0.216 | 0.293
    Likelihood of failing first exam (n = 1014) | Book type | −0.233 | 0.222 | 0.294
    Course percentage (n = 802) | Book type | −0.038 | 0.062 | 0.537
    Mean test percentage per student (n = 802) | Book type | −0.058 | 0.061 | 0.343

    aBinomial and continuous outcomes were modeled with logistic and beta regressions, respectively.

    TABLE 3. Best-fit models for each of four student outcomes with binomial outcomes (logistic regressions)a

    Model name | Variable | Estimate | SE | p value | AICc difference | AICc weight
    A. Likelihood of getting an “A” (n = 894)
     A1 | Semester: Spring | −0.162 | 0.202 | 0.422 | 0.000 | 0.457
        | Semester: Summer | 0.651 | 0.261 | 0.013
     A2 | Class size category | −0.163 | 0.238 | 0.492 | 1.600 | 0.210
        | Semester: Spring | −0.137 | 0.205 | 0.505
        | Semester: Summer | 0.558 | 0.294 | 0.058
     A3 | Class size category | −0.425 | 0.209 | 0.042 | 1.900 | 0.179
    B. Likelihood of getting a “C,” “D,” or “F” (n = 894)
     B1 | Semester: Spring | 0.585 | 0.177 | 0.001 | 0.000 | 0.730
        | Semester: Summer | −0.522 | 0.288 | 0.070
    C. Likelihood of withdrawal (n = 1082)
     C1 | Semester: Spring | 0.059 | 0.215 | 0.785 | 0.000 | 0.719
        | Semester: Summer | −1.452 | 0.405 | 0.000
    D. Likelihood of failing first exam (n = 1014)
     D1 | Semester: Spring | 0.135 | 0.222 | 0.543 | 0.000 | 0.390
        | Semester: Summer | −1.858 | 0.644 | 0.004
     D2 | Class size category | 0.422 | 0.264 | 0.110 | 0.500 | 0.297
        | Semester: Spring | 0.089 | 0.215 | 0.680
        | Semester: Summer | −1.566 | 0.664 | 0.018
     D3 | Class size category | 0.708 | 0.260 | 0.006 | 1.900 | 0.151

    aFor each outcome, all uni- and multivariable models within 2.0 AICc units of the lowest AICc are shown.

    TABLE 4. Best-fit models for each of the three student outcomes with continuous outcomes (beta regressions)a

    Model name | Variable | Estimate | SE | p value | AICc difference | AICc weight
    F. Course percentage per student (n = 802)
     F1 | Semester: Spring | −0.151 | 0.052 | 0.004 | 0.000 | 0.553
        | Semester: Summer | 0.196 | 0.067 | 0.003
     F2 | Class size category | −0.071 | 0.057 | 0.211 | 0.500 | 0.431
        | Semester: Spring | −0.143 | 0.052 | 0.006
        | Semester: Summer | 0.145 | 0.077 | 0.059
    G. Mean test percentage per student (n = 802)
     G1 | Class size category | −0.094 | 0.062 | 0.131 | 0.000 | 0.474
        | Semester: Spring | −0.116 | 0.054 | 0.030
        | Semester: Summer | 0.193 | 0.082 | 0.019
     G2 | Semester: Spring | −0.127 | 0.053 | 0.017 | 0.300 | 0.415
        | Semester: Summer | 0.257 | 0.071 | 0.000

    aFor each outcome, all uni- and multivariable models within 2.0 AICc units of the lowest AICc are shown.

    Faculty and Student Survey Data Collection

    To assess any potential differences between faculty perceptions of textbook use and actual use by students, we developed two surveys: one was distributed to biology faculty at any of the 30 MinnState community and technical colleges; the second was distributed to students in the courses of six faculty members at five MinnState community and technical institutions. We were unable to distribute student surveys to as many colleges due to institutional limitations on surveying students. The surveys were administered through the survey tool Qualtrics by the Office of Institutional Research, Planning, and Grants at Saint Paul College in 2017 and were open for 3 months. Questions included demographic questions, questions on courses taught and textbooks used, and questions on perceptions and expected or actual use of the course textbook. Answer formats varied but included Likert-like options, multiple choice, yes/no, and open-ended comments. To test for significant differences between the proportions choosing each answer, we used Fisher’s exact test, which is more accurate than chi-squared tests for small sample sizes (McDonald, 2014).
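Fisher’s exact test computes the probability of observing a 2 × 2 table at least as extreme as the one observed, given fixed row and column totals. As an illustration (a plain-Python sketch using the hypergeometric distribution, not the software the authors used), the test can be applied to the first row of Table 5, where 40% of the 20 open-textbook faculty (8 of 20) versus none of the 24 non–open textbook faculty reported a $0 materials cost:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].
    Sums hypergeometric point probabilities of every table with the
    same margins whose probability is <= that of the observed table."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_point(x):
        # Probability of a table with x successes in row 1, margins fixed.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_point(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Tolerance guards against float ties when comparing probabilities.
    return sum(p_point(x) for x in range(lo, hi + 1)
               if p_point(x) <= p_obs * (1 + 1e-9))

# $0-cost row of Table 5: 8/20 open-textbook vs. 0/24 non-open faculty.
p = fisher_exact_two_sided(8, 12, 0, 24)  # p ~ 0.0007
```

Consistent with the double-asterisk marker on that row of Table 5, the resulting p value falls below the 0.005 threshold.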

    For faculty, the survey included 21 questions adapted from those presented in the Bliss et al. (2013a,b) studies. The survey questions can be viewed in Supplemental Figure 1. A total of 44 faculty responded to the survey. Twenty respondents answered “yes” to the question “Is your textbook an OER?,” with 17 using OpenStax Biology or Anatomy and Physiology. Twenty-four respondents answered “no” and used a variety of traditionally published textbooks. Survey items with significant differences in answers between OER and non-OER faculty are shown in Table 5, along with the proportions of faculty who selected each answer and the Fisher’s exact test results. The open-ended questions were optional and were not coded.

    TABLE 5. Faculty responses to survey questions with answers that had significant Fisher’s exact test results or notable trends

    Answer | Proportion of open-textbook faculty selecting this option | Proportion of non–open textbook faculty selecting this option
    Q8: For this semester, how much do students spend on required materials for this course?
     $0** | 0.400 | 0.000
     $1–49* | 0.450 | 0.136
     $50–99 | 0.100 | 0.273
     $100–200* | 0.050 | 0.591
    Q9: Why did you choose your current textbook? (Select all that apply)
     This textbook has an excellent accompanying online program** | 0.000 | 0.381
     Students learn well from this textbook** | 0.100 | 0.571
     It was cost-effective for students* | 0.900 | 0.429
     I did not choose the textbook as it was decided by the department | 0.150 | 0.381
    Q10: How would you rate the quality of the textbook used for this course?
     WORSE than the quality of texts in my other courses | 0.250 | 0.043
     BETTER than the quality of texts in my other courses* | 0.050 | 0.434
     About the SAME AS the quality of texts in my other courses | 0.700 | 0.522
    Q12: Rate each statement on how important textbook choice is in determining student success or learning (not including online programs).a
     It is essential for all students to succeed in the course | 0.600 | 0.773
     It is essential to earn an A in the course | 0.112 | 0.089
     It is necessary for students to succeed in certain topics | 0.800 | 0.955
     It is necessary only as a reference and review resource* | 0.500 | 0.182
     Students do not need any additional materials beyond my lectures | 0.150 | 0.130
    Q13: How do you expect students to utilize the textbook in this course? (Select all that apply)
     Students should read each chapter before or after class | 0.842 | 0.857
     Students should only read topics they struggle with | 0.421 | 0.143
     Students should use the book as a reference for figures and terms | 0.421 | 0.474
    Q14: Rate each statement on how often you think students actually use the textbook for this course throughout the semester.b
     Students never access or look at it | 0.444 | 0.238
     Students use it once or twice total* | 0.500 | 0.095
     Students use it about once a week | 0.895 | 0.762
     Students use it more than once a week* | 0.684 | 0.952

    aRatio is proportion who agreed or strongly agreed with the statement. The available options were strongly agree, agree, disagree, or strongly disagree.

    bRatio is proportion who responded “sometimes” or “often.” The available options were often, sometimes, seldom, or never.

    *Fisher’s exact test p value of <0.05.

    **Fisher’s exact test p value of <0.005.

    For students, 23 questions were adapted from the Bliss et al. (2013a,b) studies. The student survey questions can be viewed in Supplemental Figure 2. A total of 160 students responded to the survey, but only 146 answered the questions completely enough for analysis. All of the students who responded used the OpenStax textbook, so we were not able to compare answers between students who did and did not use OER. As expected at community and technical colleges, many students reported receiving Pell grants to fund their education (46%) and working more than 15 hours per week (86%).

    RESULTS AND DISCUSSION

    Open-Textbook Use Does Not Significantly Impact Student Outcomes

    The type of textbook used (open vs. non-open) was not significant (p > 0.2) for any student outcome in univariable models, including course percentage, mean test percentage, pass/fail on first exam, class withdrawal, received an “A,” and received a “C,” “D,” or “F” (Figure 1 and Table 2). All models controlled for the influence of instructor and course through the inclusion of random effects in the GLM. As each instructor was at a different college, this also controls for the influence of institution.

    FIGURE 1.

    FIGURE 1. Summary of the outcome data, showing that textbook type was not significant (p > 0.2) in univariable models of student outcomes. For the first two graph sets, the y-axis represents the overall average course and test percentages. For the other graph sets, the y-axis represents the percentage of students in that category. Error bars represent SE.

    Because book type was not significant for any outcome in univariable models, it was not included in the best-fit multivariable models. Other variables did have a predictive effect on outcomes: in both uni- and multivariable models, semester and class size significantly influenced student outcomes (Tables 3 and 4). For example, compared with Fall students, students in the Summer semester were 1.9 times more likely to get an “A,” and Spring students were ∼1.8 times more likely to get a “C,” “D,” or “F.” After controlling for the effects of other variables, including semester, students in larger classes (29–48 students) were more likely to fail the first exam and had lower course percentages. This suggests that a number of factors were associated with student success in our study, but textbook type was not one of them (Figure 1 and Table 2).
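The “times more likely” figures quoted above follow from exponentiating the logit-scale estimates in Table 3, which converts log odds ratios into odds ratios:

```python
from math import exp

# Logistic-regression coefficients are log odds ratios, so the effect
# sizes reported in the text come from exponentiating the estimates in
# Table 3 (reference category: Fall semester).
or_summer_a = exp(0.651)    # model A1: odds of an "A" in Summer vs. Fall
or_spring_cdf = exp(0.585)  # model B1: odds of a "C," "D," or "F" in Spring vs. Fall
```

Rounded, these give the odds ratios of ∼1.9 and ∼1.8 reported in the text.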

    These findings are similar to other work comparing open and non-open textbooks. Hilton et al. (2013) found test scores virtually unchanged in mathematics courses, and a summary of nine efficacy studies showed mostly nonsignificant differences when OER were used (Hilton, 2016). Additionally, two studies in psychology courses compared two different textbooks within the same course and found no significant differences in exam scores (Durwin and Sherman, 2008; Grisset and Huffman, 2018). Our results add to this body of research by focusing on student outcomes in a single community and technical college biology course.

    Faculty Surveys Suggest Differences in How Faculty Choose Textbooks

    Of the 44 respondents to the faculty survey, 92.5% were aware of OER, and 91% reported they would use proven OER materials. However, the 20 faculty who use an open textbook (referred to as “open-textbook faculty” in this study) cited significantly different reasons for choosing their textbook than the 24 faculty who do not use an open textbook (referred to as “non–open textbook faculty” in this study). As expected, 90% of open-textbook faculty chose their text because it was cost-effective, and only 10% chose it because “students learn well from it.” In contrast, 43% of non–open textbook faculty selected cost-effectiveness as a reason for their choice, while 57% selected “students learn well from it.” Together, these findings suggest that cost-effectiveness is a higher priority than perceived student learning for faculty who use an open textbook.

    Also, 38% of non–open textbook faculty chose their textbook because of the accompanying online program, while 0% of open-textbook faculty selected this reason. The attachment to an accompanying online program may be a potential barrier for faculty to switch to an open textbook. The details of these answers are shown in Table 5.

    These findings contrast with those of Allen and Seaman (2014). In their study of around 2100 faculty at associate’s and bachelor’s degree–granting institutions, only 2.7% of faculty (not divided by type of resource chosen or type of college) selected cost as a factor in choosing teaching resources, compared with 43% of non–open textbook faculty and 90% of open-textbook faculty in our survey. Also, in the Allen and Seaman study, 60% of faculty selected “proven efficacy” as a factor, similar to the percentage (57%) of our surveyed non–open textbook faculty who selected “students learn well from it.” Only 10% of open-textbook faculty selected that option, again suggesting that faculty using open resources may apply different criteria when selecting a textbook.

    Open Textbook Is Perceived as a Quality Resource by Students and Faculty

    Both students and faculty who use the open textbook generally view their textbook as similar or better in quality than texts used in other courses. Seventy-five percent (75%) of the 146 student respondents and 70% of the 20 open-textbook faculty found the open textbook to be of equal quality compared with texts used in other courses (Figure 2). One open-textbook faculty made the following additional comment in the survey: “It is pretty equal to a text like Campbell Biology, but it is SO much less expensive for students.” Given the similarities in perceived quality, quality should not be considered a major impediment to switching to an open textbook.

    FIGURE 2.

    FIGURE 2. Responses to the question “How does your textbook compare to textbooks used in other courses?” A single asterisk represents a Fisher’s exact p value of <0.05 comparing open-textbook to non–open textbook faculty. Error bars represent SE.

    These results are similar to previously published work. Hilton (2016) found that 50% or more of students had favorable impressions of OER across eight perception studies covering 2300 students. In addition, a 2013 survey of community college students at seven colleges revealed that 97% of students found OER to be equal or superior to traditional textbooks (Bliss et al., 2013a).

    One interesting aspect of our study is that we also asked faculty who use non-open textbooks their opinion of their non-open texts. Forty percent believe that their textbook is better in quality than texts used in other courses, while only 5% of faculty using the open text believe their textbook is better in quality (Table 5 and Figure 2). In the open-ended comments question, one non–open textbook faculty member said “[My published textbook] is a standard and essentially perfect textbook for the first year of biology.” This suggests that faculty who use a non-open book may be more passionate about their book, or perhaps perceive its value to be higher, as cost is one of many related facets of perceived value (Piehl, 1977; Sanchez-Fernandez and Iniesta-Bonillo, 2007).

    Faculty Expectations of Textbook Use Do Not Always Align with Student-Reported Use

    Survey results suggest that student textbook use does not always align with faculty expectations and perceptions of textbook use. For example, 29% of students report reading their texts’ chapters completely, compared with ∼85% of faculty expecting that students read the chapters (Figure 3). Failure to read the textbook has been observed in other studies. One study reported that only 27% of psychology students read the textbook before class (Clump et al., 2004). Another found that 24% of students believe their professor expects them to read 3 hours per week, even though only 8% of students actually read 3 hours per week (Berry et al., 2010).

    Sixty-five percent of students use the textbook for reference and for topics they struggle with. In this case, student use approximately aligns with faculty expectations, as 50% of faculty using OpenStax agree with the statement that “it is necessary only for reference and review.” It is interesting to note that only 18% of faculty using a published book agree with this statement (Table 5). Nevertheless, students do use the textbook, with 88% using it about once per week or more (Figure 4). That result may come as a surprise to many open-textbook faculty, 45% of whom think that students use the book only one or two times per semester (only 9% of students report using the text this rarely). Only 1% of faculty using a non-open book thought students use it this infrequently (Table 5).

    Our results suggest that open-textbook and non–open textbook faculty may hold different viewpoints on textbook use. For example, as stated earlier, open-textbook faculty are more likely to find the book necessary only for reference and review. We also see differences in other faculty expectations of textbook use, although not all are statistically significant (Table 5 and Figures 3 and 4). It is not clear whether this is due to the type of faculty who switch to OER or to the type of resource. It is possible that faculty present the book differently to their students. Berry et al. (2010) found that students often do not read the textbook, as they treat other resources a professor provides, such as study guides, videos, or online activities, as substitutes for it.


    FIGURE 3. Reported textbook use by students and faculty expectations of use. Students answered “Describe your current use of your textbook for this course” and could select one choice. Faculty answered “How do you expect students to utilize the textbook in this course?” and could select all options that applied. Although questions are not directly comparable for statistical analysis, the trend of answers still shows a difference between student use and faculty expectations of use. Error bars represent SE.


    FIGURE 4. Frequency of textbook use. Students answered “How often are you using the textbooks for this course?,” and faculty answered “How often do you think students actually use the textbook for this course throughout the semester?” For faculty responses, the ratio is the faculty who checked “sometimes” or “often.” A single asterisk represents a Fisher’s exact p value of <0.05 comparing open-textbook to non–open textbook faculty. Error bars represent SE.

    Limitations

    We did not survey students using published textbooks, so we do not know whether students using a published textbook would answer the questions differently than students using the open textbook. Surveying students who are not using OER would be useful to determine whether they have similar views and usage patterns with a traditional textbook. That being said, it can be assumed that the surveyed students had experience using published, non-open textbooks in other classes. Also, our student outcome data were not disaggregated by demographic information, as this was not permitted under the data-sharing agreement between the colleges. We also did not link survey responses to student outcomes, so we can only make generalizations when comparing the survey data with the outcome data. An important future direction is to investigate whether gains are made in any particular student demographic groups.

    In addition, faculty self-selected whether they would use an open-source or published textbook. Owing to this self-selection, we cannot say with certainty that faculty responses are broadly representative of what all faculty would experience were they to switch to OER or stay with non-open resources.

    This study examined student outcome data for only one OER source, OpenStax Biology, and one course, General Biology 1. We do not know whether we would see similar results in other biology courses or with other OER sources. Even so, OpenStax has produced approximately 50 open textbooks (Rice University), and if the others are similar in quality to the Biology text, the results may be similar. In addition, the OpenStax Biology textbook is widely used and readily accessible (Boyd, 2016), so having results for even this one textbook is still valuable.

    If students are not regularly using the textbook (22% report using it only for reference or not using it at all), they may not be informed enough to accurately judge its quality. One student even commented in the survey’s open-ended comments that the textbook is the same quality “because I don’t exactly use any of my textbooks.” It is also possible that some faculty using a non-open book have not recently compared their texts with other textbooks, so they may be making comparisons without sufficient knowledge to truly judge their texts against others.

    As the survey data were self-reported, some of the measures, such as textbook quality, are subject to individual student biases. However, self-reported perceptions are important determinants of faculty choice and students’ engagement with the textbook. Published literature suggests that when students perceive classroom materials and activities to be valuable, they engage more with the course and strive for deeper understanding (Struyven et al., 2008; Brazeal et al., 2016; Cavanagh et al., 2016).

    Summary

    We investigated questions related to implementation of open textbooks in general biology courses at four different community and technical colleges. This study also expanded research on biology OER to community and technical colleges, where many science students begin their studies.

    First, we did not see any significant differences in student outcomes when using an open textbook compared with using a published, non-open textbook (Figure 1). Given the substantial cost savings of open textbooks, this result supports their use over non-open textbooks. It is possible that student learning is relatively independent of textbook choice, OER or otherwise.

    Second, the majority of students and faculty in our study perceive the open textbook as equal in quality to other textbooks (Figure 2). In addition, open-textbook faculty were more likely to select cost-effectiveness, and less likely to select “students learn well from it,” as a reason the textbook was chosen, while the reverse held for faculty who use a traditionally published book (Table 5). As student outcomes did not change significantly when using the open book, we argue that students also “learn well” from the open textbook.

    Third, student textbook use and faculty expectations of textbook use do not always align. For example, most faculty expect students to read the textbook chapters, while only 29% of students report doing so (Figure 3). However, most students do use the textbook at least once a week, even if many faculty think students do not use it this frequently (Figure 4). These results suggest that having a textbook may still be important to students, but among the textbooks used in this study, the choice of textbook may not be a primary determinant of student learning.

    ACKNOWLEDGMENTS

    This work was funded in part through an Open Education Resource Grant through MinnState.

    REFERENCES

  • Allen, E., & Seaman, J. (2014). Opening the curriculum: Open educational resources in U.S. higher education, 2014. Babson Survey Research Group. Retrieved January 12, 2019, from www.onlinelearningsurvey.com/oer.html
  • Allen, G., Guzman-Alvarez, A., Molinaro, M., & Larsen, D. (2015, January). Assessing the impact and efficacy of the open-access ChemWiki Textbook Project (Educause Learning Initiative Brief). Retrieved February 18, 2018, from https://library.educause.edu/resources/2015/1/assessing-the-impact-and-efficacy-of-the-openaccess-chemwiki-textbook-project
  • Bates, D., Maechler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67, 1–48. Retrieved September 10, 2018, from https://cran.r-project.org/web/packages/lme4/lme4.pdf
  • Berry, T., Cook, L., Hill, N., & Stevens, K. (2010). An exploratory analysis of textbook usage and study habits: Misconceptions and barriers to success. Journal of College Teaching, 59(1), 13–29. https://doi.org/10.1080/87567555.2010.509376
  • Bissell, A. (2009). Permission granted: Open licensing for educational resources. Open Learning: The Journal of Open and Distance Learning, 24, 97–106. https://doi.org/10.1080/02680510802627886
  • Bliss, T., Hilton, J., Wiley, D., & Thanos, K. (2013a). The cost and quality of open textbooks: Perceptions of community college faculty and students. First Monday, 18(1). https://doi.org/10.5210/fm.v18i1.3972
  • Bliss, T., Robinson, T. J., Hilton, J., & Wiley, D. (2013b). An OER COUP: College teacher and student perceptions of Open Educational Resources. Journal of Interactive Media in Education, 2013(1), Art. 4. http://doi.org/10.5334/2013-04
  • Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2012). Interactive learning online at public universities: Evidence from randomized trials. Ithaka S+R. https://doi.org/10.18665/sr.22464
  • Boyd, J. (2016). OpenStax already saved students $39 million this academic year. Rice University News & Media. Retrieved October 9, 2018, from http://news.rice.edu/2016/01/20/openstax-already-saved-students-39-million-this-academic-year
  • Brazeal, K., Brown, T. L., & Couch, B. A. (2016). Characterizing student perceptions of and buy-in toward common formative assessment techniques. CBE—Life Sciences Education, 15, ar73. https://doi.org/10.1187/cbe.16-03-0133
  • Burnham, K. P., & Anderson, D. R. (2002). Model selection and multimodel inference: A practical information-theoretic approach. New York: Springer-Verlag.
  • Cavanagh, A. J., Aragon, O. R., Chen, X., Couch, B. A., Durham, M. F., Bobrownicki, A., ... & Graham, M. J. (2016). Student buy-in to active learning in a college science course. CBE—Life Sciences Education, 15, ar76. https://doi.org/10.1187/cbe.16-07-0212
  • Clump, M., Bauer, H., & Bradley, C. (2004). The extent to which psychology students read textbooks: A multiple class analysis of reading across the psychology curriculum. Journal of Instructional Psychology, 31(3), 227–232.
  • Colvard, N. B., Watson, C. E., & Park, H. (2018). The impact of open educational resources on various student success metrics. International Journal of Teaching and Learning in Higher Education, 30(2), 262–276.
  • D’Antoni, S. (2009). Open educational resources: Reviewing initiatives and issues. Open Learning: The Journal of Open and Distance Learning, 24, 3–10. https://doi.org/10.1080/02680510802625443
  • Donaldson, R. L., & Shen, E. (2016). 2016 Florida textbook & course materials survey. Tallahassee: Florida Virtual Campus. Retrieved November 11, 2018, from www.openaccesstextbooks.org/pdf/2016_Florida_Student_Textbook_Survey.pdf
  • Durwin, C. C., & Sherman, W. M. (2008). Does choice of college textbook make a difference in students’ comprehension? Journal of College Teaching, 56(1), 28–34. https://doi.org/10.3200/CTCH.56.1.28-34
  • Fischer, L., Hilton, J., III, Robinson, T. J., & Wiley, D. A. (2015). A multi-institutional study of the impact of open textbook adoption on the learning outcomes of post-secondary students. Journal of Computing in Higher Education, 27, 159–172. https://doi.org/10.1007/s12528-015-9101-x
  • Fisher, M. (2018). Evaluation of cost savings and perceptions of an open textbook in a community college science course. American Biology Teacher, 80(6), 410–415. https://doi.org/10.1525/abt.2018.80.6.410
  • Grisset, J. O., & Huffman, C. (2018). An open versus traditional psychology textbook: Student performance, perceptions, and use. Psychology Learning and Teaching, 18(1), 21–35. https://doi.org/10.1177/1475725718810181
  • Hewlett Foundation. (2018). Open Educational Resources. Retrieved September 10, 2018, from https://hewlett.org/strategy/open-educational-resources
  • Hilton, J. (2016). Open educational resources and college textbook choices: A review of research on efficacy and perceptions. Educational Technology Research and Development, 64(4), 573–590. https://doi.org/10.1007/s11423-016-9434-9
  • Hilton, J., Gaudet, D., Clark, P., Robinson, J., & Wiley, D. (2013). The adoption of open educational resources by one community college math department. International Review of Research in Open and Distance Learning, 14(4), 37–50. https://doi.org/10.19173/irrodl.v14i4.1523
  • Johnstone, S. M. (2005). Open educational resources serve the world. Educause Quarterly, 28(3), 15. Retrieved September 12, 2018, from https://er.educause.edu/articles/2005/1/open-educational-resources-serve-the-world
  • Ma, J., Baum, S., Pender, M., & Libassi, C. J. (2018). Trends in college pricing 2018. New York: College Board.
  • Mader, S., & Windelspecht, M. (2015). Essentials of biology (5th ed.). New York: McGraw Hill.
  • Magnusson, A., Skaug, H., Nielsen, A., Berg, C., Kristensen, K., Maechler, M., ... & Brooks, M. (2018). glmmTMB: Generalized linear mixed models using template model builder (R package Version 0.2.2.0). Retrieved December 13, 2018, from cran.r-project.org/web/packages/glmmTMB/glmmTMB.pdf
  • McDonald, J. H. (2014). Handbook of biological statistics (3rd ed., pp. 77–85). Baltimore, MD: Sparky House Publishing. Retrieved December 13, 2018, from www.biostathandbook.com/fishers.html
  • National Science Foundation, National Center for Science and Engineering Statistics. (2010). Characteristics of recent science and engineering graduates: 2010. Retrieved September 10, 2018, from http://ncsesdata.nsf.gov/recentgrads
  • National Student Clearinghouse Research Center. (2018). Snapshot report—Persistence and retention. Retrieved September 10, 2018, from https://nscresearchcenter.org/snapshotreport33-first-year-persistence-and-retention/
  • OpenStax. (2013). Biology. OpenStax CNX. Retrieved May 15, 2015, from http://cnx.org/contents/[email protected]
  • Paulsen, M. B., & St. John, E. P. (2002). Social class and college costs: Examining the financial nexus between college choice and persistence. Journal of Higher Education, 73(2), 189–236. https://doi.org/10.1080/00221546.2002.11777141
  • Pawlyshyn, N., Braddlee, D., Casper, L., & Miller, H. (2013). Adopting OER: A case study of cross-institutional collaboration and innovation. Educause Review. Retrieved September 10, 2018, from www.educause.edu/ero/article/adopting-oer-case-study-cross-institutional-collaboration-and-innovation
  • Piehl, J. (1977). The effects of price and range of potential readers upon evaluation of psychological textbooks. Journal of Psychology: Interdisciplinary and Applied, 97(2), 275–279. https://doi.org/10.1080/00223980.1977.9923974
  • Reece, J. B., Urry, L. A., Cain, M. L., Wasserman, S. A., Minorsky, P. V., Jackson, R., & Campbell, N. A. (2014). Campbell biology (10th ed.). Boston: Pearson.
  • Sanchez-Fernandez, R., & Iniesta-Bonillo, M. A. (2007). The concept of perceived value: A systematic review of the research. Marketing Theory, 7(4), 427–451. https://doi.org/10.1177/1470593107083165
  • Schinske, J. N., Balke, V. L., Bangera, M. G., Bonney, K. M., Brownell, S. E., Carter, R. S., … & Corwin, L. A. (2017). Broadening participation in biology education research: Engaging community college students and faculty. CBE—Life Sciences Education, 16(2), mr1. https://doi.org/10.1187/cbe.16-10-0289
  • Senack, E., & Donoghue, R. (2016). Covering the cost: Why we can no longer afford to ignore high textbook prices. Student PIRGs. Retrieved from www.studentpirgs.org/textbooks
  • Struyven, K., Dochy, F., & Janssens, S. (2008). Students’ likes and dislikes regarding student-activating and lecture-based educational settings: Consequences for students’ perceptions of the learning environment, student learning and performance. European Journal of Psychology of Education, 23, 295–317. https://doi.org/10.1007/BF03173001
  • Watson, C. E., Domizi, D. P., & Clouser, S. A. (2017). Student and faculty perceptions of OpenStax in high enrollment courses. International Review of Research in Open and Distributed Learning, 18(5). https://doi.org/10.19173/irrodl.v18i5.2462
  • Zuur, A. F. (2009). Mixed effects models and extensions in ecology with R. New York: Springer.