
An Evaluation of Two Hands-On Lab Styles for Plant Biodiversity in Undergraduate Biology

    Published Online: https://doi.org/10.1187/cbe.14-03-0062

    Abstract

    We compared learning cycle and expository formats for teaching about plant biodiversity in an inquiry-oriented university biology lab class (n = 465). Both formats had preparatory lab activities, a hands-on lab, and a postlab with reflection and argumentation. Learning was assessed with a lab report, a practical quiz in lab, and a multiple-choice exam in the concurrent lecture. Attitudes toward biology and treatments were also assessed. We used linear mixed-effect models to determine impacts of lab style on lower-order cognition (LO) and higher-order cognition (HO) based on Bloom's taxonomy. Relative to the expository treatment, the learning cycle treatment had a positive effect on HO and a negative effect on LO included in lab reports; a positive effect on transfer of LO from the lab report to the quiz; negative impacts on LO quiz performance and on attitudes toward the lab; and a higher degree of perceived difficulty. The learning cycle treatment had no influence on transfer of HO from lab report to quiz or exam; quiz performance on HO questions; exam performance on LO and HO questions; and attitudes toward biology as a science. The importance of LO as a foundation for HO relative to these lab styles is addressed.

    INTRODUCTION

    In the past decade, the recognition that science disciplines have individual nuances associated with pedagogy has given rise to a new field of research called discipline-based educational research (Singer et al., 2012). In undergraduate biology, biodiversity is a unique discipline-specific topic that has far-reaching importance due to the recent global loss of biodiversity (Thomas et al., 2004a,b), and because biodiversity education is recognized as a worldwide priority in the United Nations Decade of Education for Sustainable Development (2005–2014; Lindemann-Matthies et al., 2009). Biodiversity refers to variability within and between all living organisms, inclusive of all ecosystems (terrestrial, marine, freshwater, etc.), as well as within and between species (Lindemann-Matthies and Bose, 2008).

    At some large universities in the United States, such as the University of Colorado at Boulder, biodiversity is explored during first-year introductory biology. Such classes commonly have 1000+ students separated into a lecture/recitation led by a professor and a lab taught by a graduate student teaching assistant (GTA). The lecture/recitation is typically dominated by lecture and discussions designed to reinforce content cognition, whereas labs provide a hands-on experience with the diversity of living organisms through a combination of dissecting or analyzing living or preserved specimens, culturing, or microscopically analyzing representative characters or adaptations. Student observations provide a foundation for broad, integrated reasoning that, most importantly, relates to the overarching umbrellas of evolution and sustainability. Of key importance is that students develop integrated reasoning about the role of evolution in creating biodiversity, understand how adaptations are critical to ecosystem dynamics, and grasp how and why the balance in ecosystems can be disrupted. Through this evolutionary framework, students can analyze ecosystems to understand why they have difficulty dealing with environmental changes.

    Currently in the United States, commonplace undergraduate biodiversity labs are often called “marches through the phyla,” because they are mostly nonexperimental, teacher-centered endeavors that emphasize declarative knowledge (see Harris-Haller, 2008; Morgan and Carter, 2008; Vodopich and Moore, 2008; Addy and Longair, 2009; Scully and Fisher, 2009). Such lab designs may be falling short for higher-order (HO) learning goals in biodiversity based on Bloom's taxonomy (analysis and synthesis) and do not follow recent recommendations in science, technology, engineering, and mathematics (STEM) education to transform undergraduate science classes from pedagogy that emphasizes lower-order (LO; knowledge and comprehension) to pedagogy that emphasizes science reasoning, HO, and science-process skills (National Research Council, 2003; National Science Teachers Association, 2007; American Association for the Advancement of Science, 2010; National Academy of Sciences, 2010).

    Why do commonplace biodiversity labs take on a “march through the phyla” approach and emphasize declarative knowledge? Basey et al. (2014) contended that the shift from an emphasis on declarative knowledge to evolution-based HO learning in biodiversity labs is difficult to successfully achieve because of the potential for cognitive overload due to three interacting factors: the high quantity and novelty of required foundational declarative knowledge, the extensive interactivity of concepts, and the theoretical nature of evolution. Thus, innovative designs for biodiversity labs that emphasize evolution-based higher-order learning and tests of those designs are greatly needed. The three-phase learning cycle is a well-supported pedagogy that improves student reasoning (Lawson, 2001) and is potentially a good hands-on model for this shift in learning goals.

    The three-phase learning cycle starts with exploration through hands-on activities designed to engage students and get them to use inductive reasoning to derive explanatory hypotheses; this is followed by a teacher-centered concept-introduction phase; and the cycle ends with a concept-application phase in which students apply what they have learned using varying formats (Lawson, 2001). Most research examining various formats of the learning cycle has focused on the K–12 level, comparing a hands-off, traditional lecture experience with a hands-on learning cycle format (Balci et al., 2006; Dogru-Atay and Tekkaya, 2008; Yalcin and Bayrakceken, 2010; Yilmaz et al., 2011; Sadi and Cakiroglu, 2012). Research on the learning cycle in undergraduate college biology classes encompassing both lab and lecture is not as common. In an introductory biology lab/lecture for non–science majors, Lawson and Johnson (2002) showed that student achievement was greater in a semester-long, three-phase learning cycle format compared with a traditional class design; however, LO and HO were not separately assessed. In the same study (published separately), Johnson and Lawson (1998) showed that, although achievement was significantly greater in the learning cycle treatment, reasoning ability was the only covariate that explained a significant amount of variance in achievement scores for both lab styles. In a semester-long, community college cell biology course, Jensen and Lawson (2011) found that students in a three-phase learning cycle treatment outperformed students in a traditional class setting on HO, but no significant differences were seen for LO.

    One limitation of studies on the learning cycle is that the learning cycle classes were designed to address the learning goals, while the “traditional style” used the method already in place and may not have been equally optimized to address those goals. Traditional hands-on classes are a subset of a larger group of lab designs called expository, in which the teacher provides explicit information at the beginning and students verify it by following teacher-derived procedures for a hands-on lab (Domin, 1999). The expository format can vary from a traditional style typically associated with a “cookbook” lab format focusing primarily on LO (Jackman et al., 1987; Anders et al., 2003; Luckie et al., 2004) to an inquiry-oriented style emphasizing both LO and HO (Hohenshell and Hand, 2006; Basey and Francis, 2011). In introductory college chemistry, Domin (2007) found that growth in student HO and science reasoning occurred at different times for an expository versus a problem-based lab (another lab style designed to emphasize HO). In the problem-based lab, reasoning and HO growth occurred during the lab process, whereas in the expository lab, it occurred after the lab during the report write-up, which included reflection, argumentation, and induction (Domin, 2007). Domin (2007) found that learning outcomes were similar between the two formats when each treatment had a prelab assignment, hands-on lab manipulation, and a postlab assignment that included reflection and argumentation, and argued that research comparing learning from the expository style with other styles is not valid unless all three components are present.

    RESEARCH GOALS

    The purpose of this study was to compare student learning outcomes resulting from two styles of biodiversity labs at the undergraduate college level for science majors: 1) the three-phase learning cycle with a problem-based application phase; and 2) an enhanced expository style with back-loaded opportunities for reflection, argumentation, and induction. Both lab styles had a 3-wk preparatory lab experience, a hands-on focal lab experience, and a postlab write-up. In a concurrent study, Basey et al. (2014) demonstrated that both lab styles examined in this study significantly improved student learning of lecture-specific material at both the LO declarative knowledge level and evolution-based HO integration level with a margin of improvement ranging from 6.3 to 11%. Each lab was embedded within a yearlong inquiry-oriented lab curriculum, in which students had multiple opportunities for investigative inquiry using multiple formats, including a full-inquiry, student project lab experience.

    METHODS

    Location and Sample

    The study was conducted at the University of Colorado at Boulder during the Spring 2011 general biology lab (GBLII). GBLII was part of the yearlong introductory sequence for science majors and was run concurrently with a lecture covering similar content and topics, but it was also an independent, one-credit class that could be taken separately from the lecture. Overall, GBLII had ∼864 students, of whom 60% were freshmen, 30% were sophomores, 5% were juniors, and 5% were seniors. Each lab section had up to 18 students and was taught by one of 24 GTAs.

    Lab Curriculum

    The lab curriculum was a two-semester sequence that most students took in order (i.e., lab 1 and then lab 2). During the first semester, students were introduced to science with two explicit, experimental labs and then practiced inquiry in a series of experimental labs that progressed in the level and number of open-ended components, culminating in an open-ended, research-based, student project lab. In the second semester, students continued with inquiry-oriented experimental labs. Intermixed with the experimental labs were several nonexperimental, hands-on biodiversity labs in which students made firsthand observations of organisms through culturing, microscopic analyses, and dissections. Students were required to extend their observations to derive associated contentions and to explain the evidence from the lab and the reasoning that led to those contentions. During such labs, GTAs used mixed models of guidance with some didactic and some constructivist pedagogy.

    Treatments

    The plant biodiversity labs were divided into two treatments: expository and three-phase learning cycle. Progressing alphabetically through the GTA list, each GTA was randomly assigned to one of the two treatments until one treatment had 12 GTAs; the remaining GTAs were assigned to the other treatment. In the end, 12 GTAs taught in each treatment.

    On the first day of class (week 1), relevant components of the study were explained to students, and all students were allowed to choose whether or not to participate. Approximately 78% of the students chose to participate. Participating students filled out the online Colorado Learning Attitudes about Science Survey (CLASS; Semsar et al., 2011). Selection bias between participating and nonparticipating students was not measured. In week 2, students began preparatory lab activities that lasted 3 wk, in which they cultured and grew the C-Fern (Ceratopteris richardii) and investigated its life cycle. In week 5, students worked on a 2-h, 50-min hands-on plant biodiversity lab. In week 6, students took a practical assessment at the beginning of lab. In week 11, students took a multiple-choice assessment in the associated lecture. In week 15, students completed an attitude survey in lab and completed the CLASS once again.

    Both preparatory lab treatments were designed so that students spent approximately the same amount of time on them. Instead of lecturing during the preparatory lab activities, GTAs provided a handout and interacted with students in small groups. A comparison of the preparatory lab handouts for the two treatments is given in Table 1. In the focal lab following the preparatory lab activities, GTAs in the expository treatment began the class with a lecture/discussion about plant life cycles in general, highlighting similarities and differences as well as adaptations for arid habitats. In the learning cycle treatment, instead of hearing an introductory lecture by the GTA, students began class by using their observations from the preparatory lab activities to construct a diagram of a life cycle for the mystery organism. Following the introduction, students in both treatments were provided with handouts, which are compared in Table 2.

    Table 1. Comparison of the preparatory lab activity handouts given to students in the expository treatment versus the 3-phase learning cycle treatment

    | Expository | Learning cycle |
    | --- | --- |
    | Learning goals, including a list of terms. | Learning goals generalized toward science reasoning. |
    | A paragraph or two explaining terminology, structures, and purposes of the observations and techniques to follow that day. | An introductory overview of the total prelab, which informed students that the purpose of the prelab was to act like scientists doing discovery science and to use observations to figure out and document important features of a mystery organism. |
    | Step-by-step instructions with a diagram outlining the techniques for a hands-on procedure that day. | Step-by-step instructions with a diagram outlining the techniques for a hands-on procedure that day. |
    | An explicit statement outlining what observations to make and how to document them in the lab report. | Space in the handout for diagrams and notes. |
    | For example: “Use a compound microscope to examine the germinating spores under the slide. Draw a germinating spore and label the spore coat, rhizoids and developing gametophyte.” | Students chose what observations to document and how to document them. |

    In both treatments GTAs did not lecture, but had small group interactions with the students.

    Table 2. Components of the focal lab handouts given to students in the expository treatment versus the learning cycle treatment

    | Category | Expository | Learning cycle |
    | --- | --- | --- |
    | Handout similarities | Learning goals | Learning goals |
    |  | Specific information about each division of plants studied. | Specific information about each division of plants studied. |
    | Handout differences | Step-by-step procedures and techniques for hands-on observations. | An introductory sentence or two describing organisms in the division and the techniques and specimens that were available for students to use for observations. |
    |  | An explicit statement outlining visualization with required labels. For example: “Look at the living specimen of the moss under the dissecting microscope. Draw a gametophyte with an attached sporophyte and label the capsule at the tip of the sporophyte.” |  |
    |  | LO foundational questions associated with the observations. |  |
    |  | Additional HO integrative reasoning questions. For example: “Pick one theme and utilize specific evidence from examples in this lab to defend the hypothesis—Life originated in aquatic environments and then radiated to terrestrial habitats.” |  |
    | Write-up | A description of the required lab report: “The document will consist of text, images, and images integrated with diagrams.” Numbered directions outlining questions to be answered and images with labels followed the statement. | A description of the required lab report: “You will individually create a text document including drawings/photos that can educate a person about plant life cycles and provides a test of both of the following hypotheses: 1) life originated in aquatic environments and then radiated to terrestrial habitats, and 2) evolution through natural selection with adaptive radiation is an overarching theoretical framework that explains the current diversity of living organisms.” |

    GTA Preparation and Evaluation of Teaching Styles

    In the first semester of the two-semester lab sequence, GTAs met weekly for 2 h to prepare for the inquiry-oriented, experimental labs. Over the first 7 wk, GTAs were exposed to several different lab styles and were coached to use more of a constructivist approach each week, culminating in a full constructivist approach during the student projects lab. During the second semester, in the week before they taught the prelabs and the focal lab, GTAs were provided with a hands-on workshop covering how to teach in each treatment.

    GTA teaching style in each treatment was verified with classroom observations. During the first 2 wk of preparatory lab activities, J.M.B. and A.P.M. designed and modified a simple quantified checklist (Supplemental Material). To ensure observer consistency, the two researchers conducted several observation sessions together to calibrate the checklist, and throughout the week of the focal lab, they continuously compared notes to ensure consistency. The checklist had three major categories: lab format, content instruction, and teacher–student interactions. Each category was averaged to produce a treatment consistency score on a 3-point scale. GTAs who scored above a 2 for the treatment they were teaching were retained in the study, whereas GTAs who scored below a 2 were removed from the study. Of the 24 participating GTAs, 3 were removed based on these criteria. Two additional GTAs were removed due to problems associated with assessments.

    Assessment

    The overall assessment was made up of a formative assessment (a student lab report), a formative/summative assessment (a practical quiz in the lab the week following the plant biodiversity lab), and a summative multiple-choice exam in the concurrent lecture 6 wk after the plant biodiversity lab. Because a high proportion of the students were taking lab and lecture concurrently and plant biodiversity was a topic in the lecture, the plant biodiversity lab and practical quiz were timed so that they occurred before any coverage of plant biodiversity in the concurrent lecture.

    Formative Assessment: Lab Report

    Students in both treatments produced a postlab write-up (lab report) and handed it in the week following the plant biodiversity lab. Because the learning cycle lab reports were open-ended, content in each lab report from both treatments was quantified and categorized with a comprehensive checklist that included any content a student could have included in the lab report (Figure 1).

    The checklist was divided into “knowledge and comprehension” (LO) and “analysis and synthesis” (HO) using the Blooming Biology Tool (Crowe et al., 2008). A point was assigned for each correctly used content category in the lab report that was on the checklist. The final LO and HO score for each lab report was the total number of correctly used content categories (1 point per box). We also scored lab reports based only on the subset of content included that related directly to the quiz (shaded boxes in Figure 1).
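    A minimal sketch of this tallying step is shown below, assuming each report is coded as a named logical vector over the checklist categories; the category names and data structure are illustrative, not taken from the study's checklist.

    ```r
    # Sketch of the checklist tallying, assuming each lab report is coded as a
    # named logical (TRUE/FALSE) vector over the checklist categories.
    score_report <- function(report, lo_categories, ho_categories) {
      c(LO = sum(report[lo_categories]),   # 1 point per correctly used LO box
        HO = sum(report[ho_categories]))   # 1 point per correctly used HO box
    }

    # Hypothetical report with three LO boxes and one HO box used correctly
    report <- c(moss_gametophyte = TRUE, fern_sporophyte = TRUE, seed_structures = TRUE,
                aquatic_origin_argument = TRUE, adaptive_radiation_argument = FALSE)
    score_report(report,
                 lo_categories = c("moss_gametophyte", "fern_sporophyte", "seed_structures"),
                 ho_categories = c("aquatic_origin_argument", "adaptive_radiation_argument"))
    #> LO HO
    #>  3  1
    ```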

    Because grading improved with practice before stabilizing, grading consistency was checked after the first 150 lab reports had been assessed. Eighteen previously assessed lab reports were randomly chosen and regraded. A paired t test indicated that knowledge/comprehension scores were significantly higher in the regrade (t = −3.19, df = 17, two-tailed P = 0.005), whereas analysis/synthesis scores were not significantly different (t = 0.61, df = 17, two-tailed P = 0.55). The first 150 lab reports were therefore reassessed to ensure that all lab reports were held to the same standard throughout the assessment process.
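    The consistency check is a standard paired comparison; the sketch below uses placeholder scores for the 18 regraded reports rather than the study data.

    ```r
    # Paired t test comparing regraded vs. original knowledge/comprehension scores
    # for the same 18 lab reports (placeholder values, not the study data).
    set.seed(1)
    original_lo <- round(runif(18, min = 5, max = 15))   # original grading
    regrade_lo  <- original_lo + rpois(18, lambda = 1)   # regrade slightly higher on average
    t.test(regrade_lo, original_lo, paired = TRUE)
    ```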

    Formative/Summative Assessment: Practical Quiz

    The practical quiz given 1 wk following the plant biodiversity lab was composed of five stations. Three stations dealt with LO foundational information used as building blocks for HO processing, and two stations dealt with HO integration built upon the LO foundation. Questions were categorized by level of learning utilizing the Blooming Biology Tool (Crowe et al., 2008). The three LO questions had two parts: visual identification (knowledge) and relating the visual identification to various aspects of plant life cycles (comprehension). One visual question had microscope slides next to a microscope, and students had to use the microscope to identify the specimen. The other two visual questions had a microscope image displayed on a computer screen. The two HO questions were a synthesis question and an analysis question (Crowe et al., 2008). For the analysis question, students were provided a data table and asked whether the data were consistent with evolution through natural selection and to explain their answers. For the synthesis question, students were provided specimens from one of the four plant divisions examined. Students were asked in a multiple-choice question which of four animal phyla was the equivalent in terms of adaptations to terrestrial life and to explain their answers. (Students had a lab exploring animal diversity before the plant diversity lab.)

    Figure 1. The checklist used to quantify content included in student lab reports. Shaded boxes indicate content directly associated with quiz questions (the key is in the top left of the table).

    Table 3. Model-averaged coefficient estimates for all variables present in models with strong support (ΔAICc < 2) from linear mixed-effect models (teaching assistant was treated as a random effect)ᵃ

    Does lab style influence the lab reports produced by students?

    | Response variable | Explanatory variable | Coefficient estimate | SE | Lower CI | Upper CI | Relative importance |
    | --- | --- | --- | --- | --- | --- | --- |
    | Lab report LO | Style-LC | **−0.079** | 0.018 | −0.114 | −0.044 | 1.00 |
    | Lab report HO | Style-LC | **0.022ᵇ** | 0.009 | 0.004 | 0.040 | 1.00 |

    Is the transfer/retention of information from lab report to quiz for LO and HO different for the different lab styles?

    | Response variable | Explanatory variable | Coefficient estimate | SE | Lower CI | Upper CI | Relative importance |
    | --- | --- | --- | --- | --- | --- | --- |
    | Lab report LO | Style-LC | **5.699ᵇ** | 2.140 | 1.505 | 9.893 | 1.00 |

    Does the combination of classroom activities and the lab report influence learning by students in the different treatments as assessed by the quiz and exam?

    | Response variable | Explanatory variable | Coefficient estimate | SE | Lower CI | Upper CI | Relative importance |
    | --- | --- | --- | --- | --- | --- | --- |
    | Quiz LO | Style-LC | **−0.091ᵇ** | 0.029 | −0.146 | −0.035 | 1.00 |
    | Quiz LO | LR-HO | **0.109ᵇ** | 0.026 | 0.058 | 0.160 | 1.00 |
    | Quiz LO | LR-LO | **0.118ᵇ** | 0.027 | 0.065 | 0.170 | 1.00 |
    | Quiz HO | Style-LC | 0.022ᵇ | 0.025 | −0.027 | 0.070 | 0.34 |
    | Quiz HO | LR-HO | **0.091ᵇ** | 0.023 | 0.046 | 0.136 | 1.00 |
    | Quiz HO | LR-LO | **0.059ᵇ** | 0.024 | 0.013 | 0.105 | 1.00 |
    | Exam LO | Style-LC | −0.015ᵇ | 0.021 | −0.056 | 0.026 | 0.32 |
    | Exam LO | LR-HO | **0.057ᵇ** | 0.020 | 0.017 | 0.096 | 1.00 |
    | Exam LO | LR-LO | **0.045ᵇ** | 0.021 | 0.005 | 0.086 | 1.00 |
    | Exam HO | Style-LC | 0.030ᵇ | 0.032 | −0.034 | 0.093 | 0.27 |
    | Exam HO | LR-HO | **0.113ᵇ** | 0.032 | 0.049 | 0.176 | 1.00 |
    | Exam HO | LR-LO | −0.027ᵇ | 0.033 | −0.092 | 0.038 | 0.24 |

    When more than one model had strong support (ΔAICc < 2), we used Akaike weights to calculate model-averaged variable coefficients, unconditional standard errors (SE), and 95% confidence intervals (95% CIs; lower = LCI, upper = UCI). Akaike weights were also used to weigh the evidence of importance for each variable included in supported models (ΔAICc < 2.00). There was little evidence for the effect when the 95% CIs included or overlapped zero. A negative effect size indicates a negative effect. Style-LC = learning cycle (relative to expository), LR-HO = lab report–higher-order cognition, LR-LO = lab report–lower-order cognition.

    ᵃBold denotes parameters with strong effects because the 95% CI does not overlap zero.

    ᵇEffect sizes have been standardized on two SD following Gelman (2008).

    To deter the sharing of answers by students, we used two paired versions of the quiz and divided them randomly between treatments. Each quiz question corresponded to a question on the other version of the quiz. Both versions of the quiz had the same analysis question. One matched LO foundational question (computer image) was excluded from the analysis because student performance across treatments indicated the paired questions were different in level of difficulty. The remaining four questions—two LO and two HO—were not significantly different between versions (pair 1: two-tailed P = 0.706; pair 2: P = 0.186; pair 3: P = 0.499; pair 4: P = 0.260; pair 5: P = 0.359).

    Because learning cycle lab reports were open-ended, students in the learning cycle treatment were less likely than students in the expository treatment to address material in the lab report that was directly related to the quiz. Thus, to compare relative transfer of learning from the lab report to the quiz, components of lab reports that directly related to quiz questions were tallied (see Figure 1). Relative transfer was calculated from the following quantities: Qscore, the student's quiz score (%); Bchecked, the number of quiz-related boxes checked in the lab report checklist (shaded boxes in Figure 1); and Btotal, the total possible number of quiz-related boxes that could be checked in the lab report checklist. (Note: lab report questions were different from quiz questions, so at best only a relative comparison could be made.) Positive scores indicated that students demonstrated greater understanding of LO and/or HO on the quiz than on the lab report, while negative scores indicated that students demonstrated less understanding on the quiz than on the lab report.
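    The published equation is not reproduced in this version of the article; a plausible form consistent with the variable definitions and the sign interpretation above is the simple difference between the quiz percentage and the percentage of quiz-related boxes addressed in the report, sketched here as an assumption.

    ```r
    # Assumed form of relative transfer: quiz percentage minus the percentage of
    # quiz-related checklist boxes addressed in the lab report (not the authors'
    # published equation, which is unavailable here).
    relative_transfer <- function(q_score, b_checked, b_total) {
      q_score - 100 * (b_checked / b_total)
    }

    # Hypothetical student: 70% on the quiz, 4 of 8 quiz-related boxes in the report
    relative_transfer(q_score = 70, b_checked = 4, b_total = 8)
    #> [1] 20
    ```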

    Summative Assessment: Multiple-Choice Exam

    Six weeks after the plant diversity lab, students in lecture were given a multiple-choice exam. Ten multiple-choice exam questions related to plant biodiversity were written at the knowledge, comprehension, and analysis levels of Bloom's taxonomy according to Crowe et al. (2008). The lecture professor (B.M.), who had no knowledge of the lab treatments, chose several exam questions from each level of learning to include on the exam (10 knowledge, 6 comprehension, and 5 analysis). To verify that the multiple-choice questions were correctly categorized into Bloom's levels, two independent experts reviewed the questions and independently assigned them to Bloom's levels. For each reviewer, a quadratic weighted kappa (Cohen, 1968) was estimated relative to the independent classifications by J.M.B. The quadratic weighted kappa values were high (reviewer 1: Kqw = 0.863, SE = 0.035; reviewer 2: Kqw = 0.870, SE = 0.006).
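    For reference, quadratic weighted kappa can be computed directly from the two raters' Bloom-level assignments; the function and ratings below are an illustrative sketch, not the study's data.

    ```r
    # Cohen's quadratic weighted kappa for two raters assigning each exam question
    # to one of k ordered Bloom levels (1 = knowledge, 2 = comprehension, 3 = analysis).
    quadratic_weighted_kappa <- function(r1, r2, k = max(c(r1, r2))) {
      obs  <- table(factor(r1, levels = 1:k), factor(r2, levels = 1:k)) / length(r1)
      expd <- outer(rowSums(obs), colSums(obs))                     # chance agreement
      d    <- outer(1:k, 1:k, function(i, j) (i - j)^2) / (k - 1)^2 # quadratic disagreement weights
      1 - sum(d * obs) / sum(d * expd)
    }

    # Hypothetical classifications of 10 questions by J.M.B. (r1) and a reviewer (r2)
    r1 <- c(1, 1, 1, 2, 2, 2, 3, 3, 3, 1)
    r2 <- c(1, 1, 2, 2, 2, 2, 3, 3, 2, 1)
    quadratic_weighted_kappa(r1, r2)
    ```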

    The CLASS

    The CLASS uses a pre/post format and examines students’ beliefs about biology and how those beliefs are influenced by classroom instruction. The premise is that the attitudes and beliefs of experts in biology differ from those of novices, and pedagogy that promotes expert-like attitudes and beliefs is sought. The CLASS has been well validated and is widely used (Barbera et al., 2008).

    Survey of Attitudes toward Specific Labs

    We estimated student attitudes toward specific labs with a validated survey (see Basey et al., 2008). In the survey, students were asked to rate the labs on a scale of 1–10. In addition to giving an overall lab rating, students were asked to rate each lab based on how easy they thought the lab was to master (ease of lab), level of excitement, time efficiency, and how much the lab helped with lecture. We included these four measures in the survey, because past research on lab style has indicated that these four parameters are important factors that influence student attitudes toward different styles of biology labs (Basey et al., 2008; Basey and Francis, 2011).

    ANALYSES

    We used linear mixed-effect models to determine the effect of lab style (expository vs. learning cycle) on the response variables: LO and HO scores on the lab reports, the plant diversity quiz in lab (quiz), the plant diversity multiple-choice exam in lecture (exam), and CLASS scores. We treated lab style as a fixed effect and GTA as a random effect. Because student attitude assessments were anonymous, we could not use GTA as a random effect in models that included them. Therefore, for models in which we sought to determine how lab style influenced attitudes and how attitudes influenced overall lab scores, we used linear regression. Where appropriate, response variable scores were arcsine square root–transformed to meet assumptions of normality and homogeneity of variance before analysis. Where necessary, we standardized parameters that were on different scales (i.e., when lab report scores were included in models with lab style) by centering predictor variables on a mean of zero and scaling them to an SD of 0.5, which places continuous and binary variables on a common scale for direct comparison (Gelman, 2008). All analyses were performed with the lme4 package in R (R Development Core Team, 2012).
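    A minimal sketch of this model structure in lme4 is shown below, fit to a small simulated data set; the column names (style, gta, lr_lo, lr_ho, quiz_lo) are illustrative, and models are fit by maximum likelihood so that fixed-effect structures can be compared with AICc.

    ```r
    library(lme4)

    # Simulated stand-in for the study data (hypothetical values and column names)
    set.seed(42)
    d <- data.frame(
      gta     = factor(rep(1:12, each = 15)),                              # teaching assistant
      style   = factor(rep(c("expository", "learning_cycle"), each = 90)), # treatment
      lr_lo   = runif(180),   # lab report LO score (proportion)
      lr_ho   = runif(180),   # lab report HO score (proportion)
      quiz_lo = runif(180)    # quiz LO score (proportion)
    )

    std2 <- function(x) (x - mean(x)) / (2 * sd(x))   # Gelman (2008): center and scale by 2 SD
    d$quiz_lo_t <- asin(sqrt(d$quiz_lo))              # arcsine square-root transform

    # Lab style as a fixed effect, GTA as a random intercept; ML fit for AICc comparison
    m1 <- lmer(quiz_lo_t ~ style + std2(lr_lo) + std2(lr_ho) + (1 | gta), data = d, REML = FALSE)
    m0 <- lmer(quiz_lo_t ~ 1 + (1 | gta), data = d, REML = FALSE)          # null model
    summary(m1)
    ```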

    For model selection, we used an information-theoretic approach to evaluate support for competing candidate models (Burnham and Anderson, 2002), with Akaike's information criterion corrected for small sample sizes (AICc) used to determine the most parsimonious model that best explained each response variable. We ranked models based on differences in AICc scores (ΔAICc; full model-selection results are available in the Supplemental Material).
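    Continuing the sketch above, AICc, ΔAICc, and Akaike weights can be computed by hand for the candidate models (m0 and m1 from the previous block); the helper below is an illustration, not the study's code.

    ```r
    # AICc with small-sample correction, plus delta-AICc and Akaike weights
    aicc <- function(model) {
      k <- attr(logLik(model), "df")   # number of estimated parameters
      n <- nobs(model)
      AIC(model) + (2 * k * (k + 1)) / (n - k - 1)
    }

    scores  <- c(null = aicc(m0), full = aicc(m1))
    delta   <- scores - min(scores)                    # delta-AICc relative to the best model
    weights <- exp(-delta / 2) / sum(exp(-delta / 2))  # Akaike weights
    round(cbind(AICc = scores, dAICc = delta, weight = weights), 3)
    ```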

    RESULTS

    Does Lab Style Influence the Lab Reports Produced by Students?

    Student lab reports were influenced by lab style. Specifically, the learning cycle format (relative to expository treatment) had an independent, negative influence on LO in the lab report (Table 3 and Figure 2). In contrast, the learning cycle format had a small positive effect on HO in the lab report relative to the expository format (Table 3 and Figure 2). See Supplemental Material for results of the complete analysis.

    Figure 2. Mean student scores with one standard error for LO and HO cognition in relation to lab style on lab reports, the practical quiz, and relative transfer from the lab report to the quiz. Lab report scores were converted from the total number of boxes checked in Figure 1 to a percentage normalized by placing the highest student score as the maximum. Quiz scores were adjusted by placing the highest student score as the maximum.
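    The normalization described in the caption amounts to scaling raw box counts so that the highest student score becomes 100%; a tiny illustration with hypothetical raw scores:

    ```r
    # Hypothetical raw box counts for four students, normalized so the highest
    # score becomes 100%
    raw <- c(12, 9, 15, 7)
    round(100 * raw / max(raw), 1)
    #> [1]  80.0  60.0 100.0  46.7
    ```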

    Is the Transfer/Retention of Information from Lab Report to Quiz for LO and HO Different for the Different Lab Styles?

    Results indicate that lab style did not influence the relative transfer of HO from the lab report to the quiz (i.e., the null received more support than the model including lab style [ΔAICc = 2.01]). However, the learning cycle format (relative to expository format) had a strong positive effect on the relative transfer of LO from the lab report to the quiz (i.e., the effect size was very high; Table 3 and Figure 2).

    Does the Combination of Classroom Activities and the Lab Report Influence Learning by Students in the Different Treatments as Assessed by the Quiz and Exam?

    Models including lab style and both LO and HO lab report scores received substantially more support than null models in explaining LO and HO quiz scores (ΔAICc = 63.56 and 29.63, respectively) and LO and HO exam scores (ΔAICc = 15.69 and 10.41, respectively). However, there were differences in the relative influences of these variables on quiz and exam scores. For example, the learning cycle style had a negative effect on quiz LO scores relative to the expository style, but did not influence HO quiz scores or any exam scores (Table 3 and Figure 2). As might be expected, LO and HO lab report scores had positive effects on LO and HO exam and quiz scores, the only exception being that LO lab report scores did not have an effect on HO exam scores. Aside from this exception, LO and HO performance on lab reports predicted both LO and HO performance on quizzes and exams (Table 3).

    What Were Students’ Attitudes toward the Different Treatments?

    To assess whether the student groups in each treatment were different in their attitudes toward specific labs, we had students indicate their attitudes toward two control labs in addition to the plant biodiversity lab. We found no evidence that student attitudes toward control labs differed between treatments. In this case, the null (ΔAICc = 0.00) had better support from the data than the model including lab style (ΔAICc = 2.01). For the plant biodiversity lab in which students were subjected to different treatments, the learning cycle treatment appeared to have negative rather than positive effects on student attitudes toward the lab. For example, relative to expository, the learning cycle format had a strong negative effect on the overall rating and ease of the lab rating (i.e., they thought the learning cycle format was more difficult) but not on any other variables. Overall rating was positively influenced by low difficulty, high excitement, more help with the lecture part of the class, and high time efficiency (Table 4).

    Table 4. Model-averaged coefficient estimates for all variables present in models with strong support (ΔAICc < 2) from linear regression models

    How do different treatments influence students’ attitudes?

    | Response variable | Explanatory variable | Coefficient estimate | SE | Lower CI | Upper CI | Relative importance |
    | --- | --- | --- | --- | --- | --- | --- |
    | Overall lab rating | Style-LC | **−0.467** | 0.219 | −0.896 | −0.038 | 1.00 |
    | Excitement | Style-LC | −0.356 | 0.214 | −0.775 | 0.063 | 0.59 |
    | Help | Style-LC | −0.428 | 0.242 | −0.902 | 0.046 | 0.64 |
    | Time | Style-LC | −0.278 | 0.229 | −0.727 | 0.171 | 0.43 |
    | Ease | Style-LC | **−0.486** | 0.214 | −0.906 | −0.066 | 1.00 |

    Which attitude variables influence overall lab score?

    | Response variable | Explanatory variable | Coefficient estimate | SE | Lower CI | Upper CI | Relative importance |
    | --- | --- | --- | --- | --- | --- | --- |
    | Overall lab rating | Time | **0.399** | 0.036 | 0.328 | 0.470 | 1.00 |
    | Overall lab rating | Ease | **0.225** | 0.038 | 0.150 | 0.301 | 1.00 |
    | Overall lab rating | Help | **0.122** | 0.033 | 0.057 | 0.188 | 1.00 |
    | Overall lab rating | Excitement | **0.239** | 0.038 | 0.166 | 0.313 | 1.00 |

    When more than one model had strong support (ΔAICc < 2), we used Akaike weights to calculate model-averaged variable coefficients, unconditional standard errors (SE), and 95% confidence intervals (95% CIs, lower = LCI, upper = UCI). Akaike weights were also used to weigh the evidence of importance for each variable included in supported models (ΔAICc < 2.00). There was little evidence for the effect when the 95% CIs included or overlapped zero. A negative effect size indicates a negative effect. Style-LC = learning cycle (relative to expository), LR-HO = lab report–higher-order cognition, LR-LO = lab report–lower-order cognition. Bold denotes parameters with strong effects because the 95% CI does not overlap zero.

    Did Any of the Treatments Cause the Students to Think More Like Experts?

    The premise of the CLASS is that experts in biology have different beliefs than novices and that instructional techniques that move students toward more expert-like views are considered better. The CLASS was given as a pre/post assessment. There was no support for an influence of the learning cycle treatment (relative to the expository treatment) on attitude shifts in either the favorable (more like experts) or unfavorable (less like experts) direction (i.e., null models had ΔAICc scores < 2.00, and the 95% confidence interval (CI) for the model-averaged coefficient estimates for lab style overlapped zero in both cases).

    DISCUSSION

    In this study, we compared two hands-on plant biodiversity labs that were both designed to emphasize the same learning goals (LO and HO) for undergraduate college science majors. Most studies that have examined lab-like, hands-on learning in college have compared an outdated traditional style (i.e., what is currently in practice) with an alternative inquiry-oriented design (Suits, 2004; Lord and Orkwiszewski, 2006; Pook et al., 2007; Brickman et al., 2009). Furthermore, studies examining the learning cycle in college-level biology classes have compared student learning with a traditional style that was not specifically designed for the learning goals examined. In this study, the design did not favor either treatment. The GTAs were randomly assigned to a treatment and were removed from the analysis if they did not properly implement the treatment. For cognitive assessments in which GTA identity was known, we factored out the GTA effect in the statistical analysis. Thus, we were able to compare the efficacy of the expository style and three-phase learning cycle style for meeting biodiversity learning goals in the context of large introductory lab classes with multiple sections taught by new GTAs.

    Because we could not find a verified plant biodiversity practical assessment for the desired learning goals in a lab setting, we developed the two cognitive assessments for this study (quiz and exam), and the postassessment design for the cognitive assessments was appropriate for comparing student learning resulting from the two treatments. The attitude assessments were both previously verified and well supported. Other research on the learning cycle has used a generalized measure of reasoning ability to assess reasoning gains (Johnson and Lawson, 1998; Lawson and Johnson, 2002; Brickman et al., 2009). Given that our ultimate goal was biodiversity education, we evaluated reasoning in the context of higher-order reasoning about plant biodiversity content based on Bloom's taxonomy.

    The format of the lab report was a key difference between the treatments. The lab report for the expository format was explicitly prompted and thus focused students on specific content related to the learning goals. In contrast, the lab report prompt for the learning cycle format was open-ended and potentially allowed students to incorporate a wider range of content related to the learning goals. Theoretically, this difference is important for student learning. With the open-ended lab reports, students had to construct their own meaning and create a document that included vital evidence to support contentions; they therefore needed to evaluate more of the information from the lab to decide what to include in the report. In contrast, students in the expository style could focus on answering the specific questions from the report. Hence, we expected transfer from the formative lab report to be greater for the learning cycle than for the expository style.

    The study addressed two questions: 1) How does lab report content (quality and thoroughness) compare between treatments? 2) How does content included by students in the formative lab reports transfer to learning as assessed with the summative quiz? Student lab reports were influenced by lab style. Specifically, the open-ended lab reports had less LO content and more HO reasoning than the prompted lab reports (Table 3 and Figure 2). Lab reports from the expository classrooms followed the format and organization of the teacher's prompts, while lab reports from learning cycle classrooms varied greatly. Learning cycle reports were more likely to include full life-cycle diagrams for each phylum studied in lab, while prompted (expository) reports provided individual, labeled pictures to answer the prompt. While the lack of teacher-given guidelines gave learning cycle students the possibility of including more extensive content, it also meant that students did not have organized and layered foundational information to build upon. Although LO was lower in the learning cycle treatment as assessed by the quiz, there was no difference in HO between the two lab styles (Table 3 and Figure 2). Because the quiz tested a subset of the information that could be included in the lab report, we quantified items on the quiz that were specifically included in each individual student's lab report and cross-referenced student quiz performance on those items (relative transfer). For each fraction of LO quiz content addressed by a student in the lab report, a learning cycle student performed relatively better on the associated quiz question than did an expository student (Table 3 and Figure 2). This may reflect a learning cycle student, relative to an expository student, recognizing that an LO item was important enough to include in the open-ended lab report; alternatively, a learning cycle student may have elected not to include specific LO information in the lab report that he/she nonetheless knew and comprehended. No differences were seen between the two styles in transfer/retention of HO from the lab report to the quiz (Table 3 and Figure 2).

    For the practical quiz, LO questions assessed a student's abilities to: use microscopes to make observations, use observations to identify specimens and/or parts of specimens, and comprehend material. The quiz was given in the lab before any coverage of plant biodiversity in lecture and thus assessed student learning primarily from the lab and report experience. Students in the learning cycle format did not perform as well on the LO quiz questions as students in the expository format (Table 3 and Figure 2). Logically, then, something about the learning cycle format did not prepare students as well on LO foundational information as the expository format. There are two likely explanations. The first is that the lack of guidance in the open-ended lab report did not focus students on the LO foundational information associated with desired learning goals. The second may be due to the design of the preparatory lab experience.

    For the learning cycle treatment, the initial engagement phase is critical to motivate students (Karplus, 1977). In the learning cycle format used here, the initial engagement phase may have fallen short relative to the expository format. The engagement in the preparatory lab activities was based on the excitement associated with identifying a “mystery organism.” Theoretically, after students acted like research biologists, made their observations, and derived the life cycle of the “mystery organism,” we expected extra engagement during the explicit instruction phase and a potential “favorable” shift in their CLASS scores, but no such shift was present. (It is also possible that, because the yearlong curriculum was filled with inquiry-oriented experiences, the effect of a single lab experience on the CLASS was lessened.) Observations (not quantified) indicated that while some students were excited about solving the mystery, many students were excited about the observations themselves. Further evidence stems from the attitude assessment, which indicated that students did not differ in their rating of excitement between lab styles but rated the expository style as easier than the learning cycle style. While students in the learning cycle treatment were discovering, students in the expository treatment were learning terminology associated with plant life cycles; thus, students in the expository treatment were exposed longer to the novel terminology associated with plant biodiversity. It is possible that, given the high quantity of novel terminology, students needed more exposure to the terms before HO integration than was supplied by this learning cycle module.

    Other research has shown that the learning cycle style is particularly important for generalized reasoning gains not necessarily associated with specific content (Johnson and Lawson, 1998; Lawson and Johnson, 2002). Thus, one of our expectations was that students in the learning cycle style would perform better than students in the expository style on the HO assessment questions on the quiz. No such result was seen. Logically, in order to integrate information at the HO level, students need a strong grasp of the foundational LO information. Thus, if the learning cycle format did not provide the LO foundation as well as the expository format did, students may not have been able to demonstrate their reasoning gains on the quiz, which required foundational LO information to build upon for the HO integration questions, even though they may have gained some HO reasoning practice with the learning cycle format (as seen in the positive relationship between HO and the learning cycle format in the lab report).

    The lecture exam provided information about longer-term learning/retention, but it was problematic in that students were exposed to instruction on plant biodiversity in lecture in addition to the lab experience. The lecture exam was also a multiple-choice format in which students had to recognize the correct answer rather than generate it themselves. Both factors, along with the longer time since instruction, lessened the impact of the lab treatments on the exam relative to the quiz. Results showed no discernible long-term impacts of the lab treatments on exam performance for either LO or HO.

    In this study, all attitude variables (time efficiency, excitement, ease of lab, and lecture help) made up the best model explaining student ratings of the plant biodiversity lab. The two explanatory variables with the greatest effect sizes (time efficiency and excitement) did not differ significantly between the two lab styles. Basey et al. (2008) found excitement to be the most important of these four factors in determining students' overall attitudes toward lab. Basey and Francis (2011) showed that, in a guided-inquiry (expository) versus a problem-based lab, students did not differ in their view of how exciting the lab was, but did differ in the other three factors examined (ease of lab, time efficiency, and lecture help); the expository lab was considered by students to be less difficult. Basey and Francis (2011) concluded that lab style may not be a driving force in changing students' perception of lab excitement, and that other factors, such as the subject matter, personal relevance to students, and material addressed, could be more important in changing the excitement rating. These data are consistent with that contention.

    Educational Implications

    These results lead to a complicated interpretation and a dilemma concerning which style was best for learning about plant biodiversity within a college-level, inquiry-oriented lab curriculum for science majors. While the expository style produced better student learning of LO and better attitudes from students, it is possible that the open-ended lab report and problem-based application phase of the learning cycle format produced better transfer of LO from the formative process (lab and lab report) to the summative assessment (quiz). With the highly integrated and novel foundational information, a guided preparatory lab experience may be critical for building the LO foundation upon which HO can be constructed. Perhaps the best model is a guided 3-wk prelab experience with the C-Fern, followed by a problem-based application phase with an open-ended lab report format.

    ACKNOWLEDGMENTS

    The project was funded by the Integrating STEM (iSTEM) Education Initiative. Support for C.D.F. was provided by the University of Colorado Graduate School and the National Evolutionary Synthesis Center (NSF EF-0905606). Manuscript preparation was independent of any funding agency.

    REFERENCES

  • Addy HD, Longair RW (2009). Laboratory Manual for Biology 223, Organismal Biology of Plants and Animals, Plymouth, MI: Hayden McNeil.
  • American Association for the Advancement of Science (2010). Vision and Change in Undergraduate Biology Education: A Call to Action, Washington, DC.
  • Anders C, Berg R, Christina V, Bergendahl B, Bruno K, Lundberg S (2003). Benefiting from an open-ended experiment? A comparison of attitudes to, and outcomes of, an expository versus an open-inquiry version of the same experiment. Int J Sci Educ 25, 351-372.
  • Balci S, Cakiroglu J, Tekkaya C (2006). Engagement, exploration, explanation, extension, and evaluation (5E) learning cycle and conceptual change text as learning tools. Biochem Mol Biol Educ 34, 199-203.
  • Barbera J, Perkins KK, Adams WK, Wieman CE (2008). Modifying and validating the Colorado Learning Attitudes about Science Survey for use in chemistry. J Chem Educ 85, 1435-1439.
  • Basey JM, Francis CD (2011). Design of inquiry-oriented science labs: impacts on students’ attitudes. Res Sci Tech Educ 29, 241-256.
  • Basey JM, Maines AP, Francis CD, Melbourne B, Wise SB, Safran RJ, Johnson PTJ (2014). Impact of pre-lab learning activities, a post-lab written report, and content reduction on evolution-based learning in an undergraduate plant biodiversity lab. Evol Educ Outr 7, 10.
  • Basey JM, Sackett LS, Robinson NS (2008). Optimal science lab design: impacts of various components of lab design on students’ attitudes toward lab. Int J Schol Teach Learn 2, article 15.
  • Brickman P, Gormally C, Armstrong N (2009). Effects of inquiry-based learning on students’ science literacy skills and confidence. Int J Schol Teach Learn 3, 22.
  • Burnham KP, Anderson DR (2002). Model Selection and Inference: A Practical Information-Theoretic Approach, New York: Springer-Verlag.
  • Cohen JN (1968). Weighted kappa: nominal scale agreement provision for scaled disagreement or partial credit. Psychol Bull 70, 213-222.
  • Crowe A, Dirks C, Wenderoth MP (2008). Biology in Bloom: implementing Bloom's taxonomy to enhance student learning in biology. CBE Life Sci Educ 7, 368-381.
  • Dogru-Atay P, Tekkaya C (2008). Promoting students’ learning in genetics with the learning cycle. J Exp Educ 76, 259-280.
  • Domin DS (1999). A review of laboratory instruction styles. J Chem Educ 76, 543-547.
  • Domin DS (2007). Students’ perceptions of when conceptual development occurs during laboratory instruction. Chem Educ Res Pract 8, 140-152.
  • Gelman A (2008). Scaling regression inputs by dividing by two standard deviations. Stat Med 27, 2865-2873.
  • Harris-Haller T (2008). Laboratory Manual for Biology 112, 4th ed., Plymouth, MI: Hayden McNeil.
  • Hohenshell LM, Hand B (2006). Writing-to-learn strategies in secondary school cell biology: a mixed method study. Int J Sci Educ 28, 261-289.
  • Jackman LE, Moellenberg WP, Brabson GD (1987). Evaluation of three instructional methods for teaching general chemistry. J Chem Educ 64, 794-796.
  • Jensen JL, Lawson A (2011). Effects of collaborative group composition and inquiry instruction on reasoning gains and achievement in undergraduate biology. CBE Life Sci Educ 10, 64-73.
  • Johnson MA, Lawson AE (1998). What are the relative effects of reasoning ability and prior knowledge on biology achievement in expository and inquiry classes? J Res Sci Teach 35, 89-103.
  • Karplus R (1977). Science teaching and development of reasoning. J Res Sci Teach 14, 169-175.
  • Lawson AE (2001). Using the learning cycle to teach biology concepts and reasoning patterns. J Biol Educ 35, 165-169.
  • Lawson AE, Johnson M (2002). The validity of Kolb learning styles and neo-Piagetian developmental levels in college biology. Stud High Educ 27, 79-90.
  • Lindemann-Matthies P, Bose E (2008). How many species are there? Public understanding and awareness of biodiversity in Switzerland. Hum Ecol 36, 731-742.
  • Lindemann-Matthies P, Constantinou C, Junge X, Kohler K, Mayer J, Nagel U, Raper G, Schule D, Kadji-Beltran C (2009). The integration of biodiversity education in the initial education of primary school teachers: four comparative case studies from Europe. Environ Educ Res 15, 17-37.
  • Lord T, Orkwiszewski T (2006). Moving from didactic to inquiry-based instruction in a science laboratory. Am Biol Teach 68, 342-346.
  • Luckie DB, Malezewski JJ, Loznak SD, Krha M (2004). Infusion of collaborative inquiry throughout a biology curriculum increases student learning: a four-year study of “Teams and Streams.” Adv Phys Educ 287, 199-209.
  • Morgan JG, Carter MEB (2008). Investigating Biology, Laboratory Manual, 6th ed., Glenview, IL: Pearson Education.
  • National Academy of Sciences (2010). A New Biology for the 21st Century: Ensuring the United States Leads the Coming Biology Revolution, Washington, DC: National Academies Press.
  • National Research Council (2003). BIO2010: Transforming Undergraduate Education for Future Research Biologists, Washington, DC: National Academies Press.
  • National Science Teachers Association (2007). The Integral Role of Laboratories in Science Instruction, Arlington, VA: NSTA.
  • Pook JR, Burke KA, Greenbowe TJ, Hand BM (2007). Using the Science Writing Heuristic in general chemistry laboratory to improve students’ academic performance. J Chem Educ 84, 1371-1379.
  • R Development Core Team (2012). R: A Language and Environment for Statistical Computing, Vienna, Austria: R Foundation for Statistical Computing, www.R-project.org.
  • Sadi O, Cakiroglu J (2012). Relation of cognitive variables with students’ human circulatory system achievements in traditional and learning cycle classrooms. Soc Behav Sci 46, 399-403.
  • Scully TA, Fisher RWW (2009). Discovering Biology in the Lab, An Introductory Laboratory Manual, New York: Norton.
  • Semsar K, Knight JK, Birol G, Smith MK (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for use in biology. CBE Life Sci Educ 10, 268-278.
  • Singer SR, Nielsen NR, Schweingruber HA (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, DC: National Academies Press.
  • Suits J (2004). Assessing investigative skill development in inquiry-based and traditional college science laboratory courses. School Sci Math 104, 248-257.
  • Thomas CD, et al. (2004a). Extinction risk from climate change. Nature 427, 145-148.
  • Thomas JA, Telfer MG, Roy DB, Preston CD, Greenwood JJD, Asher J, Fox R, Clarke RT, Lawton JH (2004b). Comparative losses of British butterflies, birds, and plants and the global extinction crisis. Science 303, 1879-1881.
  • Vodopich D, Moore R (2008). Biology Laboratory Manual to Accompany Brooker Biology, New York: McGraw-Hill.
  • Yalcin FA, Bayrakceken S (2010). The effect of 5E learning model on pre-service science teachers’ achievement of acids-bases subject. Int Online J Educ Sci 2, 508-531.
  • Yilmaz D, Tekkaya C, Sungur S (2011). The comparative effects of prediction/discussion-based learning cycle, conceptual change text, and traditional instructions on student understanding of genetics. Int J Sci Educ 33, 607-628.