
General Essays and Articles

A Hierarchical Mentoring Program Increases Confidence and Effectiveness in Data Analysis and Interpretation for Undergraduate Biology Students

    Published Online: https://doi.org/10.1187/cbe.19-10-0201

    Abstract

    Science instructors are increasingly incorporating teaching techniques that help students develop core competencies such as critical-thinking and communication skills. These core competencies are pillars of career readiness that prepare undergraduate students to successfully transition to continuing education or the workplace, whatever the field. Course-based undergraduate research experiences that culminate in written research papers can be effective at developing critical-thinking and communication skills but are challenging to implement as class size (and student-to-instructor ratio) grows. We developed a hierarchical mentoring program in which graduate student mentors guided groups of four to five undergraduate students through the scientific process in an upper-level ecology course. Program effectiveness was evaluated by grading final research papers (including papers from the previous year, before the program was implemented) and by surveys (compared with a course that did not implement the program). Results indicated that the primary benefits of hierarchical mentoring were improvements in perceived and demonstrated ability in data analysis and interpretation, leading to a median increase in paper score of ∼10 points on a 100-point scale. Our study also indicated future directions: a revised program should incorporate additional approaches (e.g., low-stakes writing exercises) and resources to improve outcomes for students whose primary language is not English.

    INTRODUCTION

    It is increasingly recognized that undergraduate students preparing for careers in the biological sciences need more than a “textbook” understanding of biological terms and concepts. A report by the American Association for the Advancement of Science (AAAS, 2011) asserted that curricula should develop “core competencies,” including the ability to apply the process of science (e.g., hypothesis testing); use quantitative reasoning (e.g., applying statistical methods); communicate and collaborate (e.g., scientific writing); and understand the relationship between science and society (see Clemmons et al., 2019). These goals are not to the exclusion of mastering fundamental disciplinary knowledge; improvement in critical-thinking and communication skills is strongly and positively related to subject matter learning (Rivard, 1994; Bean, 2011; Reynolds et al., 2012; Merkle, 2019). Even beyond demonstrated competency, students’ perceptions of their abilities are a strong predictor of persistence in science, technology, engineering, and mathematics (STEM) fields (Simon et al., 2015). In this paper, we describe the implementation of an educational innovation in the life sciences—a writing mentorship program—that incorporates research experience and community building (AAAS, 2011), and we test whether mentoring increased perceived ability and demonstrated effectiveness of undergraduate science writers.

    Scientific writing involves many steps and skills, creating a challenge for undergraduate students who have had little opportunity for practice (see Sampson and Walker, 2012). Furthermore, students’ opinions about the writing process could decrease their confidence and effectiveness (Zhang and Cheung, 2018; Dowd et al., 2019). The many steps involved in writing a research paper include: defining the research question and hypothesis, designing the experimental test, searching databases for and reading primary literature, collecting and managing data, conducting statistical analyses, interpreting statistical results, conveying results in the context of primary literature, and, finally, crafting an interesting “story.” Reynolds et al. (2012) proposed that, as opposed to focusing on learning to write, there is great potential in a writing-to-learn model, in which students move from memorizing facts to understanding scientific concepts throughout the writing process (also see Rivard, 1994). Indeed, each step in the writing process contributes to “core competencies” in its own right (AAAS, 2011; Clemmons et al., 2019), culminating in a product that demonstrates the students’ abilities to think critically and communicate their new scientific knowledge (Reynolds et al., 2012; Brownell et al., 2013; Dowd et al., 2019). However, despite recommendations (AAAS, 2011; Clemmons et al., 2019), few courses are designed to improve learning outcomes related to communication (Brownell et al., 2013). It is particularly important to consider the challenges of multilingual learners who are developing language skills while grappling with the challenges of disciplinary learning (Van Roekel, 2008).

    In addition to the challenges faced by students, instructors are challenged to develop curricula for teaching disciplinary fundamentals, data-collection techniques, and science writing within the timeline of a university term, which may be as short as a 10-week quarter. To accommodate this, courses may be built around independent research projects based on novel but unpublished data sets that are not—or are only partially—generated by the students themselves. In a course-based undergraduate research experience (CURE; Brownell et al., 2015; Dolan, 2017, p. 3), “whole classes of students address a research question or problem with unknown outcomes or solutions that are of interest to external stakeholders.” CUREs can be scaled up to provide many of the same benefits of lab-based undergraduate research experiences, including developing students’ critical-thinking skills and increasing students’ motivation to communicate research results, but for cohorts of students rather than individual apprentices (Dolan, 2017). However, it is likely that these benefits are related to the instructor-to-student ratio. Ideally, the instructor provides individualized feedback at multiple steps (perhaps proposal and rough-draft stages) of this process, but this feedback is most likely to be written and concise, with the addition of beneficial verbal (Salamone, 2020) and reciprocal feedback (Evans and Hartshorn, 2010) dependent on time constraints and student proactiveness.

    We endeavored to address many of these challenges, faced by students and instructors alike, in teaching science writing by initiating a writing mentorship program. Mentoring has been repeatedly shown to benefit college students in terms of several indicators and drivers of success, including academic performance, social integration, and persistence in STEM careers (McGee, 2001; Crisp and Cruz, 2009; Gershenfeld, 2014). The SWIM (Science Writing Integrated Mentoring) program was built on a hierarchical mentoring model, in which mentoring occurs at three or more levels (rather than only between the instructor and undergraduate students). There is some evidence that hierarchical mentoring can promote undergraduate success, particularly of underrepresented minorities in STEM fields (Wilson et al., 2011). Benefits of hierarchical mentoring include building confidence in mentees and the increased scholarly proficiency of mentors that comes with teaching (McGee, 2001).

    In this study, we addressed the question: Does hierarchical mentoring increase science writing confidence and effectiveness? Here, we define effective science writing as clearly and accurately communicating information, which comprises aspects of composition (grammar, structure), data analysis and interpretation, and grounding in science theory (also see Yore et al., 2004). We hypothesized, first, that confidence would be directly related to the level of mentoring, predicting that confidence (as indicated by changes between pre- and postclass surveys) would increase more for students in the mentoring program than those in a comparison class that did not receive additional mentoring. Second, we hypothesized that writing effectiveness would be related to the level of mentoring, predicting that effectiveness (indicated by score on a grading rubric) would be higher for students taking the course the year the mentoring program was implemented than for those attending the year before its initiation. Finally, we predicted that mentors would report benefits consistent with the research and teaching objectives of their graduate programs. In our analyses, we incorporated additional factors that we anticipated could influence outcomes, particularly whether English was a student’s primary language.

    METHODS

    The SWIM (Science Writing Integrated Mentoring) Program

    The SWIM program was implemented at University of California, Irvine (UCI) in Fall 2018 in an upper-division ecology class, Population and Community Ecology. In this program, undergraduate students were mentored by two groups—graduate students and the instructor—with the instructor also providing mentoring guidance to the graduate students. Graduate student SWIM mentors provided data sets to undergraduate students and assisted the mentees in writing their research papers. SWIM mentors were recruited via email from the Department of Ecology and Evolutionary Biology PhD program. Mentors were responsible for contributing a data set for analysis and committing a total of 25 hours during an academic term per an established schedule (Table 1). Mentors were selected after review of a cover letter, curriculum vitae, and writing sample, and they received compensation in the form of research funds ($500) from an education grant. One of the main anticipated strengths of this program was that undergraduate students would work with data sets collected by their research mentors. These data sets were by definition appropriate for CUREs because they were part of original, unpublished research projects addressing issues of societal relevance (Dolan, 2017). Because mentors would be intimately familiar with the study systems, we anticipated that they would be especially well positioned to assist with finding relevant references, analyzing data, and interpreting results. At the same time, the mentors were expected to benefit from the opportunity to dive deeper into their own data sets through the diverse “new eyes” of the undergraduate mentees. Although undergraduate student writing was used for the course only (not for publication), continuing collaborations were possible and encouraged.
Graduate student mentors also gained experience in training and pedagogy (Dolan and Johnson, 2009; Limeri et al., 2019): each mentor advised a team of four to five undergraduate students, and in-class group meetings (Table 1) were intended to promote idea sharing and horizontal mentoring among peers (e.g., Jin et al., 2019). Table 1 details the time spent and order of mentoring activities, as well as the steps and formats in which mentors gave feedback.

    TABLE 1. Curriculum for SWIM mentoring program implemented in an upper-level ecology class with undergraduate student activities and due dates shown on the left and activities involving mentors (as well as number of hours allotted) shown on the right

    aIn-class mentoring activities are shaded in gray.

    Assessment

    We assessed impacts of the SWIM program via comparison with two groups that did not participate in the program. Writing confidence was compared between students in the SWIM program and students in a second upper-division ecology class (Field Biology) that did not implement the SWIM program. Writing effectiveness was compared between students who took Population & Community Ecology in 2018 (the year the SWIM program was implemented) and the previous year (2017, before program implementation). There was one section of each course, and Population & Community Ecology was taught by the same instructor (C.J.B.S.) and teaching assistant in both years, while Field Biology was taught by two instructors (including J.D.P.). Evaluation of this program was determined to be exempt by the UCI Institutional Review Board (human subjects research project no. 2018-4595), and both undergraduate students and graduate mentors were provided with study information and opt-out procedures before data collection.

    Writing confidence was assessed through surveys (Table 2). We evaluated differences between SWIM students (treatment group that received mentoring) and a similar upper-division ecology class, Field Biology (comparison group without mentoring). These courses were similar in that they were both upper-division ecology classes (respectively, 100 and 95% of students were fourth and fifth years) that culminated in independent research projects. However, Population & Community Ecology included significant ecological theory, while Field Biology was more methods focused. As upper-division writing courses, both courses (Population & Community Ecology and Field Biology) fulfilled UCI expectations that instructors would provide training in discipline-specific writing practices and that students would write papers totaling more than 4000 words, receiving feedback and having the opportunity for revision. In both Population & Community Ecology and Field Biology, students wrote research papers based on a self-defined question, although the sources of these data differed. In Field Biology, papers focused on data collected independently or as a class (intertidal field surveys or a desert ant experiment). In the SWIM program, data were collected and provided by the graduate mentors; notably, however, most students had experience in collecting similar data, because the class included a field experience conducting intertidal surveys, and five of the six graduate mentors were intertidal ecologists. In addition, because Field Biology was a required course for the ecology and evolution major, and both courses were taught in the same term (Fall 2018), there were more ecology and evolution majors in Field Biology, while most Population & Community Ecology students were majoring in general biology. Two students took both classes concurrently and were only included in the SWIM (treatment) group for these evaluations.
Thus, our analyses related to science writing confidence included 48 students total: 28 students (no ecology and evolution majors) from the SWIM course (Population & Community Ecology) and 20 students from the non-SWIM course (Field Biology), of which nine (43%) were ecology and evolution majors.

    TABLE 2. Survey questions used to assess perceptions and attitudes about science in general and writing in particulara

    Science skills (1 = none, 2 = little, 3 = some, 4 = much, 5 = extensive)
    Run statistical tests (in any computer program)
    Make figures (in a program such as Excel)
    Complete problem sets in small groups
    Write a research proposal
    Design a study or experiment that follows up on one I read about
    Read scientific papers (also called the “primary literature”)
    Present results orally
    Analyze data
    Find primary literature articles relevant to a particular question
    Give poster presentations
    Develop a logical argument
    Enter and format data (in a program such as Excel)
    Conduct a lab or field study that is designed by the instructor
    Write a research paper or report
    Conduct a lab or field study entirely of student design
    Use functions for calculations (in a program such as Excel)
    Critique the work of other students
    Complete problem sets individually
    Recognize a sound argument and appropriate use of evidence
    Collect data
    Writing confidence (1 = almost always, 2 = often, 3 = sometimes, 4 = occasionally, 5 = almost never)
    My teachers are familiar with so much good writing that my writing must look bad by comparison.
    I’ve seen really good writing, but my writing doesn’t match up to it.
    I think my writing is good.
    I think of my instructors as reacting positively to my writing.
    Writing is a very unpleasant experience for me.
    I enjoy writing, though writing is difficult at times.
    I like having the opportunity to express my ideas in writing.
    I’m not sure at times, how to organize all the information I have collected for a paper.
    Writing on topics that can have different focuses is difficult for me.
    I have trouble deciding how to write on issues that have many interpretations.
    To write essays on books and articles that are very complex is difficult for me.
    I have trouble with assignments that ask me to compare or contrast or to analyze.
    I run over deadlines because I get stuck while trying to write my paper.
    I have to hand in assignments late because I can’t get the words on paper.
    Each sentence I write has to be just right before I’ll go on to the next.
    When I write, I’ll wait until I’ve found just the right phrase.
    I find myself writing a sentence, then erasing it, trying another sentence, then scratching it out. I might do this for some time.
    My first paragraph has to be perfect before I’ll go on.
    While writing a paper, I’ll hit places that keep me stuck for an hour or more.
    At times, I find it hard to write what I mean.
    At times, my first paragraph takes me over 2 hours to write.
    Starting a paper is very hard for me.
    At times, I sit for hours unable to write a thing.
    Some people experience periods when, no matter how hard they try, they can produce little, if any, writing. When these periods last for a considerable amount of time, we say the person has a writing block. Estimate how often you experience writer’s block.
    Attitudes about science (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree)
    I get personal satisfaction when I solve a scientific problem by figuring it out myself.
    Being able to write well is an essential skill that I will use throughout my life.
    Students who are not majoring in science should not have to take science courses.
    I can do well in science courses.
    If an experiment shows that something doesn’t work, the experiment was a failure.
    I wish science instructors would just tell us what we need to know so we can learn it.
    The process of writing in science is helpful for understanding scientific ideas.
    I can do well in non-science courses.
    Select “strongly disagree” as your answer to this question.
    Creativity does not play a role in science.
    Even if I forget the facts, I’ll still be able to use the thinking skills I learn in science.
    Explaining science ideas to others has helped me understand the ideas better.
    Scientific experts are the only members of the public who are qualified to make judgments on scientific issues.
    There is too much emphasis in science classes on figuring things out for yourself.
    Science is essentially an accumulation of facts, rules, and formulas.
    The main job of the instructor is to structure the work so that we can learn it ourselves.

    aSurveys were conducted in the first and last week of the 10-week term for students who received SWIM mentoring (Population & Community Ecology course) and those in a course without mentoring.

    We investigated impacts of the SWIM program on writing effectiveness through performance on a written final paper. We compared paper scores for students who took the Population & Community Ecology course during the year SWIM was implemented (2018) with those for students who attended the year before implementation (2017). The independent research project (and component assignments: proposal, rough draft, and final draft) was largely consistent between years, except that all students used the same data set in 2017 (from intertidal field surveys). The primary difference in course content was that the time devoted to the SWIM program in 2018 was used for student-led paper discussions in 2017; with related shifts in grade accounting, the student project’s weight in the overall class grade increased from 40% in 2017 to 50% in 2018. There was also a difference in the length of class meeting time (3 vs. 4 hours per week in 2017 and 2018, respectively). Our analyses related to writing effectiveness included 46 total Population & Community Ecology students: 28 students (no ecology and evolution majors) from the 2018 cohort with mentoring and 18 students (four, or 22%, of whom were ecology and evolution majors) from the 2017 cohort without mentoring. Students whose primary language was not English represented, respectively, 39 and 25% of the cohorts with and without mentoring.

    Data Collection

    We evaluated writing confidence using surveys conducted at the beginning (first week of class) and end (last week of class) of the academic term in both the course with mentoring (Population & Community Ecology) and without mentoring (Field Biology; Table 2 and Appendix 1 in the Supplemental Material). The survey was adapted from the Student Assessment of their Learning Gains website (https://salgsite.net, which includes information on survey validity at https://salgsite.net/about) and the Classroom Undergraduate Research Experience Survey (CURE Survey) by D. Lopatto (available at www.grinnell.edu/academics/resources/ctla/assessment/cure-survey). We included self-assessment questions related to 1) perceived scientific ability/experience (which included skills in three subcategories: experimental design, data analysis, and science communication), referred to in this paper as writing “confidence”; 2) writing attitudes/behaviors; and 3) opinions about science in general. Questions for each section were answered on a five-point Likert scale (Table 2). The postclass survey included questions about perceived benefits of each class (also based on the CURE Survey by D. Lopatto), and SWIM students also answered questions specifically regarding satisfaction with the SWIM program. Surveys were conducted in class on paper, except for the survey at the end of the term in the course without mentoring (Field Biology), which was conducted electronically. It is important to note that survey results are highly subjective and difficult to standardize between students, in part due to differential interpretation of the response scale (e.g., “small gain” vs. “large gain”). Here, we restrict our conclusions to the relative perceptions of students between groups that did and did not receive mentoring, acknowledging that we are unable to ensure that this subjectivity is randomly distributed between groups. 
Furthermore, throughout the text, we refer to perceived abilities when results are based on survey data and demonstrated abilities when results are based on instructor evaluation, given that student self-assessments often vary greatly from more objective assessments (Falchikov and Boud, 1989; Deslauriers et al., 2019).

    Writing effectiveness was quantified as performance on the final paper and compared between 2018 (with mentoring) and 2017 (without mentoring) students. We developed a rubric (Appendix 2 in the Supplemental Material) based initially on the one used and provided to students in class, with two main additions. First, to increase objectivity, particularly between graders, we increased the number of scoring criteria within each heading and articulated what was required to achieve each point value. Second, we associated these criteria with specific indicators of science writing effectiveness: writing composition (the “nuts and bolts”), data analysis and interpretation, and demonstrated understanding of ecological theory. Thus, the rubric used for grading was more detailed than the one given to students, which was kept consistent between years (2018 and 2017, the latter before this study was conceived).

    Papers were graded by three of us (A.K.H., N.M.A.-R., J.D.P.), and we began by calibrating our assessments based on the grading rubric with a random sample of five papers. N.M.A.-R., who was not an instructor for any of the courses, deidentified and assigned a random identifier to each student paper. Subsequently, all papers were evaluated by two of the three graders, as well as the third grader in the 15 cases where the difference between the first two graders’ scores was more than 10 points out of 100. The first two graders were assigned using a stratified randomization to ensure graders each read the same proportion of papers from both years of the course (2018 with mentoring and 2017 without mentoring) and of students who did not identify English as their primary language (5/18 and 11/28 students in 2017 and 2018, respectively). The average scores across all graders were used in the analyses, with total score across the three categories indicating overall writing effectiveness.
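The stratified randomization described above can be sketched in code. This is an illustration only, not the authors' actual procedure (which is not specified beyond the text): the cohort and language counts mirror those reported, but the records, random seed, and pair-rotation scheme are invented.

```python
import random
from itertools import combinations

# Hypothetical paper records mirroring the strata in the text:
# 2017 cohort (5/18 non-native English) and 2018 cohort (11/28).
papers = (
    [{"year": 2017, "english": True} for _ in range(13)]
    + [{"year": 2017, "english": False} for _ in range(5)]
    + [{"year": 2018, "english": True} for _ in range(17)]
    + [{"year": 2018, "english": False} for _ in range(11)]
)

graders = ["A", "B", "C"]
rng = random.Random(42)  # seeded for reproducibility

# Group papers into strata by (year, primary language)...
strata = {}
for p in papers:
    strata.setdefault((p["year"], p["english"]), []).append(p)

# ...then rotate through the three grader pairs within each shuffled
# stratum, so every grader reads a similar share of each stratum.
pairs = list(combinations(graders, 2))  # (A,B), (A,C), (B,C)
for stratum in strata.values():
    rng.shuffle(stratum)
    for i, p in enumerate(stratum):
        p["graders"] = pairs[i % len(pairs)]
```

Stratifying before assignment prevents one grader from, say, reading mostly 2018 papers, which would confound grader leniency with the mentoring effect.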

    Finally, we informally collated mentors’ motivations for participating in the SWIM program, as well as satisfaction with and challenges of the program, using surveys conducted at the beginning and end of the term (Appendix 3 in the Supplemental Material).

    Data Analysis

    To assess whether SWIM program participation led to gains in science writing confidence, we first excluded students who incorrectly answered the control question (“select strongly disagree as your answer to this question”) as well as students who skipped (however inadvertently) entire pages of the survey. This led us to exclude 10 students for a final comparison of n = 21 and 18 students in the groups with mentoring and without mentoring, respectively. We then calculated individual-level (paired) change in Likert score for each individual question. We also averaged Likert scores for each of the categories (science skills, writing behaviors, science opinions) and science skills subcategories (experimental design, data analysis/interpretation, and science communication) and calculated medians (for each category and subcategory) of the individual-level changes. Finally, we conducted Mann-Whitney tests for each question and category between the course with mentoring and the comparison course without mentoring. Given the large number of statistical tests, we used Bonferroni-corrected p values in all of the writing confidence analyses to assess statistical significance. We report significance at both the level of p < 0.05 and the Bonferroni-corrected level.
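The study ran these tests in R; as an illustration of the procedure only, a minimal Python sketch with invented Likert responses might look like this:

```python
from scipy.stats import mannwhitneyu

# Invented pre/post Likert scores (1-5) for one survey question;
# the study's actual data are in its Supplemental Material.
pre_swim = [2, 3, 2, 3, 2, 3, 2]
post_swim = [4, 4, 3, 4, 4, 4, 3]
pre_comp = [3, 3, 2, 3, 3, 2]
post_comp = [3, 4, 3, 3, 3, 3]

# Individual-level (paired) change in Likert score for each student
change_swim = [b - a for a, b in zip(pre_swim, post_swim)]
change_comp = [b - a for a, b in zip(pre_comp, post_comp)]

# Mann-Whitney test comparing the two groups' changes
stat, p = mannwhitneyu(change_swim, change_comp, alternative="two-sided")

# Bonferroni correction: divide alpha by the number of tests run
n_tests = 20  # e.g., one test per survey question
alpha_corrected = 0.05 / n_tests
significant = p < alpha_corrected
```

The same comparison would be repeated per question and per category average, with the corrected threshold applied across the full family of tests.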

    We tested for internal consistency of our survey data, broken down by category and subcategory. Survey responses categorized as science skills and writing confidence had high internal consistency (Cronbach’s alpha; α = 0.875, α = 0.908, respectively). Science opinions were less internally consistent (α = 0.680), reflecting a more varied set of questions. All subcategories of science skills were above or near α = 0.7 as well (data analysis, α = 0.674; science communication, α = 0.743; experimental design, α = 0.722), which indicates acceptable consistency given the low number of questions in each subcategory. We concluded that these categorical summaries reliably reflect underlying confidence and attitudes among respondents.
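Cronbach's alpha can be computed directly from the item variances and the variance of each student's total score. The sketch below uses invented responses, not the study's data:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha; each inner list holds one question's
    responses across all students."""
    k = len(items)  # number of questions
    item_vars = [statistics.variance(col) for col in items]
    # Total score per student across all questions
    totals = [sum(vals) for vals in zip(*items)]
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented 5-point Likert responses (3 questions x 6 students)
responses = [
    [4, 3, 5, 2, 4, 3],
    [4, 2, 5, 3, 4, 3],
    [5, 3, 4, 2, 5, 2],
]
alpha = cronbach_alpha(responses)  # ≈ 0.879: high internal consistency
```

Alpha rises when items covary (students who rate one item high rate the others high), which is why the more heterogeneous "science opinions" category scored lower.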

    To evaluate the effect of the SWIM program on writing effectiveness, we used a Mann-Whitney nonparametric test to quantify the impact of course (2018 with mentoring vs. 2017 without mentoring) on overall paper score, as well as the subcategories of writing composition, data analysis/interpretation, and ecological theory. We also evaluated the effects of additional factors that we hypothesized could influence writing effectiveness, including primary language (English only, another language only, or both English and another language), gender, declared major, and underrepresented minority and first-generation college student status (both as yes or no) using a generalized linear model (GLM). Mann-Whitney (M-W) tests were conducted in R (R Core Team, 2019), while generalized linear model analyses (Proc GENMOD) were conducted in SAS v. 9.4.
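The study fit its GLM in SAS (Proc GENMOD); with a Gaussian family and identity link, a GLM reduces to ordinary least squares, which can be sketched with invented scores and dummy-coded predictors (the coefficient values below are illustrative, not the study's estimates):

```python
import numpy as np

# Invented paper scores with dummy-coded predictors.
# Columns: intercept, mentoring (1 = 2018 cohort),
# non-native English speaker (1 = yes).
X = np.array([
    [1, 1, 0], [1, 1, 1], [1, 1, 0], [1, 1, 1], [1, 1, 0],
    [1, 0, 0], [1, 0, 1], [1, 0, 0], [1, 0, 0], [1, 0, 1],
], dtype=float)
y = np.array([82, 78, 85, 80, 84, 72, 68, 74, 71, 69], dtype=float)

# Least-squares fit: each coefficient estimates that factor's
# additive effect on paper score, holding the other constant.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, mentoring_effect, language_effect = coef
```

In the actual analysis, additional factors (gender, major, underrepresented minority and first-generation status) would enter as further dummy-coded columns, and significance would come from the model's coefficient tests.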

    RESULTS

    Impacts of Mentoring on Science Writing Effectiveness

    Students who participated in the SWIM program tended to write more effectively (score higher) on their final paper than those who took Population & Community Ecology in 2017, the year before the SWIM program was implemented (M-W p = 0.081; Figure 1). Median paper scores increased by almost 10 points on a 100-point scale, from 71.9% [68.9, 80.2] (median [95% CI]) without mentoring to 81.4% [76.1, 85.4] with mentoring. Increased overall effectiveness was driven by improvement in data interpretation (M-W p = 0.014), with relatively minor changes in writing composition ability (M-W p = 0.220) and ability to demonstrate understanding of fundamental ecological theory (M-W p = 0.090). Across these three categories of writing effectiveness, students whose first language was not English scored lower in writing composition (GLM p = 0.048) but did not score differently in data interpretation (GLM p = 0.897), ecological theory (GLM p = 0.503), or overall writing effectiveness (GLM p = 0.306).

    FIGURE 1.

    FIGURE 1. Frequency of overall paper scores for students (n = 28) who participated in the SWIM mentoring program (black) and those (n = 18) who took the course the previous year without mentoring (gray). Median scores and 95% confidence intervals are shown in dashed and dotted lines, respectively.

    Impacts of Mentoring on Science Confidence and Opinions

    Students in both the SWIM mentoring program and comparison group without mentoring (Field Biology) reported an increase in perceived ability and experience in science skills, including across all three subcategories of planning, executing, and communicating scientific research (Figure 2 and Data Set 1 in the Supplemental Material). On a scale where 2/3/4 were equivalent to little/some/much ability and experience, overall science skills score increased from a mean of 2.99 (±0.57 SD) to 3.87 (±0.53) in the group with mentoring and from 3.19 (±0.46) to 3.77 (±0.48) in the group without mentoring. For overall science skills, as for the three subcategories, SWIM student scores started slightly (but not significantly) lower than those of the group without mentoring and ended higher (but not significantly so; Figure 2 and Data Set 1 in the Supplemental Material). After correcting for multiple comparisons, activities for which there was significant improvement in survey scores between the first and last week of the term for both groups included writing a research proposal and designing an experiment that follows up on one they read about (p < 0.001; Data Set 1 in the Supplemental Material). In addition, SWIM participants also reported an increase in perceived ability and experience with conducting a lab or field study entirely of student design (from 1.96 to 3.33) and data analysis (running statistical tests; from 2.57 to 3.76). There were no significant changes in attitudes about writing or science opinions in either class (Figure 2 and Data Set 1 in the Supplemental Material).

    FIGURE 2.

    FIGURE 2. Distribution of per-student mean Likert scores from surveys administered in the first (before) and last (after) week of the term, in a course with mentoring (black) and a comparison group without mentoring (gray). Students were asked to rank their ability/experience with course activities, frequency of writing behaviors, and opinions about science based on the questions and scales in Table 2. Values shown are medians (dark bars), 95% CIs (edges of notch), interquartile ranges (ends of boxes), and maximum and minimum values (bars; when values extended more than 1.5 times the interquartile range, they are shown as data points). Values are based on a mean of all questions within each category or subcategory for each student (n = 21 with mentoring and n = 18 without mentoring). Asterisks indicate significant changes (p < 0.05) over the duration of the course based on Mann-Whitney tests.

    Participant Perspectives on Mentoring Program Outcomes

    Undergraduate students participating in the SWIM program perceived an overall “moderate” gain in science skills and abilities based on a survey of benefits administered at the end of the term (Table 3). For SWIM students, the average gain across categories was 3.17 (on a scale of 2 = small, 3 = moderate, and 4 = large gain). The largest gains were reported in “ability to analyze data” (3.76) and “understanding of the research process in your field” (3.76). Areas where SWIM students tended to perceive larger gains than the group without mentoring were “ability to read primary literature” and “skill in science writing.” However, there were no significant differences between courses; the group without mentoring (Field Biology students) reported an average across categories of 3.45, with larger gains than the SWIM group in “clarification of a career path” (which we discuss later in relation to differing student populations).

    TABLE 3. Benefits perceived by students due to participation in their ecology course, based on a survey conducted at the end of the term for n = 21 students in a course with mentoring and n = 18 students in a course without mentoring^a

    | Benefit | PCE 1 | PCE 2 | PCE 3 | PCE 4 | PCE 5 | PCE mean (SD) | FB 1 | FB 2 | FB 3 | FB 4 | FB 5 | FB mean (SD) | p value |
    | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
    | Clarification of a career path | 0.33 | 0.48 | 0 | 0.14 | 0.05 | 2.1 (1.18) | 0.11 | 0.06 | 0.5 | 0.22 | 0.11 | 3.17 (1.1) | 0.006* |
    | Tolerance for obstacles faced in the research process | 0.05 | 0.19 | 0.43 | 0.24 | 0.1 | 3.14 (1.01) | 0 | 0.17 | 0.28 | 0.39 | 0.17 | 3.56 (0.98) | 0.203 |
    | Readiness for more demanding research | 0.1 | 0.14 | 0.29 | 0.33 | 0.14 | 3.29 (1.19) | 0 | 0.06 | 0.33 | 0.5 | 0.11 | 3.67 (0.77) | 0.214 |
    | Understanding how knowledge is constructed | 0 | 0.29 | 0.29 | 0.29 | 0.14 | 3.29 (1.06) | 0.06 | 0.06 | 0.22 | 0.56 | 0.11 | 3.61 (0.98) | 0.268 |
    | Ability to analyze data | 0 | 0.1 | 0.29 | 0.38 | 0.24 | 3.76 (0.94) | 0 | 0.06 | 0.17 | 0.33 | 0.44 | 4.17 (0.92) | 0.045 |
    | Understanding of the research process in your field | 0.05 | 0 | 0.38 | 0.29 | 0.29 | 3.76 (1.04) | 0 | 0.06 | 0.17 | 0.56 | 0.22 | 3.94 (0.8) | 0.184 |
    | Ability to integrate theory and practice | 0 | 0.05 | 0.67 | 0.19 | 0.1 | 3.33 (0.73) | 0 | 0.22 | 0.17 | 0.56 | 0.06 | 3.44 (0.92) | 0.343 |
    | Understanding of how scientists work on real problems | 0 | 0.14 | 0.33 | 0.29 | 0.24 | 3.62 (1.02) | 0 | 0.11 | 0.28 | 0.5 | 0.11 | 3.61 (0.85) | 0.819 |
    | Understanding that scientific assertions require supporting evidence | 0 | 0.14 | 0.43 | 0.1 | 0.33 | 3.62 (1.12) | 0 | 0 | 0.11 | 0.61 | 0.28 | 4.17 (0.62) | 0.125 |
    | Understanding science | 0 | 0.14 | 0.52 | 0.24 | 0.1 | 3.29 (0.85) | 0 | 0.06 | 0.56 | 0.33 | 0.06 | 3.39 (0.7) | 0.888 |
    | Learning ethical conduct in your field | 0.24 | 0.33 | 0.24 | 0.1 | 0.1 | 2.48 (1.25) | 0.17 | 0.33 | 0.17 | 0.28 | 0.06 | 2.72 (1.23) | 0.415 |
    | Learning lab and field techniques | 0 | 0.1 | 0.33 | 0.38 | 0.19 | 3.67 (0.91) | 0 | 0.17 | 0.17 | 0.44 | 0.22 | 3.72 (1.02) | 0.361 |
    | Ability to read primary literature | 0 | 0.24 | 0.19 | 0.29 | 0.29 | 3.62 (1.16) | 0 | 0.22 | 0.33 | 0.33 | 0.11 | 3.33 (0.97) | 0.67 |
    | Skill in science writing | 0 | 0 | 0.43 | 0.38 | 0.19 | 3.76 (0.77) | 0.11 | 0.06 | 0.28 | 0.39 | 0.17 | 3.44 (1.2) | 0.89 |
    | Self-confidence | 0.1 | 0.43 | 0.33 | 0.1 | 0.05 | 2.57 (0.98) | 0.11 | 0.11 | 0.44 | 0.33 | 0 | 3 (0.97) | 0.143 |
    | Understanding of how scientists think | 0 | 0.33 | 0.48 | 0.14 | 0.05 | 2.9 (0.83) | 0 | 0.22 | 0.39 | 0.28 | 0.11 | 3.28 (0.96) | 0.248 |
    | Learning to work independently | 0 | 0.24 | 0.24 | 0.38 | 0.14 | 3.43 (1.03) | 0 | 0.11 | 0.39 | 0.39 | 0.11 | 3.5 (0.86) | 0.656 |
    | Becoming part of a learning community | 0.05 | 0.24 | 0.38 | 0.29 | 0.05 | 3.05 (0.97) | 0 | 0.22 | 0.22 | 0.39 | 0.17 | 3.5 (1.04) | 0.283 |
    | Confidence in my potential to be a teacher of science | 0.14 | 0.43 | 0.38 | 0 | 0.05 | 2.38 (0.92) | 0.17 | 0.22 | 0.39 | 0.17 | 0.06 | 2.72 (1.13) | 0.291 |
    | Effectiveness in oral presentation | 0.1 | 0.52 | 0.29 | 0.05 | 0.05 | 2.43 (0.93) | 0.06 | 0.22 | 0.39 | 0.28 | 0.06 | 3.06 (1) | 0.044 |

    ^a Students reported gains on the following scale: 1 = little to no gain, 2 = small gain, 3 = moderate gain, 4 = large gain, 5 = very large gain. Values shown are proportions, and significant differences between courses (from Mann-Whitney tests) are indicated based on Bonferroni-corrected p values (* p < 0.0025). FB, Field Biology; PCE, Population & Community Ecology.
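As a sanity check, each reported mean and SD in Table 3 can be recovered from the Likert proportions in its row. The sketch below does this for "Clarification of a career path" in the mentored (PCE) group, using the rounded proportions as published:

```python
# Likert proportions (scores 1-5) for "Clarification of a career path",
# course with mentoring (PCE), as reported in Table 3.
props = [0.33, 0.48, 0.00, 0.14, 0.05]
n = 21  # students surveyed in the mentored course

# Mean: sum over scores k of k * proportion(k).
mean = sum(k * p for k, p in zip(range(1, 6), props))

# Sample SD: proportion-weighted variance with the n/(n-1) correction.
var = sum(p * (k - mean) ** 2 for k, p in zip(range(1, 6), props)) * n / (n - 1)
sd = var ** 0.5

# Reproduces the reported 2.1 (1.18), up to rounding of the proportions.
```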

    The six graduate student mentors described their goals for participating in the SWIM program as increasing mentoring experience and skills, further analyzing their data sets, and learning “more about the pedagogy of active teaching” (Data Set 2 in the Supplemental Material). After the program, when asked to describe the most beneficial aspect of participating, mentors focused on developing their mentoring skills along with benefits that were not articulated in the preclass surveys. Two mentors brought up the value of “getting to work with a diverse group of students” and “seeing students … analyze the data differently than expected,” with four of six mentors strongly agreeing (and one agreeing, one neutral) that their “teaching and mentoring styles ha[d] become more diversified.” In addition, mentors described benefits of developing their own skills in scientific writing. Time management was a challenge for some mentors (two of six found that responsibilities took longer than the time allotted). Challenges identified by graduate student mentors included motivating student mentees to be creative in their topics, motivating students to take initiative and request feedback in a timely manner, guiding a set of students with diverse baseline knowledge, and completing their mentorship with limited in-person meeting time. All mentors found participating in the program to be a valuable experience (three agreed, three strongly agreed) and would consider participating again in the future (one maybe, three likely, two very likely), with none indicating that the compensation was inadequate. Mentor suggestions for improving the program included more frequent, shorter meetings with their groups (including some individual meeting time), as well as training and resources to support students for whom English is not their primary language.

    DISCUSSION

    Our results indicated that hierarchical mentoring increased perceived and demonstrated ability in science writing, particularly the data analysis and interpretation components. By improving in this area, students who took the Population & Community Ecology course the year that the SWIM program was implemented earned a median paper score that was a full letter grade higher than those who did not participate in the program (Figure 1), leading to greater success in the course as a whole. Even more importantly, these gains indicate improvement in skills at the foundation of critical thinking, which is itself a core aspect of career readiness that is transferable between disciplines and sectors.

    Increases in communication effectiveness (based on rubric scores averaged across two to three objective graders) were due to improvement in data analysis and interpretation rather than to changes in composition ability or level of demonstrated understanding of ecology fundamentals (Figure 1). These findings likely reflect key similarities and differences in how the course was taught in the year with the SWIM program (2018) and the year without (2017). In both years, students received the same level of feedback from the instructor on writing composition, which included equivalent written instructions (formatting requirements, grading rubric, and writing resources) and individual comments on the rough draft (minimal to moderate text editing). Mentors were encouraged to focus their comments on rubric areas that needed the most work, prioritizing content over composition. In addition, the disciplinary course content (on ecology fundamentals) was largely the same between years. Finally, in both years, the course was built around a CURE that included participating in data collection (field surveys) and writing a research paper in multiple steps (proposal, rough draft, and final draft). Thus, benefits of this CURE approach (e.g., as shown by Stephenson and Sadler-McKnight, 2016) were likely seen in both years.

    Improvement in data analysis and interpretation was likely driven by the main difference between years: the addition of the SWIM program in 2018. All students in the SWIM program received feedback on data analysis and interpretation from their mentors before submitting and receiving feedback on the rough draft (from their mentor and then the instructor; Table 1). In the previous year, only students who sought help from the instructor or course teaching assistant (e.g., during office hours) received individual feedback before submitting the rough draft. After students received written comments on the rough draft, mentors acted as a dedicated point of contact for SWIM students during class time and via email. Our study cannot discern which aspect(s) of the SWIM program were responsible for the improvements that we observed, including whether they were related to the quantity, quality, or format of feedback and project mentoring. Removing the student-led paper discussions may also have affected course outcomes, particularly as these had been the only oral presentation requirement of the course (besides the small-group poster presentation). It is unlikely that differences in student populations drove differences in overall writing effectiveness, because we found that overall effectiveness was not related to any of the demographic characteristics tested, including gender, major, minority or first-generation status, or first language. Similarly, post hoc analyses showed that these results were not due to differences in scores between graders or rubric components. The SWIM program appears to have enhanced students’ abilities in data analysis and interpretation even beyond the gains likely already present in the earlier course, given that both courses incorporated a CURE, which has been shown to increase these same skills (Brownell et al., 2015).

    The increase in demonstrated effectiveness in data analysis and interpretation was also reflected in survey results, which indicated that students perceived an increase in ability and experience in science skills. Of these science skills, the only cases in which SWIM students perceived a greater increase in ability/experience during the term than the students in the comparison course without mentoring were also related to data analysis and interpretation. We cannot discount other differences between the courses (course instructors, curricula aside from the SWIM program, etc.) as the driver of this difference in perceived effectiveness. However, the fact that both demonstrated and perceived effectiveness in data analysis and interpretation were higher in the group with mentoring than the group without mentoring suggests that having a dedicated mentor allowed students to develop their confidence in “conducting a lab or field study entirely of student design” and “running statistical tests” (Data Set 1 in the Supplemental Material). In a post hoc analysis, students who reported that English was not their primary language on the course surveys also reported lower perceived ability/experience with science skills.

    Students across both courses reported perceived gains in ability/experience for the two other categories of science skills: experimental design and science communication. Both courses included a CURE culminating in a research paper, as required for the university’s writing-intensive lab courses. However, Field Biology students reported greater gains in “clarification of a career path,” which might be expected for a course with a higher proportion of disciplinary majors. Our findings of gains in science skills across both courses highlight the benefits of upper-level ecology lab courses for building confidence in core competencies (including critical thinking and communication) that can be further developed with the addition of more individualized mentoring through a curriculum such as the SWIM program.

    It is particularly notable that we found clear gains in data analysis and interpretation given the limitations associated with self-selecting populations, which led to skewed populations and low sample sizes for some student groups. For example, none of the SWIM students were ecology and evolution majors, compared with 24% of students in the 2017 Population & Community Ecology course and 43% of Field Biology students (our comparison groups without mentoring). Although we might have expected differences based on major, a post hoc analysis of Field Biology students indicated no difference in preclass survey responses between students who were and were not ecology and evolution majors. Similarly, we did not find differences in preclass survey responses associated with gender (another characteristic for which the courses had skewed populations). However, due to low sample sizes, we were limited in our ability to assess the degree to which the effectiveness of the SWIM program depended on combinations of student characteristics.

    We encountered several challenges that would need to be considered in order to successfully adopt a similar mentoring program curriculum. First, the program relied on the ability to recruit and compensate graduate student mentors. We successfully recruited more well-qualified mentors than originally planned, and all of these mentors indicated that they would at least consider participating again. Ideally, a cohort of mentoring alumni would aid recruitment of new mentors. We did find from a post hoc analysis that student paper scores differed between mentors, which we suspect was largely due to variation in the complexity of the data sets provided by the mentors. Thus, data set complexity should be considered in the mentor selection and/or preparation process. Mentor compensation could be a challenge: at the rate used here, the cost averaged $100 per undergraduate student, and this single term required funding from two small teaching development grants. This brings up the second main challenge, the additional effort required by the instructor, both to raise funds (by writing grants or soliciting university-level support) and to administer the program, including coordinating communication with the mentors. A potential alternative to monetary compensation would be to include this mentoring responsibility in a mentoring certificate program, with training and certification serving as compensation. Third, the length of the course/term was limiting, and we suggest 10 weeks at a minimum to accomplish the steps in the scientific process given in Table 1. Courses with a more limited time allotment may be able to effect gains in statistical reasoning by incorporating a workshop like the one described by Olimpo et al. (2018). Finally, class size interacts with the previously discussed challenges, including recruiting and compensating mentors. 
However, in theory, this program could be scaled to the size of a large lecture as long as the mentor-to-undergraduate ratio could be maintained and the instructor received additional support in grading.

    Our study results highlight several areas for improvement in the current mentoring program curriculum, particularly to better support students for whom English is not their primary language. We found that whether or not English was a student’s primary language affected overall communication effectiveness only through differences in composition ability. The program already used some approaches that have been shown to improve the outcomes of English language writing instruction for those who are not native speakers, including feedback at multiple steps in the writing process (Evans and Hartshorn, 2010) and “novel and authentic” writing assignments (Zhang and Cheung, 2018). These approaches included increasing motivation by allowing students autonomy in choosing their study system and research question (students were assigned to mentor groups based on their ranked preferences), integrating feedback from multiple sources (i.e., peers, mentors), and providing opportunity for revision. The composition score was also related more to a paper’s organizational structure and clarity of ideas than to grammar per se (Appendix 2 in the Supplemental Material). Additional strategies that could be employed to address challenges of students whose primary language is not English (see Zhang and Cheung, 2018) are 1) increasing the number of scientific papers students are assigned to read, and the time spent discussing them (Anderson, 2016); 2) providing students with a document listing common composition mistakes (e.g., as compiled in the tool kit by Dirrigl and Noe, 2014); 3) adding low-stakes writing assignments as practice for structuring a clear and sound argument (Brownell et al., 2013; Anderson, 2016; Spix and Brasier, 2018); and 4) partnering with other offices on campus (particularly writing centers; Pfrenger et al., 2017) to provide resources for students, mentors, and instructors alike. 
Ultimately, changes to the program would need to be weighed by their costs and benefits, with those that could be enacted outside class time given preference due to time limitations. Importantly, any improvements in teaching writing composition are likely to build competency in communication across all students in the course.

    In summary, we found consistent benefits of a hierarchical mentoring program for building data analysis and interpretation skills in upper-level undergraduate students. Students participating in this program both perceived an increase in their ability to analyze and interpret data and demonstrated enhanced ability in this area, indicating an increase in two factors—self-efficacy and competency—that are known to individually promote persistence of students in STEM fields (Simon et al., 2015). At the same time, mentors perceived benefits in aspects of career readiness beyond their initial motives for participation (see Dolan and Johnson, 2009), including communication, leadership, teamwork, and awareness and fluency in working with a diverse student population. Furthermore, the effects of hierarchical mentoring can “cascade” through learning communities (Feldon et al., 2019) and might continue to increase skill development even beyond the metrics and time period of our study. Together, these findings provide support for continuing to incorporate a hierarchical mentoring program in the undergraduate science curriculum. Future iterations of this mentoring approach should consider challenges encountered by students for whom English is not their primary language and integrate methods to evaluate continued and increased effectiveness in building skills needed for career success.

    ACKNOWLEDGMENTS

    We especially thank the undergraduate students and graduate student mentors (S. Bedgood, L. Elsberry, K. Gallego, S. Mahanes, L. Pandori, and P. Wallingford) who participated in the SWIM program. Course teaching assistants P. Wallingford and N. Carvajal facilitated this study. K. Denaro (UCI Teaching and Learning Research Center) provided helpful feedback on study design and analytical approaches. We are thankful for comments by the editor and two anonymous reviewers that greatly improved this paper. The SWIM program was supported by a UCI Campus Writing Center Writing Pedagogy Grant to C.J.B.S. and a Development Fellowship to C.J.B.S. from CUREnet (funded by National Science Foundation award no. 1730273).

    REFERENCES

  • American Association for the Advancement of Science. (2011). Vision and change: A call to action. Washington, DC. Retrieved April 23, 2019, from https://live-visionandchange.pantheonsite.io/wp-content/uploads/2013/11/aaas-VISchange-web1113.pdf
  • Anderson, K. L. (2016). Active learning in the undergraduate classroom: A journal-club experience designed to accentuate course content. American Biology Teacher, 78, 67–69.
  • Bean, J. C. (2011). Engaging ideas: The professor’s guide to integrating writing, critical thinking, and active learning in the classroom. San Francisco, CA: Jossey-Bass.
  • Brownell, S. E., Hekmat-Scafe, D. S., Singla, V., Seawell, P. C., Conklin Imam, J. F., Eddy, S. L., … & Cyert, M. S. (2015). A high-enrollment course-based undergraduate research experience improves student conceptions of scientific thinking and ability to interpret data. CBE—Life Sciences Education, 14(2), ar21.
  • Brownell, S. E., Price, J. V., & Steinman, L. (2013). A writing-intensive course improves biology undergraduates’ perception and confidence of their abilities to read scientific literature and communicate science. Advances in Physiology Education, 37, 70–79.
  • Clemmons, A., Timbrook, J., Herron, J., & Crowe, A. (2019). BioSkills Guide: Core competencies for undergraduate biology, Version 4.0. QUBES Educational Resources. doi: 10.25334/3MNW-KJ05
  • Crisp, G., & Cruz, I. (2009). Mentoring college students: A critical review of the literature between 1990 and 2007. Research in Higher Education, 50(6), 525–545.
  • Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences USA, 116(39), 19251–19257.
  • Dirrigl, F. J., & Noe, M. (2014). The student writing toolkit: Enhancing undergraduate teaching of scientific writing in the biological sciences. Journal of Biological Education, 48, 163–171.
  • Dolan, E. L. (2017). Course-based undergraduate research experiences: Current knowledge and future directions (National Research Council commissioned paper). Washington, DC: National Research Council.
  • Dolan, E. L., & Johnson, D. (2009). Toward a holistic view of undergraduate research experiences: An exploratory study of impact on graduate/postdoctoral mentors. Journal of Science Education and Technology, 18, 487.
  • Dowd, J. E., Thompson, R. J. Jr., Schiff, L., Haas, K., Hohmann, C., Roy, C., … & Reynolds, J. A. (2019). Student learning dispositions: Multidimensional profiles highlight important differences among undergraduate STEM honors thesis writers. CBE—Life Sciences Education, 18(2), ar28.
  • Evans, N. W., & Hartshorn, K. J. (2010). Contextualizing corrective feedback in second language writing pedagogy. Language Teaching Research, 14, 445–463.
  • Falchikov, N., & Boud, D. (1989). Student self-assessment in higher education: A meta-analysis. Review of Educational Research, 59(4), 395–430.
  • Feldon, D. R., Litson, K., Jeon, S., Blaney, J. M., Kang, J., Miller, C., … & Roksa, J. (2019). Postdocs’ lab engagement predicts trajectories of PhD students’ skill development. Proceedings of the National Academy of Sciences USA, 116, 20910–20916.
  • Gershenfeld, S. (2014). A review of undergraduate mentoring programs. Review of Educational Research, 84(3), 365–391.
  • Jin, L., Doser, D., Lougheed, V., Walsh, E. J., Hamdan, L., Zarei, M., & Corral, G. (2019). Experiential learning and close mentoring improve recruitment and retention in the undergraduate environmental science program at an Hispanic-serving institution. Journal of Geosciences Education, 67(4), 384–399.
  • Limeri, L. B., Zaka Asif, M., & Dolan, E. L. (2019). Volunteered or voluntold? Motivations and perceived outcomes of graduate and postdoctoral mentors of undergraduate researchers. CBE—Life Sciences Education, 18, 1–18.
  • McGee, C. D. (2001). Calming fears and building confidence: A mentoring process that works. Mentoring & Tutoring: Partnership in Learning, 9, 201–209.
  • Merkle, B. G. (2019). Writing science: Transforming students’ science writing by tapping into writing instruction scholarship and best practices. Bulletin of the Ecological Society of America, 100, e01487.
  • Olimpo, J. T., Pevey, R. S., & McCabe, T. M. (2018). Incorporating an interactive statistics workshop into an introductory biology course-based undergraduate research experience (CURE) enhances students’ statistical reasoning and quantitative literacy skills. Journal of Microbiology & Biology Education, 19(1), 19.1.49.
  • Pfrenger, W., Blasiman, R. N., & Winter, J. (2017). “At first it was annoying”: Results from requiring writers in developmental courses to visit the writing center. Praxis: A Writing Center Journal, 15, 22–35.
  • R Core Team. (2019). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.
  • Reynolds, J. A., Thaiss, C., Katkin, W., & Thompson, R. J. Jr. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, 11, 17–25.
  • Rivard, L. P. (1994). A review of writing to learn in science—implications for practice and research. Journal of Research in Science Teaching, 31, 969–983.
  • Salamone, M. (2020). Using written and verbal feedback to improve students’ writing outcomes (Doctoral dissertation). University of Massachusetts Lowell.
  • Sampson, V., & Walker, J. P. (2012). Argument-driven inquiry as a way to help undergraduate students write to learn by learning to write in chemistry. International Journal of Science Education, 34(10), 1443–1485.
  • Simon, R. A., Aulls, M. W., Dedic, H., Hubbard, K., & Hall, N. C. (2015). Exploring student persistence in STEM programs: A motivational model. Canadian Journal of Education, 38(1), n1.
  • Spix, T. A., & Brasier, D. J. (2018). Using blogs as practice writing about original neuroscience papers enhances students’ confidence in their critical analysis of research. Journal of Undergraduate Neurosciences Education, 16, A120–A125.
  • Stephenson, N. S., & Sadler-McKnight, N. P. (2016). Developing critical thinking skills using the Science Writing Heuristic in the chemistry laboratory. Chemistry Education Research and Practice, 17, 72–79.
  • Van Roekel, D. (2008). English language learners face unique challenges (An NEA policy brief). Washington, DC: National Education Association.
  • Wilson, Z. S., Holmes, L., Sylvain, M. R., Batiste, L., Johnson, M., McGuire, S. Y., … & Warner, I. M. (2011). Hierarchical mentoring: A transformative strategy for improving diversity and retention in undergraduate STEM disciplines. Journal of Science Education and Technology, 21, 148–156.
  • Yore, L. D., Hand, B. M., & Florence, M. K. (2004). Scientists’ views of science, models of writing, and science writing practices. Journal of Research in Science Teaching, 41, 338–369.
  • Zhang, W., & Cheung, Y. L. (2018). Researching innovations in English language writing instruction: A state-of-the-art review. Journal of Language Teaching and Research, 9, 80–89.