

Modifying Summer Undergraduate Research Programs during COVID-19 Increased Graduate School Intentions but Exacerbated Anxieties

    Published Online: https://doi.org/10.1187/cbe.22-12-0243

    Abstract

    COVID-19 created unprecedented challenges for college students, highlighting the need to provide educational contexts that foster well-being. Summer undergraduate research experiences (SUREs) constitute a high-impact practice, yet little systematic knowledge exists about how the first surge of COVID-19 influenced undergraduate researchers’ well-being. This knowledge is important for preparing for future disruptions. This study applies the student well-being model (SWBM) to examine how SURE status (e.g., modification vs. cancellation) impacted students’ mental health and graduate school intentions using primary survey data collected from U.S. undergraduate researchers in science, technology, engineering, and mathematics (STEM) fields in Summer 2020 (n = 408, from 131 institutions). Just under half had their SURE canceled, and the others engaged in modified SUREs. Students whose SUREs were canceled had reduced anxiety severity (p < 0.05), but greater concerns about graduate school matriculation (p < 0.001), compared with students with modified SUREs. Results suggest that modified SUREs are a reasonable path forward under conditions where in-person contact is untenable. Results point toward potential improvements in higher education practices that may enhance student well-being following disruptive events. Program directors can address potential causes of anxiety in modified SUREs, advocate for student-centered adjustments to graduate admission processes, and use experiences during COVID-19 as a springboard to broaden participation in undergraduate research.

    INTRODUCTION

    Faculty-mentored undergraduate research experiences are a high-impact practice in higher education (Kuh, 2008). Participants report improved critical thinking, increased interactions with faculty, enhanced academic achievement and retention, greater science self-efficacy, greater persistence to science, technology, engineering, and mathematics (STEM) degree completion, and improved readiness for graduate education and research careers (Hurtado et al., 2009; Adedokun et al., 2013; Gilmore et al., 2015; Robnett et al., 2015; Collins et al., 2017; Hernandez et al., 2018; Little, 2020). During the first surge of COVID-19 (Spring–Summer 2020), undergraduate research training was disrupted, with many students’ research experiences canceled or switched to remote engagement.

    To date, there is little systematic knowledge available about how this initial surge of COVID-19 influenced the well-being of undergraduate researchers (for exceptions, see Grineski et al., 2021; Morales et al., 2021; Speer et al., 2021; Yang Yowler et al., 2021; Erickson et al., 2022). We need better knowledge to document what happened in this historical moment and, more importantly, provide a basis for informed decision making during future surges or other disruptions. There will be regional and global surges of COVID-19 cases (e.g., delta variant in the United States in Fall 2021 and omicron in Winter 2022), and program directors will periodically need to decide whether to cancel or modify undergraduate research programs. Lessons learned during COVID-19 will also be applicable during future coronavirus-caused disruptions, which may become more common under climate change scenarios (Beyer et al., 2021). More generally, understanding how student well-being operated during a global crisis is imperative so that universities can implement new and adjusted measures to better support students on their academic journeys (Burns et al., 2020).

    We know that undergraduate research participation rates dipped following the initial COVID-19 surge, but have now rebounded. In the year before the COVID-19 pandemic (2019), 22% of all U.S. college seniors and 46% of biological science seniors had participated in faculty-mentored research (National Survey of Student Engagement, 2020). As of 2021, participation rates dipped very modestly (i.e., standardized difference scores, comparing participation in 2020 with 2021, averaged −0.08 across six high-impact practices, including “research with faculty”; National Survey of Student Engagement, 2021). As of 2022, participation rates had recovered (i.e., 22% of seniors had participated; National Survey of Student Engagement, 2022).

    While many STEM students engage in academic-year research, summer undergraduate research experiences (SUREs) are an important context for research training. The emergence of the COVID-19 pandemic in Spring 2020 was difficult for SURE programs. The majority of SURE program directors in the United States had already selected their cohorts and were planning for Summer 2020 when universities began to pivot from in-person engagement to online delivery modes for many functions, including undergraduate classes. While some SURE programs were canceled, others continued to operate in modified format (e.g., through remote research experiences). Biology students taking part in modified SURE programs in the United States saw the quality of mentorship, opportunities for learning and professional development, and development of a sense of community as strengths of modified programs, but felt that limited opportunities for cohort building, challenges with insufficient structure, and issues with technology were weaknesses (Erickson et al., 2022). Erickson et al. (2022) concluded that remote SUREs can be experienced favorably by students, and another study of one 2020 remote SURE program reached a similar conclusion (Yang Yowler et al., 2021). Other studies have examined undergraduate research experiences during COVID-19, without an explicit focus on SUREs. These studies have found that the pandemic altered mentor–mentee communication (Speer et al., 2021) and impacted graduate school intentions (Morales et al., 2021). During March and April 2020, economic hardship was an important correlate of barriers to research progress (e.g., lack of motivation, lack of access to tools needed to conduct the research), as were communication challenges and sexual minority status at U.S. universities (Grineski et al., 2022).

    While the two studies on SUREs during COVID-19 (Yang Yowler et al., 2021; Erickson et al., 2022) took steps toward building a knowledge base about experiences of students engaging in modified/remote SUREs due to COVID-19, little is known about students whose programs were canceled. We, alongside other SURE program directors, wondered whether program cancellation might affect student well-being, especially in terms of mental health and the pursuit of a graduate education, which are important outcomes in higher education. To investigate this, we use data collected from students enrolled at 131 U.S. universities in Spring 2020 who attended (or planned to attend) more than 100 different 2020 SUREs.

    We frame our analyses of these data with the student well-being model, or SWBM (Soutter et al., 2014), to understand SURE students’ well-being during COVID-19. This frame reflects the growing interest in college student well-being at universities and among researchers, as well as the need to recognize and interact with undergraduate students as whole humans in order to make SUREs more inclusive and equitable. Well-being is a multifaceted concept that reflects factors in someone’s life that contribute toward fulfillment (Burns et al., 2020). Maintaining well-being is critically important to ensuring that undergraduate students thrive and flourish in their lives. However, COVID-19 has “shifted the student well-being domain considerably due in part to the extensive pragmatic changes that have been introduced to curb the spread of COVID-19” (Burns et al., 2020, p. 6), including changes to SUREs, the focus of this paper.

    Theoretical Model: Student Well-Being

    Challenges experienced by college students during COVID-19 have highlighted the importance of providing educational contexts to promote student well-being. While well-being has often been considered narrowly in terms of test scores or attendance, the education community has sought to broaden its conceptualization (Soutter et al., 2014). As such, programming for student well-being has expanded to focus on mental and physical health, resilience, risk reduction, social support, relationships, and engagement (Soutter et al., 2014). Soutter et al. (2014) introduced a conceptual framework to advance understanding of student well-being. While Soutter et al.’s (2014) SWBM has rarely been applied to higher education (e.g., Shek and Leung, 2015), it has applicability to this context. The seven domains of the SWBM are organized into three categories, which are individually and collectively integral to well-being (Soutter et al., 2014). Each domain captures a specific element of well-being, and the SWBM highlights how these seven domains of well-being are interconnected (Soutter et al., 2014).

    In the SWBM, the category Assets includes the domains of Having, Being, and Relating. Assets are conditions and circumstances associated with well-being that are immediately accessible or invested for later use. Having captures the dimension of well-being that relates to resources, tools, and opportunities that the person has gained, such as through their income. Being relates to the conditions of someone’s life and their identity, which includes genetic makeup as well as social and cultural circumstances that shape sense of self, including racial/ethnic and health statuses. Relating emphasizes relationships and interpersonal connections that shape emotions, thoughts, and choice of actions, such as student–teacher relationships (Soutter et al., 2014).

    The second category, Appraisals, includes two domains: Thinking and Feeling. These are the cognitive and affective interpretations of how and why Assets are valuable to well-being. Thinking captures how student well-being is related to opportunities to be creative and mindful as well as to be actively engaged in cognitive tasks requiring time, continued questioning, collaboration, and cross-disciplinary study. The Feeling domain emphasizes the emotional component of well-being and how students recognize, express, and manage their feelings, including mental health and contentedness (Soutter et al., 2014).

    The third category, Actions, has two domains: Functioning and Striving. Actions relate to motivations to pursue well-being Assets (Soutter et al., 2014). Functioning relates to how well students engage in educational experiences that are not bounded by disciplines, generation, culture, or ideology, such that they extend beyond their immediate context. We consider grit to be an example of Functioning. Striving connects to students’ future goals and their abilities to stay motivated to achieve those goals in the face of adversity, such as gaining secure employment (Soutter et al., 2014). Table 1 illustrates specifically how variables we use in this study (which are used in other research on SUREs) fit within each of the seven domains in the SWBM.

    TABLE 1. How variables employed in this paper (and commonly used in SURE research) fit within the SWBM

    Each domain is listed with its description; the bullets give the justification for assigning this paper’s variables (which are commonly used in SURE research) to that domain.

    Assets

    Having: Relates to what students gain through their time in college
    • Because college credits earned and GPA reflect an accumulation of learning

    Being: Relates to the conditions of students’ lives and their identities
    • Because race/ethnicity, first-generation student status, sexuality, gender, and international student status are important elements of college student identity
    • Because pre-existing mental health conditions can shape sense of self

    Relating: Emphasizes relationships and interpersonal connections
    • Because the strength of one’s perceived social ties with others (e.g., family, friends, mentors) reflects the depth of one’s interpersonal connections

    Appraisals

    Thinking: Includes opportunities to be creative and actively engaged in cognitive tasks
    • Because longer-duration research opportunities, especially in multiple places, can lead students to cultivate a nuanced understanding of the factors that comprise a research environment and provide more opportunities to engage in research tasks

    Feeling: Includes the emotional component of well-being
    • Because mental health severity metrics capture how students feel during their SURE experiences

    Actions

    Functioning: Includes how students engage in educational experiences
    • Because challenges experienced due to COVID-19 shape how college students engaged in college life more generally
    • Because program status (canceled vs. modified) influences student engagement

    Striving: Captures students’ future goals and their abilities to stay motivated to achieve those goals
    • Because graduate school is often an aspirational goal of undergraduate STEM students engaging in SUREs

    Research Questions

    In this study, we focus on how program status affected student outcomes through two research questions. The first research question is: How were students affected by SURE program changes, including cancellations and modifications? This question relies on univariate analyses and characterizes well-being within the Functioning domain of the SWBM. The second question is: How did SURE status (i.e., program cancellation vs. modification) impact students’ mental health and plans for attending graduate school, while controlling for other variables in the SWBM? We address this question via multivariable analyses and hypothesize that students whose programs were canceled will report worse mental health (in the SWBM Feeling domain) and attenuated graduate school intentions (in the SWBM Striving domain) relative to those whose programs were modified.

    The dependent variables for the second research question are in the domains of Feeling and Striving. We focus on those two domains for the dependent variables, as we believe they are particularly relevant to the well-being of undergraduate students in the context of the COVID-19 pandemic. The first dependent variable relates to mental health, a core element of student well-being within the Feeling domain. High rates of mental health problems have been reported by college students during the pandemic (Cao et al., 2020; Son et al., 2020; Giuntella et al., 2021) and before it (Hunt and Eisenberg, 2010; Xiao et al., 2017; Duffy et al., 2019). Few studies have examined the mental health of undergraduate researchers (Cooper et al., 2020a, b; Grineski et al., 2021). One paper studied undergraduate students at U.S. universities conducting research during Spring 2020 (during the first months of the pandemic) and found that 63% reported at least mild anxiety and 73% reported at least mild depression as of July 2020 (Grineski et al., 2021). Undergraduate researchers who reported more (vs. fewer) COVID-19–related adverse event experiences suffered from more severe anxiety and depression. Women and LGBQ+ students had more severe symptoms than men and non-LGBQ+ students (Grineski et al., 2021). While these three studies are important first steps, additional research on undergraduate researchers’ mental health is needed, especially in the context of COVID-19, which has dramatically impacted college student well-being. We begin to address that gap by investigating COVID-19–associated SURE cancellations and mental health.

    Aspirations to attend graduate school are the second dependent variable, and they fit within the Striving domain of well-being. The transition from undergraduate to graduate education is a critical stage for students’ pursuit of STEM and research careers. Many undergraduates decide to leave science at this moment (Myers and Pavel, 2011). Undergraduate research is associated with graduate school enrollment and later success (Carter et al., 2009; Gilmore et al., 2015; Hernandez et al., 2018). Undergraduate research helps students gain mastery of research skills and develop close relationships with faculty members, which can lead them to graduate school; students interested in graduate school often seek out SUREs (Gentile et al., 2017). Researchers found that the pandemic (as of July 2020) had strengthened undergraduate researchers’ graduate school intentions when they were highly impacted by COVID-19 (Morales et al., 2021). This suggests that it is also important to examine how summer program status influenced undergraduate researchers’ graduate school intentions.

    METHODS

    Participants and Data Collection

    We conducted a survey of undergraduate researchers attending U.S. universities, approved by the University of Utah’s Institutional Review Board (no. 00133477). We worked with directors of academic-year and SURE programs at 18 different universities to develop a sampling frame of 2440 undergraduate researchers. Some of those directors oversaw one program, and others oversaw many programs, depending on their roles (e.g., principal investigator on one training grant vs. director of an undergraduate research office) and the institution. We administered the survey online using QuestionPro between July 6 and July 31, 2020. Program directors emailed students three times using scripts provided by the authors. Students received the initial invitation to participate and nonrespondents received two additional reminders one week apart. Student participants received a $20 Amazon gift card as an incentive. The survey covered domains including personal and educational circumstances in Spring 2020; research experiences during COVID-19; prior research experiences and mentoring; general COVID-19 experiences; mental health and social support; and sociodemographic characteristics. The survey took 30 minutes on average to complete. In total, 2246 students were initially emailed about the survey; six of those bounced back as undeliverable, and three were screened due to being current graduate students. This left n = 2237. Of those, 188 students clicked on the survey but did not take it, while 1220 responded to the survey. Our response rate was 54.5%, and our cooperation rate was 86.6%.

    In this paper, we focus on the subset of respondents who were accepted to Summer 2020 SUREs “to work with a faculty mentor, for pay, and through a formal program” and whose programs were canceled or modified (n = 457). We excluded 49 students who were missing data for more than 25% of the analysis variables included in the multivariable models, leaving an analysis n of 408 students. These students were accepted into more than 100 different SUREs.

    In terms of participant characteristics, the three largest majors represented are life sciences (50%), engineering (23%), and math (12%). Just under half (49%) are underrepresented racial/ethnic minority students (i.e., from Hispanic, non-Hispanic Black, Native American, Native Hawaiian, Pacific Islander, multiracial, or other [non-Asian or non-White] racial backgrounds), while one-third (32%) are first-generation college students. In terms of gender, 36% are men, 63% are women, and 1% are another gender; 16% are LGBQ+. Less than one-tenth are international students, while just more than three-quarters are juniors and seniors.

    Measures

    Variables for the first research question come from a series of items we asked about each student’s summer program. We asked two blocks of questions: one for students whose programs ran under modified conditions, which addressed how their programs were modified and their satisfaction with their modified programs; and another for students whose programs were canceled, which asked how their programs’ cancellations affected them. Descriptive statistics for the items are included in the Results.

    Descriptive statistics for the variables used to answer the second research question are presented in Table 2. These variables are constructed from survey questions that were asked of all students, regardless of their program status. Appendix A in the Supplemental Material provides a complete list of survey items used to create those variables. Variables in each domain are described in the following sections.

    TABLE 2. Descriptive statistics for analysis variables in each SWBM domain (n = 408)a

    Variable | N | Min. | Max. | Mean | SD | % Missing | Yes (%) | No (%)
    Independent variable
    Functioning domain
    SURE canceled (vs. modified) | 407 | 0 | 1 | — | — | 0.02 | 193 (47.4) | 214 (52.6)
    Control variables
    Having domain
    GPA | 398 | 2.7 | 4 | 3.76 | 0.30 | 2.5 | N/A | N/A
    Junior | 407 | 0 | 1 | — | — | 0.02 | 166 (40.8) | 241 (59.2)
    Senior^b | 407 | 0 | 1 | — | — | 0.02 | 145 (35.6) | 262 (64.4)
    Being domain
    Underrepresented racial/ethnic minority | 407 | 0 | 1 | — | — | 0.02 | 200 (49.1) | 207 (50.9)
    First-generation student | 396 | 0 | 1 | — | — | 3.9 | 197 (49.7) | 269 (50.3)
    LGBQ+ | 401 | 0 | 1 | — | — | 1.7 | 66 (16.5) | 335 (83.5)
    Man^c | 408 | 0 | 1 | — | — | 0 | 145 (35.6) | 263 (64.4)
    Other gender^c | 408 | 0 | 1 | — | — | 0 | 4 (0.01) | 404 (99.09)
    International student | 408 | 0 | 1 | — | — | 0 | 32 (7.8) | 396 (92.2)
    Pre-existing psychological condition^d | 367 | 0 | 1 | — | — | 10.0 | 76 (20.7) | 291 (79.3)
    Relating domain
    Social support | 397 | 1.33 | 7 | 5.47 | 1.06 | 2.7 | N/A | N/A
    Thinking domain
    Did Spring 2020 research | 408 | 0 | 1 | — | — | 0 | 310 (76.0) | 98 (24.0)
    Semesters of research experience | 406 | 0 | 8 | 2.97 | 2.13 | 0.05 | N/A | N/A
    SURE at home university | 407 | 0 | 1 | — | — | 0.02 | 106 (26.0) | 301 (74.0)
    Functioning domain
    COVID-19 adverse event experiences score | 406 | 0 | 9 | 2.51 | 1.84 | — | N/A | N/A
    Dependent variables
    Feeling domain
    Anxiety severity | 403 | 6 | 27 | 13.42 | 4.63 | 1.2 | N/A | N/A
    Depression severity | 404 | 8 | 35 | 17.25 | 5.31 | 1.0 | N/A | N/A
    Striving domain
    Applying to graduate school | 406 | 1 | 5 | 2.93 | 1.23 | 0.05 | N/A | N/A
    Getting in to graduate school | 407 | 1 | 5 | 3.1 | 1.12 | 0.02 | N/A | N/A

    a While this table shows all variables by SWBM domain, the order in which the domains are presented is not identical to Table 1, as we have given primacy to independent vs. dependent variables here.

    b Reference: freshman/sophomore/unclassified.

    c Reference: woman.

    d The question about pre-existing psychological disorders listed anxiety, depression, and PTSD as examples of disorders. See Appendix A in the Supplemental Material for complete item.

    Independent Variable.

    The focal independent variable is SURE program status, which falls in the Functioning domain. We asked students, “Which statement best characterizes the current status of your summer research opportunity?” Response options included “Research opportunity was canceled” and “Research opportunity has changed, but is still running.” We coded this variable as 1 = program was canceled and 0 = research opportunity has changed (i.e., program was modified).

    Control Variables

    Having Domain: Academic Standing.

    We used two variables: self-reported cumulative grade point average (GPA) and self-reported classification (i.e., junior, senior, and freshman/sophomore/unclassified [reference category]).

    Being Domain: Social Demographic Characteristics and Pre-existing Mental Health Conditions.

    We used five sociodemographic variables in this domain. We coded race/ethnicity into two categories (i.e., 1 = Hispanic and non-Hispanic Black, Native American, Native Hawaiian, Pacific Islander, multiracial, and other race; 0 = non-Hispanic White or Asian). We examined first-generation status based on highest level of schooling for each parent such that first-generation students (1 = yes; 0 = no) are those for whom neither parent earned a bachelor’s degree. We also included LGBQ+ status (1 = gay, bisexual, lesbian, pansexual, asexual, or other sexuality; 0 = not LGBQ+), gender (i.e., man, woman [reference category]; other [trans, genderqueer, gender nonconforming, and other]), and international student status (1 = yes; 0 = no). For the models predicting mental health, we also used responses about pre-existing conditions to create a variable assessing lifetime presence of a psychological disorder (1 = psychological disorder [e.g., anxiety, depression, PTSD]; 0 = no disorder).

    Relating Domain: Social Support.

    When predicting mental health outcomes, we included social support using the Multidimensional Scale of Perceived Social Support (MSPSS), initially designed for use with university students (Zimet et al., 1988; Dahlem et al., 1991). The MSPSS has demonstrated strong factorial validity and good internal and test–retest reliability (Zimet et al., 1988). Following others (Zimet et al., 1988; Dahlem et al., 1991), we calculated MSPSS scores by averaging across responses to 12 items, scored on a seven-point Likert scale ranging from very strongly disagree (1) to very strongly agree (7); higher scores correspond to more social support. The scale had high internal consistency (Cronbach’s α = 0.898).
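    As an illustration of this scoring procedure, the sketch below computes an MSPSS score and Cronbach’s α in Python. It is a minimal sketch, not the study’s SPSS code: the item column names (mspss_1 through mspss_12) and the toy responses are assumptions made only for this example.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents)."""
    items = items.dropna()                          # complete cases for the reliability check
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical item columns: mspss_1 ... mspss_12, each scored 1-7.
mspss_items = [f"mspss_{i}" for i in range(1, 13)]

# Toy responses standing in for the survey export.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 8, size=(6, 12)), columns=mspss_items)

# MSPSS score = mean of the 12 items; higher scores = more perceived social support.
df["social_support"] = df[mspss_items].mean(axis=1)

print(df["social_support"].round(2).tolist())
print("Cronbach's alpha:", round(cronbach_alpha(df[mspss_items]), 3))
```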

    Thinking Domain: Previous Research Experience.

    We used three variables. First, we calculated the total number of semesters each student had conducted research, counting each summer as one semester, before Spring 2020. We used an indicator for whether they conducted research during Spring 2020 (1 = yes; 0 = no). Given the COVID-19 disruptions to Spring 2020 research, we considered it as a separate variable. We also considered whether the 2020 SURE program was at the students’ own home institutions (i.e., the university a student attended during the academic year; = 1) or another institution (= 0).

    Functioning Domain: Adverse Event Experiences.

    We created an adverse event experiences score by summing responses (1 = yes; 0 = no) to items representing a potential impact of COVID-19, following Grineski et al. (2021). We asked students, “Have you ever had the following experiences during COVID-19?”; the options included items such as “difficulty traveling,” “experienced a salary cut,” and “took care of others who were sick.” See Grineski et al. (2021) for a complete list of items. At the time, there were no pre-existing survey items to assess adverse event experiences specifically caused by COVID-19 among college students, so we created these items based on a review of the gray literature and our experiences as faculty who worked closely with students during the pandemic. We revised the list of experiences iteratively with feedback from undergraduate and graduate students.

    Dependent Variables

    Feeling Domain: Mental Health.

    We examined both anxiety and depression severity using pre-existing scales as dependent variables. For anxiety, we used the seven-item Generalized Anxiety Disorder Scale (GAD-7; Spitzer et al., 2006), which is one of the most widely used instruments for anxiety disorder screening (Cao et al., 2020) due to its criterion, construct, factorial, and procedural validity (Spitzer et al., 2006). We calculated the GAD-7 score by summing the seven items, each scored from 0 (not at all) to 3 (nearly every day). For depression, we used the nine-item Patient Health Questionnaire (PHQ-9; Spitzer et al., 1999), a self-administered depression scale. Each of the nine items is scored from 0 (not at all) to 3 (nearly every day), and scores are then summed (Kroenke et al., 2001). Both scales had high internal consistency in this sample (Cronbach’s α = 0.913 for GAD-7 and 0.891 for PHQ-9) and have been validated in other college student samples (Keum et al., 2019; Sriken et al., 2021).
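    A minimal sketch of this summing procedure is shown below; the item column names (gad_1 ... gad_7 and phq_1 ... phq_9) and the toy responses are assumptions for illustration, not the study’s data.

```python
import numpy as np
import pandas as pd

gad_items = [f"gad_{i}" for i in range(1, 8)]   # GAD-7: 7 items, each scored 0-3
phq_items = [f"phq_{i}" for i in range(1, 10)]  # PHQ-9: 9 items, each scored 0-3

# Toy responses standing in for the survey export.
rng = np.random.default_rng(1)
df = pd.DataFrame(rng.integers(0, 4, size=(6, 16)), columns=gad_items + phq_items)

# Severity scores are sums across items; higher sums = more severe symptoms.
df["anxiety_severity"] = df[gad_items].sum(axis=1)
df["depression_severity"] = df[phq_items].sum(axis=1)

print(df[["anxiety_severity", "depression_severity"]])
```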

    Striving Domain: Graduate School Intentions.

    We treated graduate school intentions as dependent variables. We asked each student how they thought their summer program’s status (i.e., the cancellation of their summer research opportunity or the changes to their summer research opportunity, when applicable) would affect their future plans in terms of the likelihood of applying to graduate school. We asked the same question about “getting in to graduate school.” Both measures were assessed on a five-point scale, which we reverse coded so that responses run from 1 (greatly increased the likelihood) to 5 (greatly reduced the likelihood).
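    Because the direction of this coding matters for reading Table 4 (higher values indicate a reduced likelihood), a small sketch of the reverse coding follows; the orientation of the raw responses is an assumption made only for this illustration.

```python
import pandas as pd

# Hypothetical raw responses on a 1-5 scale, assumed here to run from
# 1 = greatly reduced the likelihood to 5 = greatly increased the likelihood.
raw = pd.Series([1, 3, 5, 2], name="applying_raw")

# Reverse code on a 1-5 scale: new = 6 - old, so that higher values
# now indicate a *reduced* likelihood, matching the Table 4 scaling.
applying = 6 - raw

print(applying.tolist())  # [5, 3, 1, 4]
```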

    Data Analysis

    To answer the first research question, we used univariate statistics, including frequencies and means. To answer the second research question, we used multivariable generalized estimating equations (GEEs). We used IBM SPSS Statistics 25 (for Windows 10) to run all analyses. GEEs extend the generalized linear model to adjust for clustering (Garson, 2012). We used the student’s home institution (n = 131) and major/discipline grouping (n = 6) to define clusters. We created the six categories of majors by recoding student responses to a list of 90 majors taken from the CIRP Freshman Survey (University of California at Los Angeles, 2020) into the following groupings: life sciences, engineering, health professions, social/behavioral sciences, math/computer science/physical science, and all other majors. The number of students in each cluster ranged from 1 to 44.

    We ran four GEEs, one for each dependent variable. We used the independent variable and all the control variables in the models predicting anxiety and depression severity. For the two graduate school intention variables, we used all the same variables, except for social support and pre-existing mental health conditions, as they are mental-health specific.

    Due to missing data (see Table 2 for percent missing for each variable), we applied multiple imputation (MI) before running the GEEs. We used MI, because missing values across a set of variables can substantially reduce the sample size, statistical power, and precision and introduce bias if the values are not missing completely at random (Sterne et al., 2009). MI involved using a regression-based approach to create multiple sets of values for missing observations (Enders, 2010). We created 20 multiply imputed data sets, each with 200 iterations, and the imputed values at the maximum iteration were saved to the imputed data set (Enders, 2010). When analyzing MI data in multivariable models like GEEs, the standard errors take into account the uncertainty associated with the missing values (Rubin, 1987). To explore whether model results were sensitive to our use of MI, we also replicated the four GEEs, including only cases with complete data in a sensitivity analysis.
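    The imputation and pooling were done in SPSS; the fragment below is a rough sketch of the same impute-then-pool logic in Python, using statsmodels’ chained-equations imputer and Rubin’s rules. The column names, toy data, per-data-set estimates, and the shortened number of update cycles are illustrative assumptions, not the study’s procedure.

```python
import numpy as np
import pandas as pd
from statsmodels.imputation.mice import MICEData

# Toy numeric analysis file with some values missing.
rng = np.random.default_rng(2)
df = pd.DataFrame(rng.normal(size=(200, 4)),
                  columns=["anxiety", "gpa", "social_support", "adverse_events"])
df.loc[rng.choice(200, 15, replace=False), "gpa"] = np.nan
df.loc[rng.choice(200, 10, replace=False), "social_support"] = np.nan

# Create 20 imputed data sets via chained equations (the paper reports
# 200 iterations per data set in SPSS; shortened here for the sketch).
imputed_sets = []
imp = MICEData(df)
for _ in range(20):
    imp.update_all(10)                  # regression-based update cycles
    imputed_sets.append(imp.data.copy())

# Rubin's rules: pool one coefficient estimated separately on each data set.
def pool(estimates, variances):
    m = len(estimates)
    q_bar = np.mean(estimates)                # pooled point estimate
    within = np.mean(variances)               # mean within-imputation variance
    between = np.var(estimates, ddof=1)       # between-imputation variance
    total = within + (1 + 1 / m) * between    # total variance
    return q_bar, np.sqrt(total)              # estimate and pooled standard error

# Hypothetical per-data-set estimates of a single GEE coefficient.
estimate, se = pool([-0.08, -0.09, -0.07], [0.0015, 0.0014, 0.0016])
print(round(estimate, 3), round(se, 4))
```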

    GEE models use an intracluster correlation matrix that we specified as exchangeable, which assumes constant intracluster dependency (Garson, 2012). To refine fit, we tested normal, gamma, and inverse Gaussian distributions with logarithmic (log) and identity link functions (Garson, 2012). The best-fitting specification for all models was an inverse Gaussian distribution with a log link. Results from the GEEs are not affected by multicollinearity, based on tolerance and variance inflation factor criteria (Hair et al., 2013).
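    To make this specification concrete, here is a hedged sketch of a comparable GEE fit in Python with statsmodels (the study itself used SPSS). The column names, the toy data, and the decision to define clusters as institution-by-major combinations are assumptions made for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy analysis file with hypothetical column names.
rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "institution": rng.integers(0, 40, n),            # home university identifier
    "major_group": rng.integers(0, 6, n),              # one of six major/discipline groupings
    "sure_canceled": rng.integers(0, 2, n),            # 1 = canceled, 0 = modified
    "gpa": rng.normal(3.5, 0.3, n).round(2),
    "adverse_events": rng.integers(0, 10, n),
    "anxiety": rng.integers(6, 28, n).astype(float),   # strictly positive outcome
})

# One cluster per institution-by-major combination (assumed cluster definition).
df["cluster"] = df["institution"].astype(str) + "_" + df["major_group"].astype(str)

# GEE with an inverse Gaussian distribution, log link, and exchangeable
# working correlation, mirroring the specification reported above.
model = sm.GEE.from_formula(
    "anxiety ~ sure_canceled + gpa + adverse_events",
    groups="cluster",
    data=df,
    family=sm.families.InverseGaussian(link=sm.families.links.Log()),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()

print(result.summary())
print(np.exp(result.params))   # exponentiated coefficients, analogous to Exp(B) in Table 4
```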

    RESULTS

    Research Question 1

    Well-Being and Summer Research Program Cancellations.

    Just under half of students (47.4%) who had been accepted into a SURE that did not run “as usual” had their programs canceled. The majority of those students whose programs were canceled were planning to attend their SURE at another university (88.6%). In terms of losses students experienced due to SURE cancellations, 93.8% reported having lost the opportunity to learn new skills, and 92.7% reported that they had lost the opportunity to network with students and professors. Approximately 81% reported that they had lost their summer employment; 50.5% said the cancellation opened up new opportunities for the summer (e.g., summer classes); and 13.9% lost housing for the summer.

    Well-Being and Modified Summer Research Programs.

    For more than half of students whose SUREs did not run “as usual,” the programs moved forward but with changes. Of those students, 61% participated in a modified SURE at another university, while the rest had a modified SURE at their home institutions. In terms of how the SUREs were modified, 82.6% of students participated in a program where only remote research was allowed. More than half (58.5%) said that the program reduced the number of hours that students were expected to work. Under half (41.4%) saw a reduction in pay as compared with the initial offer. Approximately one-quarter (22.8%) lost subsidized housing due to the changes to their SUREs.

    Table 3 reports on students’ satisfaction with their modified SUREs. The average ratings were between “neutral” (3) and “satisfied” (4) for their programs and the different elements of the programs. Students were most satisfied with their relationships with their faculty mentors (68.4% satisfied) and least satisfied with opportunities to present (28.2% satisfied) and publish (32.7% satisfied) their research.

    TABLE 3. Student satisfaction with the modified summer research program (n = 214 students)

    Item | Mean^a | SD | % Dissatisfied^b | % Neutral | % Satisfied^c
    Research experience in general | 3.41 | 0.987 | 19.6 | 31.3 | 49.1
    Relationship with your faculty mentor | 3.89 | 1.001 | 8.4 | 23.2 | 68.4
    Relationship with other lab/research members | 3.56 | 1.059 | 14.5 | 31.8 | 53.7
    Opportunity to learn new research skills | 3.36 | 1.161 | 25.7 | 21.5 | 52.8
    Opportunity to share your research through publications | 2.99 | 1.088 | 32.7 | 36.9 | 30.4
    Opportunity to share your research through presentations | 3.18 | 1.110 | 28.2 | 29.1 | 42.7
    Opportunity to clarify your future career path | 3.39 | 1.177 | 25.2 | 21.1 | 53.7
    Provision of laptop or other research supplies to conduct research | 3.30 | 0.957 | 14.0 | 48.6 | 37.4

    a The scale is: 1 = very dissatisfied, 2 = not satisfied, 3 = neutral, 4 = satisfied, 5 = very satisfied.

    b This includes very dissatisfied and not satisfied.

    c This includes satisfied and very satisfied.

    Research Question 2

    SURE Status and Mental Health (the Feeling Domain).

    Table 4A shows that students whose SUREs were canceled had reductions in anxiety severity as compared with students with modified SUREs. Specifically, they had GAD-7 scores that were 8% lower than those with modified programs (p < 0.05). We examined this finding using estimated marginal means. When all other variables were at the mean, the average anxiety score for students with canceled SURE programs was 12.36 (anxiety score range: 6–27); that score increased to 13.49 when the program was modified instead, holding all other variables at their mean. As shown in Table 4B, the association between cancellation and depression severity was not significant (p < 0.10) but was in the same direction, such that cancellation was associated with reduced depression severity. Students with more adverse event experiences had greater anxiety (p < 0.001) and depression (p < 0.001) severity.

    TABLE 4. Results of generalized estimating equations predicting anxiety symptoms (A) and depression symptoms (B; Feeling domain) and student self-assessment as to how the status of their summer programs would influence their likelihood of applying to graduate school (C) and being accepted into graduate school (D; Striving domain), scaled such that higher scores on these dependent variables correspond to reductions in likelihood (n = 408)a

    (A) Anxiety severity; (B) Depression severity; (C) Applying to graduate school; (D) Getting into graduate school.
    Each entry reports B [LCI, UCI] Exp(B); asterisks on B denote significance (see notes below).
    Intercept: (A) 2.961*** [2.35, 3.57] 19.32; (B) 3.396*** [2.95, 3.84] 29.84; (C) 0.825** [0.20, 1.45] 2.28; (D) 0.798** [0.25, 1.35] 2.22
    Independent variable
    F: SURE canceled: (A) −0.081* [−0.16, −0.01] 0.92; (B) −0.066 [−0.14, 0.01] 0.94; (C) 0.181*** [0.08, 0.28] 1.20; (D) 0.318*** [0.25, 0.39] 1.37
    Control variables
    H: GPA: (A) −0.055 [−0.20, 0.09] 0.95; (B) −0.095 [−0.20, 0.01] 0.91; (C) 0.080 [−0.08, 0.24] 1.08; (D) 0.068 [−0.07, 0.20] 1.07
    H: Junior^b: (A) 0.021 [−0.07, 0.11] 1.02; (B) 0.006 [−0.07, 0.09] 1.01; (C) −0.168** [−0.28, −0.06] 0.85; (D) −0.091* [−0.18, 0.01] 0.91
    H: Senior^b: (A) 0.024 [−0.07, 0.12] 1.02; (B) −0.002 [−0.10, 0.10] 1.00; (C) −0.183** [−0.30, −0.07] 0.83; (D) −0.113* [−0.22, −0.01] 0.89
    B: Underrepresented racial/ethnic minority: (A) 0.031 [−0.03, 0.10] 1.03; (B) 0.131*** [0.06, 0.20] 1.14; (C) −0.095* [−0.19, −0.01] 0.91; (D) −0.065 [−0.14, 0.01] 0.94
    B: First-generation student: (A) 0.005 [−0.07, 0.08] 1.01; (B) 0.016 [−0.05, 0.08] 1.02; (C) 0.063 [−0.03, 0.16] 1.07; (D) 0.073 [−0.02, 0.16] 1.08
    B: LGBQ+: (A) 0.082 [−0.03, 0.19] 1.09; (B) 0.020 [−0.05, 0.09] 1.02; (C) 0.010 [−0.10, 0.12] 1.01; (D) 0.009 [−0.08, 0.10] 1.01
    B: Man^c: (A) −0.107** [−0.18, −0.04] 0.90; (B) −0.035 [−0.10, 0.03] 0.97; (C) −0.019 [−0.12, 0.08] 0.98; (D) −0.034 [−0.12, 0.05] 0.97
    B: Other gender^c: (A) −0.077 [−0.34, 0.19] 0.93; (B) −0.087 [−0.35, 0.17] 0.92; (C) −0.208 [−0.43, 0.02] 0.81; (D) −0.213 [−0.61, 0.18] 0.81
    B: International student: (A) 0.046 [−0.07, 0.17] 1.05; (B) 0.025 [−0.08, 0.13] 1.03; (C) −0.137 [−0.32, 0.05] 0.87; (D) −0.053 [−0.20, 0.10] 0.95
    B: Pre-existing psychological condition: (A) 0.187*** [0.08, 0.29] 1.21; (B) 0.118* [0.03, 0.21] 1.13; (C) —; (D) —
    R: Social support: (A) −0.057*** [−0.09, −0.02] 0.94; (B) −0.060*** [−0.09, −0.03] 0.94; (C) —; (D) —
    T: Did Spring 2020 research: (A) −0.036 [−0.12, 0.05] 0.96; (B) −0.074 [−0.16, 0.01] 0.93; (C) −0.059 [−0.18, 0.06] 0.94; (D) −0.036 [−0.14, 0.07] 0.96
    T: Semesters of research experience: (A) 0.018 [−0.001, 0.04] 1.02; (B) 0.016 [−0.01, 0.03] 1.02; (C) 0.019 [−0.01, 0.04] 1.02; (D) 0.004 [−0.02, 0.03] 1.00
    T: SURE at home university: (A) 0.102* [0.02, 0.19] 1.11; (B) 0.085* [0.01, 0.16] 1.09; (C) 0.102 [−0.01, 0.21] 1.11; (D) 0.056 [−0.06, 0.17] 1.06
    F: COVID-19 adverse event experiences score: (A) 0.050*** [0.03, 0.07] 1.05; (B) 0.044*** [0.025, 0.06] 1.04; (C) −0.001 [−0.02, 0.02] 1.00; (D) 0.007 [−0.01, 0.03] 1.01

    a Models control for clustering by major grouping and home university and use an inverse Gaussian distribution, log link, and exchangeable correlation matrix. Pooled results from 20 multiple imputed data sets are presented. The GAD-7 and PHQ-9 scores were summed by 1 before entering them in the GEEs. UCI, upper 95% confidence interval; LCI, lower 95% confidence interval. H, Having domain; B, Being domain; R, Relating domain; F, Functioning domain in the SWBM.

    b Reference: freshman/sophomore/unclassified.

    c Reference: woman.

    *p < 0.05.

    **p < 0.01.

    ***p < 0.001.

    SURE Status and Graduate School Intentions (the Striving Domain).

    Students with canceled SUREs perceived that their program status would more negatively affect their likelihood of applying to graduate school (1.2, p < 0.001, Table 4C) and getting into graduate school (1.4, p < 0.001, Table 4D) than did students who participated in modified SUREs. Remember that the dependent variables are reverse scaled, such that positive effects reflect a reduced likelihood. We used estimated marginal means to predict the average “applying to graduate school” value when the students’ programs were canceled versus modified and all other variables were at the mean. “Applying to graduate school” was measured on a scale of 1–5 (with higher scores signaling a reduction in intentions), and the estimated mean value was 3.21 for students with canceled programs and 2.68 when the programs were modified, holding all other variables at the mean. For “graduate school acceptance,” assessed on the same scale, the average score was 3.62 for those from canceled programs and 2.63 for those with modified programs, when all other variables were at the mean.
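    Estimated marginal means of this kind can be reproduced from any fitted model of this form by predicting the outcome for two otherwise-average students who differ only in program status. The fragment below continues the hypothetical GEE sketch from the Data Analysis section (reusing its df and fitted result objects); it illustrates the idea rather than the authors’ SPSS procedure.

```python
import pandas as pd

# Two profiles that differ only in SURE status, with the other
# covariates held at their sample means.
means = df[["gpa", "adverse_events"]].mean()
profiles = pd.DataFrame({
    "sure_canceled": [1, 0],                     # canceled vs. modified
    "gpa": [means["gpa"]] * 2,
    "adverse_events": [means["adverse_events"]] * 2,
})

# Predictions on the response scale: the estimated marginal means.
emm = result.predict(profiles)
print(dict(zip(["canceled", "modified"], emm.round(2))))
```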

    Sensitivity Analysis.

    When using only data for which we had complete cases, findings generally agreed with the MI results presented in Table 4. In terms of our focal independent variable, SURE program status, results were identical in terms of the direction of effect and statistical significance (p < 0.05) across the four models when we used and did not use MI. There were only small shifts in control variable findings between the MI results (Table 4) and the complete case results (described next). When predicting anxiety (n = 338), underrepresented racial/ethnic minority status became significant and positive, and SURE at home university lost significance. When predicting depression (n = 336), SURE at home university lost significance. When predicting applying to graduate school (n = 379), there were no differences. When predicting acceptance into graduate school (n = 380), senior lost significance. This suggests that results are not particularly sensitive to our use of MI, although the loss of several findings in the complete case analysis likely reflects how statistical power is improved through the use of MI (Zha and Harel, 2021).

    DISCUSSION

    Our analysis addressed two research questions. The first was: How were students affected by SURE program changes, including cancellations and modifications? The second was: How did SURE status (i.e., program cancellation vs. modification) impact students’ mental health and plans for attending graduate school? We answered these questions using data from a U.S. nationwide survey conducted during Summer 2020.

    In terms of answering our first research question, we found that just under half of students accepted into 2020 SUREs that did not run “as usual” had their programs canceled. Program cancellations were more common among students who had planned to travel to another university for their SUREs than those who planned to stay at their home universities, suggesting that programs targeting national cohorts may have been more likely to cancel than those targeting local cohorts. These students reported substantial losses, yet more than half of them reported that SURE cancellation opened up new opportunities, suggesting that they were making the best of a difficult situation. Students who participated in a modified SURE were somewhat satisfied with their programs (Table 3). These findings suggest that students were Functioning (i.e., engaging in educational experiences that expanded their horizons; Soutter et al., 2014) relatively well, or at least not poorly, given challenging circumstances.

    In terms of the second research question, we found that participation in modified SUREs was associated with greater anxiety severity (Table 4A), which ran counter to our hypothesis. Elements of modified programs could certainly be anxiety inducing and thus harmful in the Feeling domain of well-being. This domain is important to well-being, as it captures the emotional component of the concept (Soutter et al., 2014). The vast majority of students with modified SUREs were doing only remote research, which was not their original plan or the original plan of their mentors. A study of undergraduate researchers at Washington University in St. Louis, conducted in May and June of 2020, revealed that altered communication due to remote mentoring relationships was a major change in undergraduate students’ research experiences in the early months of the pandemic. Students indicated either that communication from their mentors was slower or nonexistent, which posed a major challenge to their research progress, or that, while they communicated regularly with their mentors over email or videoconference, they missed speaking with other research team members (Speer et al., 2021). Housing and financial challenges among students with modified SUREs may have added to their anxiety. A full one-fifth of students in our study were “dissatisfied” with their modified programs, which implies that their experiences in those programs could have added to their distress. However, we cannot rule out that doing research itself via any SURE is to some degree anxiety inducing (apart from any elements unique to modified programs), as we assume that students in the canceled group did not do research during that summer. Either way, findings suggest that program directors should expect anxiety symptoms among participants in modified programs, regardless of the source.

    How high were the levels of anxiety (and depression) symptomology experienced by these students in relative terms? There is no baseline in terms of published levels of anxiety and depression severity among undergraduate researcher populations pre-COVID-19. Therefore, we must compare our data with statistics on U.S. college students pre–COVID-19. While not a perfect comparison, undergraduate researchers are included within this broader group. A U.S. national study of college students using the same scales as we did found that students scored 7.69 on the GAD-7 (anxiety) and 8.45 on the PHQ-9 (depression) during the 2017–2018 school year (Duffy et al., 2019). The average values here were 13.42 and 17.25 for anxiety and depression, respectively (Table 2), which are roughly double those baseline values. This dramatic difference is likely reflective of the pandemic’s negative impact on college student mental health (Cao et al., 2020; Son et al., 2020), rather than differences between researcher versus non-researcher student populations. While students with canceled programs experienced significantly less anxiety and marginally less depression than those in modified programs, the average student in the sample suffered from moderate to moderately severe anxiety and depression, regardless of program cancellation or modification.

    Also in response to the second research question, we found that students with canceled programs reported the belief that their programs’ status would negatively influence their likelihood of applying to and being accepted into graduate school to a greater degree than students with modified programs (Table 4, C and D), as we had hypothesized. This Striving finding aligns with other research showing that being highly affected by COVID-19 reduced students’ graduate school intentions during the pandemic (Morales et al., 2021). Our findings may reflect students’ awareness of how undergraduate research participation facilitates the transition to graduate school (Gilmore et al., 2015; Hernandez et al., 2018). We can infer that these students believed that participation in modified programs versus cancellation during Summer 2020 was advantageous in terms of gaining acceptance to graduate school. The difference in effect size for the program status variable between the “gaining acceptance” and “applying” (i.e., larger for “gaining acceptance”) models suggests that students with canceled programs were more worried about how program cancellation would impact decisions made about their application by admissions committees than they were about their own likelihood of applying.

    While acceptance decisions are out of students’ hands, students are in control of the decision to apply to graduate school. Thus, the belief we found among students with canceled programs (relative to those with modified programs) that they would be less likely to apply to graduate school is particularly concerning (even though the effect size is smaller), as it suggests that these students’ Striving was reduced and that they were more likely to disadvantage themselves by not submitting graduate school applications in the first place. This reflects how canceled programs harmed student well-being in the Striving domain, which captures the ability to stay motivated to achieve goals in the face of adversity (Soutter et al., 2014).

    The findings presented in this paper allow us to respond to Burns et al.’s (2020) call to use research to understand what provisions would be most suitable within a higher education context during COVID to enhance student well-being. Our results, when paired with a small but growing body of research on modified SURE participants (Yang Yowler et al., 2021; Erickson et al., 2022) and virtual mentorship (Speer et al., 2021), suggest that remote SUREs are a reasonable and workable option under conditions wherein face-to-face contact is difficult. Given that many programs transitioned online in Summer 2020 with very little lead time, it stands to reason that online undergraduate research programs can be better when designed under more reasonable time horizons.

    Research on student well-being in SUREs during COVID-19 is important, as it is likely that COVID-19 and other disruptions will be part of the SURE landscape for the foreseeable future. While some guidance exists for how to run remote SUREs (Qiang et al., 2020; Erickson et al., 2022), this analysis complements those studies by clarifying the impacts of summer program cancellation on students and by providing additional evidence that remote programs are a reasonable path forward. More generally, our findings, informed by the SWBM, speak to the importance of researchers, faculty mentors, and program directors taking a student wellness–centered approach when pivoting due to unforeseen circumstances. These circumstances can occur on a more macro-scale, such as in response to COVID-19 or a specific disaster, and on the micro-scale of the individual student (e.g., a family emergency, caregiving responsibilities, or individual health challenges). Being attuned to student wellness is particularly important for informing a discourse around how researchers, mentors, and program directors should interact with mentees as whole humans who need flexibility and empathy. A perspective grounded in well-being can inform undergraduate research programs that are inclusive and equitable in order to more fully support diverse students as knowledge creators.

    Limitations

    First, we asked students to reflect on SUREs in July 2020, when those doing modified programs were still engaged in the programs, so their perspectives could have changed by the ends of their programs. We also asked students about the SURE that they were initially accepted into when COVID-19 hit. We did not ask whether they had switched to another program at a later date. We conducted the survey in July 2020, due to concerns about recall bias regarding their pandemic experiences. Second, we focused on a limited number of short-term outcomes. Third, while we designed the survey to capture multiple genders, small numbers of other gender students limited analyses of this group. Fourth, there are unmeasured confounders occurring at the same time as the pandemic (e.g., political turmoil, heightened national consciousness of systemic racism) that could have influenced outcomes. Fifth, our response rate was 54%, and we lack data on why just under half of the students did not participate in the study. Possible reasons include not checking email or feeling too overwhelmed with pandemic-related challenges. Sixth, while our sample includes students from a wide variety of institutional contexts across the United States, it is not necessarily nationally representative. Generally speaking, the universities providing the most participants were research-intensive second-tier state universities, flagship state universities, and private universities. The sample also includes students with relatively extensive research experience, as they averaged three semesters of research (with a median of two semesters) before Spring 2020. It is possible that SURE program changes could have impacted those with less research experience to a greater degree than those with more, but our sample does not permit these analyses due to small counts of students with little research experience.

    Finally, we do not know whether certain social groups were more sensitive to the effects of program status on the outcomes than others. For example, students with pre-existing mental health conditions had higher GAD-7 (anxiety) and PHQ-9 (depression) scores (Table 4). It is not clear how influential these differences were on associations between program status and anxiety and depression severity. Because only 32 students reported pre-existing conditions, we could not conduct a multivariable sensitivity analysis. Looking at bivariate comparisons stratified by pre-existing conditions (unpublished data), students (with and without pre-existing conditions) in modified programs had higher anxiety and depression severity scores than their counterparts from canceled programs. Differences in severity scores between programs with different status were larger for those with pre-existing conditions, suggesting they may have been more sensitive to the effects of program modification.

    Recommendations

    Our findings suggest three recommendations for undergraduate research program directors, staff, and faculty to enhance student well-being, all of which are also applicable to academic-year research experiences.

    Address Potential Causes of Mental Health Challenges.

    Recognizing that modified undergraduate research programs are likely to be anxiety inducing, programs can use orientation sessions to address potential challenges with remote and hybrid research contexts and to set expectations. More broadly, addressing anxiety among undergraduate researchers conducting research in any modality is important, as this mental health condition widely affects college students today. Sharing information about counseling services provided by the university with all research participants is also essential to promote well-being. Faculty members, including research mentors, are well positioned to share mental health resources with students as well (Chirikov et al., 2020). Program directors can even provide those materials directly to faculty, who may not know where to turn for such information (Kalkbrenner et al., 2021). While faculty mentors are not mental health professionals, they have an important role to play by being attentive to mental health issues, showing empathy, and serving as connectors who link students with resources on campus. A national survey of more than 14,000 faculty revealed that more than half (63%) were not engaged in referring their students to mental health services, although the vast majority (87%) thought this was part of a faculty member’s role (Albright and Schwartz, 2017). When faculty work with small numbers of students (as is the case in undergraduate research), they feel more comfortable broaching mental health issues with students and referring them to services (Kalkbrenner et al., 2021).

    Programs can also sponsor trainings in self-care and well-being for trainees. It is important to be cognizant of the high levels of depression and anxiety symptoms among undergraduate researchers, apart from any additional impact of modified research experiences. Research programs and faculty mentors must prepare to deal with increasing numbers of trainees with mental health challenges moving forward, as increasing numbers of college students experience mental health problems (Hunt and Eisenberg, 2010; Xiao et al., 2017).

    Be an Advocate for Student-Centered Adjustments to Graduate Admission Processes.

    To address concerns about how the pandemic will influence students’ Striving via matriculation to graduate school, we recommend that faculty mentors advocate for adjustments to graduate admissions at their universities to account for the effects of COVID-19. These adjustments could include waiving Graduate Record Examination requirements, if they are still in place; recognizing that those attending college in places with strict COVID-19 protocols were kept out of in-person research for some time; waiving application fees due to economic barriers; and acknowledging that letters of reference from faculty mentors might be less personal due to a predominance of virtual interactions during 2020 and 2021.

    Use Experiences Learned during COVID-19 as a Springboard to Broaden Participation.

    Satisfaction data suggest that there is room for improvement in modified SUREs, which is not surprising, given how quickly they were developed for Summer 2020. In the future, programs can do a better job of promoting student Functioning from the outset by ensuring that students have access to needed laptops and research supplies. Actions at the university level have facilitated student access to laptops. Some have pivoted to requiring that all entering students have a laptop that meets specific requirements (e.g., College of Information Sciences and Technology at the University of Pennsylvania). Others have provided laptops directly to students at no charge thanks to donor support (e.g., Marietta College, Ohio) or federal COVID-19 relief funds (e.g., Cheyney University, Pennsylvania). The president of Cheyney, which is the nation’s oldest HBCU, recognized that the laptops were an essential resource for students engaging in undergraduate research, among other activities (Cheyney University, 2022).

    To improve interactions between mentors and mentees in an online environment and promote well-being in the Relating domain, students like platforms such as Slack, GroupMe, WhatsApp, or Discord for facilitating informal communication (Erickson et al., 2022), and they prefer videoconferencing, rather than email, when communicating with mentors (Speer et al., 2021). These platforms are useful to facilitate informal social interaction among researchers, even if they also meet in person. SURE students in a National Institute of Environmental Health Sciences R25 program at the University of Utah have used WhatsApp to communicate with one another, even after COVID-19 restrictions eased in 2021. Professional development curricula, which are a staple of SUREs, can be delivered via teleconference or an online framework to help students find more clarity in their career paths. This is currently happening at the University of Utah, where the Office of Undergraduate Research Education Series consists of live in-person and virtual (Zoom) workshops, as well as “on demand” (prerecorded) workshops. The literature already provides guidelines for how to best structure online/remote research experiences for undergraduates. These include borrowing strategies from distance-learning resources (Qiang et al., 2020) and streamlining engagement to focus on remote data analysis, literature reviews and science writing, and science journal clubs (Chandrasekaran, 2020). Erickson et al. (2022) also provide a detailed list of recommendations for remote SUREs, drawn from Research Experiences for Undergraduates (REU) students, across 10 areas that we recommend reviewing. This “forced innovation” in SUREs could transform into a “pandemic dividend” (i.e., long-term and unintentional benefits emerging from the pandemic), as others have recognized with online educational innovation during COVID-19 (Yang and Huang, 2020).

Above all, we can conceive of the modified SUREs developed on the fly during COVID-19 as a pilot for future online SUREs. While there is some resistance to this at the faculty level (Kamariah and Azlina, 2022), remote/online research opportunities can provide a model for creating research opportunities for students who were previously unable to participate in SUREs (e.g., nontraditional students with dependents who cannot quit their jobs and move across the country for 10 weeks). If programs are willing to operate under a hybrid model in the future, a silver lining of the pandemic could be expanded SURE opportunities for a more diverse group of students.

    REFERENCES

  • Adedokun, O. A., Bessenbacher, A. B., Parker, L. C., Kirkham, L. L., & Burgess, W. D. (2013). Research skills and STEM undergraduate research students’ aspirations for research careers: Mediating effects of research self-efficacy. Journal of Research in Science Teaching, 50(8), 940–951.
  • Albright, G., & Schwartz, V. (2017). Are campuses ready to support students in distress: Higher education survey white paper. Retrieved May 15, 2023, from https://go.kognito.com/rs/143-HCJ-270/images/HiEd_WP_080817_HigherEdSurveyWhitePaper.pdf
  • Berkes, E. (2008). Undergraduate research participation at the University of California, Berkeley. Berkeley, CA: Center for Studies in Higher Education. https://escholarship.org/uc/item/7tr4c62k
  • Beyer, R. M., Manica, A., & Mora, C. (2021). Shifts in global bat diversity suggest a possible role of climate change in the emergence of SARS-CoV-1 and SARS-CoV-2. Science of the Total Environment. https://doi.org/10.1016/j.scitotenv.2021.145413
  • Burns, D., Dagnall, N., & Holt, M. (2020). Assessing the impact of the COVID-19 pandemic on student wellbeing at universities in the United Kingdom: A conceptual analysis. Frontiers in Education. https://doi.org/10.3389/feduc.2020.582882
  • Cao, W., Fang, Z., Hou, G., Han, M., Xu, X., Dong, J., & Zheng, J. (2020). The psychological impact of the COVID-19 epidemic on college students in China. Psychiatry Research, 287, 112934.
  • Carter, F. D., Mandell, M., & Maton, K. I. (2009). The influence of on-campus, academic year undergraduate research on STEM Ph.D. outcomes: Evidence from the Meyerhoff Scholarship Program. Educational Evaluation and Policy Analysis, 31(4), 441–462.
  • Chandrasekaran, A. R. (2020). Transitioning undergraduate research from wet lab to the virtual in the wake of a pandemic. Biochemistry and Molecular Biology Education, 48(5), 436–438.
  • Cheyney University. (2022, August 10). Cheyney University provides laptops to all students. Retrieved March 30, 2023, from https://cheyney.edu/news/press-release-cheyney-university-provides-laptops-to-all-students/
  • Chirikov, I., Soria, K. M., Horgos, B., & Jones-White, D. (2020). Undergraduate and graduate students’ mental health during the COVID-19 pandemic. Berkeley, CA: Center for Studies in Higher Education, University of California Berkeley. Retrieved May 15, 2023, from https://escholarship.org/uc/item/80k5d5hw
  • Cohen, A. K., Hoyt, L. T., & Dull, B. (2020). A descriptive study of COVID-19–related experiences and perspectives of a national sample of college students in Spring 2020. Journal of Adolescent Health, 67(3), 369–375.
  • Collins, T. W., Grineski, S., Shenberger-Trujillo, J., Morales, D. X., Morera, O. F., & Echegoyen, L. (2017). Undergraduate research participation is associated with improved student outcomes at a Hispanic-serving institution. Journal of College Student Development, 58(4), 583–600.
  • Cooper, K. M., Gin, L. E., Barnes, M. E., & Brownell, S. E. (2020a). An exploratory study of students with depression in undergraduate research experiences. CBE—Life Sciences Education, 19(2), ar19.
  • Cooper, K. M., Gin, L. E., & Brownell, S. E. (2020b). Depression as a concealable stigmatized identity: What influences whether students conceal or reveal their depression in undergraduate research experiences? International Journal of STEM Education, 7, 1–18.
  • Dahlem, N. W., Zimet, G. D., & Walker, R. R. (1991). The multidimensional scale of perceived social support: A confirmation study. Journal of Clinical Psychology, 47(6), 756–761.
  • Daniels, H. A., Grineski, S. E., Collins, T. W., & Frederick, A. H. (2019). Navigating social relationships with mentors and peers: Comfort and belonging among men and women in STEM summer research programs. CBE—Life Sciences Education, 18(2), ar17. https://doi.org/10.1187/cbe.18-08-0150
  • Duffy, M. E., Twenge, J. M., & Joiner, T. E. (2019). Trends in mood and anxiety symptoms and suicide-related outcomes among U.S. undergraduates, 2007–2018: Evidence from two national surveys. Journal of Adolescent Health, 65(5), 590–598.
  • Enders, C. (2010). Applied missing data analysis. New York, NY: Guilford Press.
  • Erickson, O. A., Cole, R. B., Isaacs, J. M., Alvarez-Clare, S., Arnold, J., Augustus-Wallace, A., ... & Dolan, E. L. (2022). “How do we do this at a distance?!” A descriptive study of remote undergraduate research programs during COVID-19. CBE—Life Sciences Education, 21(1), ar1. https://doi.org/10.1187/cbe.21-05-0125
  • Fechheimer, M., Webber, K., & Kleiber, P. B. (2011). How well do undergraduate research programs promote engagement and success of students? CBE—Life Sciences Education, 10(2), 156–163.
  • Frederick, A., Grineski, S. E., Collins, T. W., Daniels, H. A., & Morales, D. X. (2021). The emerging STEM paths and science identities of Hispanic/Latinx college students: Examining the impact of multiple undergraduate research experiences. CBE—Life Sciences Education, 20(2), ar18. https://doi.org/10.1187/cbe.20-08-0191
  • Garson, G. D. (2012). Generalized linear models and generalized estimating equations. Raleigh, NC: Statistical Associates Publishing.
  • Gentile, J., Brenner, K., & Stephens, A. (2017). Undergraduate research experiences for STEM students: Successes, challenges, and opportunities. Washington, DC: National Academies Press. Retrieved May 15, 2023, from https://nap.nationalacademies.org/catalog/24622/undergraduate-research-experiences-for-stem-students-successes-challenges-and-opportunities
  • Gilmore, J., Vieyra, M., Timmerman, B., Feldon, D., & Maher, M. (2015). The relationship between undergraduate research participation and subsequent research performance of early career STEM graduate students. Journal of Higher Education, 86(6), 834–863.
  • Giuntella, O., Hyde, K., Saccardo, S., & Sadoff, S. (2021). Lifestyle and mental health disruptions during COVID-19. Proceedings of the National Academy of Sciences USA, 118(9), e2016632118.
  • Grineski, S. E., Daniels, H. A., Collins, T. W., Morales, D. X., Frederick, A. H., & Garcia, M. (2018). The conundrum of social class: Disparities in publishing among STEM students in undergraduate research programs. Science Education, 102(2), 283–303.
  • Grineski, S. E., Morales, D. X., Collins, T. W., Nadybal, S., & Trego, S. (2021). Anxiety and depression among US college students engaging in undergraduate research during the COVID-19 pandemic. Journal of American College Health, 1–11. https://doi.org/10.1080/07448481.2021.2013237
  • Grineski, S. E., Morales, D. X., Collins, T. W., Nadybal, S., & Trego, S. (2022). A US national study of barriers to science training experienced by undergraduate students during COVID-19. International Journal of Environmental Research and Public Health, 19(11), 6534.
  • Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2013). Advanced diagnostics for multiple regression (Online supplement to Multivariate Data Analysis). Hoboken, NJ: Pearson Prentice Hall Publishing. Retrieved May 15, 2023, from https://www.academia.edu/24541751/Advanced_Diagnostics_for_Multiple_Regression_A_Supplement_to_Multivariate_Data_Analysis_Multivariate_Data_Analysis
  • Hefner, J., & Eisenberg, D. (2009). Social support and mental health among college students. American Journal of Orthopsychiatry, 79(4), 491–499.
  • Hernandez, P. R., Woodcock, A., Estrada, M., & Schultz, P. W. (2018). Undergraduate research experiences broaden diversity in the scientific workforce. BioScience, 68(3), 204–211.
  • Hughes, B. E. (2018). Coming out in STEM: Factors affecting retention of sexual minority STEM students. Science Advances, 4(3), eaao6373. https://doi.org/10.1126/sciadv.aao6373
  • Hunt, J., & Eisenberg, D. (2010). Mental health problems and help-seeking behavior among college students. Journal of Adolescent Health, 46(1), 3–10.
  • Hurtado, S., Cabrera, N. L., Lin, M. H., Arellano, L., & Espinosa, L. L. (2009). Diversifying science: Underrepresented student experiences in structured research programs. Research in Higher Education, 50(2), 189–214.
  • Jones, M. T., Barlow, A. E., & Villarejo, M. (2010). Importance of undergraduate research for minority persistence and achievement in biology. The Journal of Higher Education, 81(1), 82–115.
  • Kalkbrenner, M. T., Jolley, A. L., & Hays, D. G. (2021). Faculty views on college student mental health: Implications for retention and student success. Journal of College Student Retention: Research, Theory & Practice, 23(3), 636–658.
  • Kamariah, I., & Azlina, A.-A. (2022). COVID-19 pandemic: Lessons learned for undergraduate research training. Biochemistry and Molecular Biology Education, 50(5), 476–478.
  • Keum, B. T., Miller, M. J., & Inkelas, K. K. (2019). Testing the factor structure and measurement invariance of the PHQ-9 across racially diverse U.S. college students. Psychological Assessment, 30(8), 1096–1106.
  • Kroenke, K., Spitzer, R. L., & Williams, J. B. W. (2001). The PHQ-9: Validity of a brief depression severity measure. Journal of General Internal Medicine, 16(9), 606–613.
  • Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Retrieved May 15, 2023, from https://www.aacu.org/publication/high-impact-educational-practices-what-they-are-who-has-access-to-them-and-why-they-matter
  • Little, C. (2020). Undergraduate research as a student engagement springboard: Exploring the longer-term reported benefits of participation in a research conference. Educational Research, 62(2), 229–245.
  • Lopatto, D. (2004). Survey of undergraduate research experiences (SURE): First findings. CBE—Life Sciences Education, 3(4), 270–277.
  • Lopatto, D. (2007). Undergraduate research experiences support science career decisions and active learning. CBE—Life Sciences Education, 6(4), 297–306.
  • Means, B., & Neisler, J. (2020). Suddenly online: A national survey of undergraduates during the COVID-19 pandemic. San Mateo, CA: Digital Promise. http://hdl.handle.net/20.500.12265/98
  • Morales, D. X., Grineski, S., & Collins, T. (2021). Undergraduate researchers’ graduate school intentions during COVID-19. Annals of the New York Academy of Sciences, 1508(1), 137–154.
  • Myers, C. B., & Pavel, D. M. (2011). Underrepresented students in STEM: The transition from undergraduate to graduate programs. Journal of Diversity in Higher Education, 4(2), 90–105.
  • National Survey of Student Engagement. (2020). Engagement insights: Survey findings on the quality of undergraduate education—Annual results 2019. Bloomington, IN: Indiana University Bloomington. Retrieved May 15, 2023, from https://scholarworks.iu.edu/dspace/handle/2022/25321
  • National Survey of Student Engagement. (2021). The pandemic and student engagement: Trends, disparities, and opportunities. Bloomington, IN: Center for Postsecondary Research, Indiana University School of Education. Retrieved May 15, 2023, from https://nsse.indiana.edu/research/annual-results/2021/story1.html#shift
  • National Survey of Student Engagement. (2022). Digging deeper into the quality of high-impact practices. Bloomington, IN: Center for Postsecondary Research, Indiana University School of Education. Retrieved May 15, 2023, from https://nsse.indiana.edu/research/annual-results/2022/story2.html
  • Patricia, A. (2020). College students’ use and acceptance of emergency online learning due to COVID-19. International Journal of Educational Research Open, 1, 100011. https://doi.org/10.1016/j.ijedro.2020.100011
  • Qiang, Z., Obando, A. G., Chen, Y., & Ye, C. (2020). Revisiting distance learning resources for undergraduate research and lab activities during COVID-19 pandemic. Journal of Chemical Education, 97(9), 3446–3449.
  • Robnett, R. D., Chemers, M. M., & Zurbriggen, E. L. (2015). Longitudinal associations among undergraduates’ research experience, self-efficacy, and identity. Journal of Research in Science Teaching, 52(6), 847–867.
  • Rubin, D. (1987). Multiple imputation for nonresponse in surveys. New York, NY: Wiley.
  • Russell, S. H., Hancock, M. P., & McCullough, J. (2007). Benefits of undergraduate research experiences. Science, 316(5824), 548–549.
  • Shek, D. T. L., & Leung, H. (2015). Service leadership qualities in university students through the lens of student well-being. In Shek, D. T. L., & Chung, P. (Eds.), Promoting service leadership qualities in university students. Quality of life in Asia, Vol. 6 (pp. 1–16). New York, NY: Springer.
  • Soria, K. M., Horgos, B., Chirikov, I., & Jones-White, D. (2020). First-generation students’ experiences during the COVID-19 pandemic. Berkeley, CA: Center for Studies in Higher Education. Retrieved May 15, 2023, from https://escholarship.org/uc/item/19d5c0ht
  • Son, C., Hegde, S., Smith, A., Wang, X., & Sasangohar, F. (2020). Effects of COVID-19 on college students’ mental health in the United States: Interview survey study. Journal of Medical Internet Research, 22(9), e21279.
  • Soutter, A. K., O’Steen, B., & Gilmore, A. (2014). The student well-being model: A conceptual framework for the development of student well-being indicators. International Journal of Adolescence and Youth, 19(4), 496–520.
  • Speer, J. E., Lyon, M., & Johnson, J. (2021). Gains and losses in virtual mentorship: A descriptive case study of undergraduate mentees and graduate mentors in STEM research during the COVID-19 pandemic. CBE—Life Sciences Education, 20(2), ar14. https://doi.org/10.1187/cbe.20-06-0128
  • Spitzer, R. L., Kroenke, K., Williams, J. B. W., & the Patient Health Questionnaire Primary Care Study Group. (1999). Validity and utility of a self-report version of PRIME-MD: The PHQ Primary Care Study. Journal of the American Medical Association, 282, 1737–1744.
  • Spitzer, R. L., Kroenke, K., Williams, J. W., & Lowe, B. (2006). A brief measure for assessing generalized anxiety disorder: The GAD-7. Archives of Internal Medicine, 166(10), 1092–1097.
  • Sriken, J., Johnsen, S. T., Smith, H., Sherman, M. F., & Erford, B. T. (2021). Testing the factorial validity and measurement invariance of college student scores on the Generalized Anxiety Disorder (GAD-7) scale across gender and race. Measurement and Evaluation in Counseling and Development, 55(1), 1–16.
  • Sterne, J. A. C., White, I. R., Carlin, J. B., Spratt, M., Royston, P., Kenward, M. G., ... & Carpenter, J. R. (2009). Multiple imputation for missing data in epidemiological and clinical research: Potential and pitfalls. British Medical Journal, 338, b2393.
  • Thiry, H., Weston, T. J., Laursen, S. L., & Hunter, A.-B. (2012). The benefits of multi-year research experiences: Differences in novice and experienced students’ reported gains from undergraduate research. CBE—Life Sciences Education, 11(3), 260–272.
  • University of California at Los Angeles. (2020). Higher Education Research Institute & Cooperative Institutional Research Program. Retrieved May 15, 2023, from https://heri.ucla.edu/
  • Xiao, H., Carney, D. M., Youn, S. J., Janis, R. A., Castonguay, L. G., Hayes, J. A., & Locke, B. D. (2017). Are we in crisis? National mental health and treatment trends in college counseling centers. Psychological Services, 14(4), 407–415.
  • Yang, B., & Huang, C. (2020). Turn crisis into opportunity in response to COVID-19: Experience from a Chinese university and future prospects. Studies in Higher Education, 46(1), 121–132.
  • Yang Yowler, J., Knier, K., WareJoncas, Z., Ehlers, S. L., Ekker, S. C., Reyes, F. G., ... & Pierret, C. (2021). Rapid adaptation and remote delivery of undergraduate research training during the COVID-19 pandemic. Sustainability, 13(11), 6133.
  • Zha, R., & Harel, O. (2021). Power calculation in multiply imputed data. Statistical Papers, 62, 533–559.
  • Zimet, G. D., Dahlem, N. W., Zimet, S. G., & Farley, G. K. (1988). The Multidimensional Scale of Perceived Social Support. Journal of Personality Assessment, 52(1), 30–41.