Modifying Summer Undergraduate Research Programs during COVID-19 Increased Graduate School Intentions but Exacerbated Anxieties
Abstract
COVID-19 created unprecedented challenges for college students, highlighting the need to provide educational contexts that foster well-being. Summer undergraduate research experiences (SUREs) constitute a high-impact practice, yet little systematic knowledge exists about how the first surge of COVID-19 influenced undergraduate researchers’ well-being. This knowledge is important for preparing for future disruptions. This study applies the student well-being model (SWBM) to examine how SURE status (i.e., modification vs. cancellation) impacted students’ mental health and graduate school intentions using primary survey data collected from U.S. undergraduate researchers in science, technology, engineering, and mathematics (STEM) fields in Summer 2020 (n = 408, from 131 institutions). Just under half had their SURE canceled, and the others engaged in modified SUREs. Students whose SUREs were canceled had lower anxiety severity (p < 0.05) but greater concerns about graduate school matriculation (p < 0.001), compared with students with modified SUREs. Results suggest that modified SUREs are a reasonable path forward under conditions where in-person contact is untenable. Results point toward potential improvements in higher education practices that may enhance student well-being following disruptive events. Program directors can address potential causes of anxiety in modified SUREs, advocate for student-centered adjustments to graduate admission processes, and use experiences during COVID-19 as a springboard to broaden participation in undergraduate research.
INTRODUCTION
Faculty-mentored undergraduate research experiences are a high-impact practice in higher education (Kuh, 2008). Participants report improved critical thinking, increased interactions with faculty, enhanced academic achievement and retention, greater science self-efficacy, greater persistence to science, technology, engineering, and mathematics (STEM) degree completion, and improved readiness for graduate education and research careers (Hurtado et al., 2009; Adedokun et al., 2013; Gilmore et al., 2015; Robnett et al., 2015; Collins et al., 2017; Hernandez et al., 2018; Little, 2020). During the first surge of COVID-19 (Spring–Summer 2020), undergraduate research training was disrupted, with many students’ research experiences canceled or shifted to remote engagement.
To date, there is little systematic knowledge available about how this initial surge of COVID-19 influenced the well-being of undergraduate researchers (for exceptions, see Grineski et al., 2021; Morales et al., 2021; Speer et al., 2021; Yang Yowler et al., 2021; Erickson et al., 2022). We need better knowledge to document what happened in this historical moment and, more importantly, provide a basis for informed decision making during future surges or other disruptions. There will be regional and global surges of COVID-19 cases (e.g., delta variant in the United States in Fall 2021 and omicron in Winter 2022), and program directors will periodically need to decide whether to cancel or modify undergraduate research programs. Lessons learned during COVID-19 will also be applicable during future coronavirus-caused disruptions, which may become more common under climate change scenarios (Beyer et al., 2021). More generally, understanding how student well-being operated during a global crisis is imperative so that universities can implement new and adjusted measures to better support students on their academic journeys (Burns et al., 2020).
We know that undergraduate research participation rates dipped following the initial COVID-19 surge, but have now rebounded. In the year before the COVID-19 pandemic (2019), 22% of all U.S. college seniors and 46% of biological science seniors had participated in faculty-mentored research (National Survey of Student Engagement, 2020). As of 2021, participation rates dipped very modestly (i.e., standardized difference scores, comparing participation in 2020 with 2021, averaged −0.08 across six high-impact practices, including “research with faculty”; National Survey of Student Engagement, 2021). As of 2022, participation rates had recovered (i.e., 22% of seniors had participated; National Survey of Student Engagement, 2022).
While many STEM students engage in academic-year research, summer undergraduate research experiences (SUREs) are an important context for research training. The emergence of the COVID-19 pandemic in Spring 2020 was difficult for SURE programs. The majority of SURE program directors in the United States had already selected their cohorts and were planning for Summer 2020 when universities began to pivot from in-person engagement to online delivery modes for many functions, including undergraduate classes. While some SURE programs were canceled, others continued to operate in modified format (e.g., through remote research experiences). Biology students taking part in modified SURE programs in the United States saw the quality of mentorship, opportunities for learning and professional development, and development of a sense of community as strengths of modified programs, but felt that limited opportunities for cohort building, challenges with insufficient structure, and issues with technology were weaknesses (Erickson et al., 2022). Erickson et al. (2022) concluded that remote SUREs can be experienced favorably by students, and another study of one 2020 remote SURE program reached a similar conclusion (Yang Yowler et al., 2021). Other studies have examined undergraduate research experiences during COVID-19, without an explicit focus on SUREs. These studies have found that the pandemic altered mentor–mentee communication (Speer et al., 2021) and impacted graduate school intentions (Morales et al., 2021). During March and April 2020, economic hardship was an important correlate of barriers to research progress (e.g., lack of motivation, lack of access to tools needed to conduct the research), as were communication challenges and sexual minority status at U.S. universities (Grineski et al., 2022).
While the two studies on SUREs during COVID-19 (Yang Yowler et al., 2021; Erickson et al., 2022) took steps toward building a knowledge base about experiences of students engaging in modified/remote SUREs due to COVID-19, little is known about students whose programs were canceled. We, alongside other SURE program directors, wondered whether program cancellation might affect student well-being, especially in terms of mental health and the pursuit of a graduate education, which are important outcomes in higher education. To investigate this, we use data collected from students enrolled at 131 U.S. universities in Spring 2020 who attended (or planned to attend) more than 100 different 2020 SUREs.
We frame our analyses of these data with the student well-being model, or SWBM (Soutter et al., 2014), to understand SURE students’ well-being during COVID-19. This frame reflects the growing interest in college student well-being at universities and among researchers and the need to recognize and interact with undergraduate students as whole humans in order to make SUREs more inclusive and equitable. Well-being is a multifaceted concept that reflects factors in someone’s life that contribute toward fulfillment (Burns et al., 2020). Maintaining well-being is critically important to ensuring that undergraduate students thrive and flourish in their lives. However, COVID-19 has “shifted the student well-being domain considerably due in part to the extensive pragmatic changes that have been introduced to curb the spread of COVID-19” (Burns et al., 2020, p. 6), including changes to SUREs, the focus of this paper.
Theoretical Model: Student Well-Being
Challenges experienced by college students during COVID-19 have highlighted the importance of providing educational contexts to promote student well-being. While well-being is often considered narrowly in terms of test scores or attendance, the education community has sought to broaden its conceptualization (Soutter et al., 2014). As such, programming for student well-being has expanded to focus on mental and physical health, resilience, risk reduction, social support, relationships, and engagement (Soutter et al., 2014). Soutter et al. (2014) introduced a conceptual framework to advance understanding of student well-being. While Soutter et al.’s (2014) SWBM has rarely been applied to higher education (e.g., Shek and Leung, 2015), it has applicability to this context. The seven domains of the SWBM are organized into three categories, which are individually and collectively integral to well-being (Soutter et al., 2014). Each domain captures a specific element of well-being, and the SWBM highlights how these seven domains of well-being are interconnected (Soutter et al., 2014).
In the SWBM, the category Assets includes the domains of Having, Being, and Relating. Assets are conditions and circumstances associated with well-being that are immediately accessible or invested for later use. Having captures the dimension of well-being that relates to resources, tools, and opportunities that the person has gained, such as through their income. Being relates to the conditions of someone’s life and their identity, which includes genetic makeup as well as social and cultural circumstances that shape sense of self, including racial/ethnic and health statuses. Relating emphasizes relationships and interpersonal connections that shape emotions, thoughts, and choice of actions, such as student–teacher relationships (Soutter et al., 2014).
The second category, Appraisals, includes two domains: Thinking and Feeling. These are the cognitive and affective interpretations of how and why Assets are valuable to well-being. Thinking captures how student well-being is related to opportunities to be creative and mindful as well as to be actively engaged in cognitive tasks requiring time, continued questioning, collaboration, and cross-disciplinary study. The Feeling domain emphasizes the emotional component of well-being and how students recognize, express, and manage their feelings, including mental health and contentedness (Soutter et al., 2014).
The third category, Actions, has two domains: Functioning and Striving. Actions relate to motivations to pursue well-being Assets (Soutter et al., 2014). Functioning relates to how well students engage in educational experiences that are not bounded by disciplines, generation, culture, or ideology, such that they extend beyond their immediate context. We consider grit to be an example of Functioning. Striving connects to students’ future goals and their abilities to stay motivated to achieve those goals in the face of adversity, such as gaining secure employment (Soutter et al., 2014). Table 1 illustrates specifically how variables we use in this study (which are used in other research on SUREs) fit within each of the seven domains in the SWBM.
Domain | Description | Variables used in this study (see Measures) |
---|---|---|
**Assets** | | |
Having | Relates to what students gain through their time in college | GPA; class standing (junior, senior) |
Being | Relates to the conditions of students’ lives and their identities | Underrepresented racial/ethnic minority status; first-generation status; LGBQ+ status; gender; international student status; pre-existing psychological condition |
Relating | Emphasizes relationships and interpersonal connections | Social support (MSPSS) |
**Appraisals** | | |
Thinking | Includes opportunities to be creative and actively engaged in cognitive tasks | Spring 2020 research participation; semesters of research experience; SURE at home university |
Feeling | Includes the emotional component of well-being | Anxiety severity; depression severity |
**Actions** | | |
Functioning | Includes how students engage in educational experiences | SURE status (canceled vs. modified); COVID-19 adverse event experiences score |
Striving | Captures students’ future goals and their abilities to stay motivated to achieve those goals | Graduate school intentions (applying; getting in) |
Research Questions
In this study, we focus on how program status affected student outcomes through two research questions. The first research question is: How were students affected by SURE program changes, including cancellations and modifications? This question relies on univariate analyses and characterizes well-being within the Functioning domain of the SWBM. The second question is: How did SURE status (i.e., program cancellation vs. modification) impact students’ mental health and plans for attending graduate school, while controlling for other variables in the SWBM? We address this question via multivariable analyses and hypothesize that students whose programs were canceled will report worse mental health (in the SWBM Feeling domain) and attenuated graduate school intentions (in the SWBM Striving domain) relative to those whose programs were modified.
The dependent variables for the second research question are in the domains of Feeling and Striving. We focus on those two domains for the dependent variables, as we believe they are particularly relevant to the well-being of undergraduate students in the context of the COVID-19 pandemic. The first dependent variable relates to mental health, a core element of student well-being within the Feeling domain. High rates of mental health problems have been reported by college students during the pandemic (Cao et al., 2020; Son et al., 2020; Giuntella et al., 2021) and before it (Hunt and Eisenberg, 2010; Xiao et al., 2017; Duffy et al., 2019). Few studies have examined the mental health of undergraduate researchers (Cooper et al., 2020a, b; Grineski et al., 2021). One paper studied undergraduate students at U.S. universities conducting research during Spring 2020 (during the first months of the pandemic) and found that 63% reported at least mild anxiety and 73% reported at least mild depression as of July 2020 (Grineski et al., 2021). Undergraduate researchers who reported more (vs. fewer) COVID-19–related adverse event experiences suffered from more severe anxiety and depression. Women and LGBQ+ students had more severe symptoms than men and non-LGBQ+ students (Grineski et al., 2021). While these three studies are important first steps, additional research on undergraduate researchers’ mental health is needed, especially in the context of COVID-19, which has dramatically impacted college student well-being. We begin to address that gap by investigating COVID-19–associated SURE cancellations and mental health.
Aspirations to attend graduate school are the second dependent variable, and they fit within the Striving domain of well-being. The transition from undergraduate to graduate education is a critical stage in students’ pursuit of STEM and research careers, and many undergraduates decide to leave science at this moment (Myers and Pavel, 2011). Undergraduate research is associated with graduate school enrollment and later success (Carter et al., 2009; Gilmore et al., 2015; Hernandez et al., 2018). Undergraduate research helps students gain mastery of research skills and develop close relationships with faculty members, which can lead them to graduate school; students interested in graduate school often seek out SUREs (Gentile et al., 2017). One study found that, as of July 2020, the pandemic had strengthened graduate school intentions among undergraduate researchers who were highly impacted by COVID-19 (Morales et al., 2021). This suggests that it is also important to examine how summer program status influenced undergraduate researchers’ graduate school intentions.
METHODS
Participants and Data Collection
We conducted a survey of undergraduate researchers attending U.S. universities, approved by the University of Utah’s Institutional Review Board (no. 00133477). We worked with directors of academic-year and SURE programs at 18 different universities to develop a sampling frame of 2440 undergraduate researchers. Some of those directors oversaw one program, and others oversaw many programs, depending on their roles (e.g., principal investigator on one training grant vs. director of an undergraduate research office) and the institution. We administered the survey online using QuestionPro between July 6 and July 31, 2020. Program directors emailed students three times using scripts provided by the authors. Students received the initial invitation to participate, and nonrespondents received two additional reminders one week apart. Student participants received a $20 Amazon gift card as an incentive. The survey covered domains including personal and educational circumstances in Spring 2020; research experiences during COVID-19; prior research experiences and mentoring; general COVID-19 experiences; mental health and social support; and sociodemographic characteristics. The survey took 30 minutes on average to complete. In total, 2246 students were initially emailed about the survey; six of those bounced back as undeliverable, and three were screened out because they were current graduate students. This left n = 2237. Of those, 188 students clicked on the survey but did not take it, while 1220 responded to the survey. Our response rate was 54.5%, and our cooperation rate was 86.6%.
In this paper, we focus on the subset of respondents who were accepted to Summer 2020 SUREs “to work with a faculty mentor, for pay, and through a formal program” and whose programs were canceled or modified (n = 457). We excluded 49 students who were missing data for more than 25% of the analysis variables included in the multivariable models, leaving an analysis n of 408 students. These students were accepted into more than 100 different SUREs.
In terms of participant characteristics, the three largest majors represented are life sciences (50%), engineering (23%), and math (12%). Just under half (49%) are underrepresented racial/ethnic minority students (i.e., from Hispanic, non-Hispanic Black, Native American, Native Hawaiian, Pacific Islander, multiracial, or other [non-Asian or non-White] racial backgrounds), while one-third (32%) are first-generation college students. In terms of gender, 36% are men, 63% are women, and 1% are another gender; 16% are LGBQ+. Less than one-tenth are international students, while just more than three-quarters are juniors and seniors.
Measures
Variables for the first research question come from a series of items we asked about each student’s summer program. We asked two blocks of questions: one for students whose programs ran under modified conditions, which addressed how their programs were modified and their satisfaction with their modified programs; and another for students whose programs were canceled, which reported on how their programs’ cancellations affected them. Descriptive statistics for the items are included in the Results.
Descriptive statistics for the variables used to answer the second research question are presented in Table 2. These variables are constructed from survey questions that were asked of all students, regardless of their program status. Appendix A in the Supplemental Material provides a complete list of survey items used to create those variables. Variables in each domain are described in the following sections.
 | N | Min. | Max. | Mean | SD | % Missing | Yes (%) | No (%) |
---|---|---|---|---|---|---|---|---|
**Independent variable** | | | | | | | | |
*Functioning domain* | | | | | | | | |
SURE canceled (vs. modified) | 407 | 0 | 1 | | | 0.2 | 193 (47.4) | 214 (52.6) |
**Control variables** | | | | | | | | |
*Having domain* | | | | | | | | |
GPA | 398 | 2.7 | 4 | 3.76 | 0.30 | 2.5 | N/A | N/A |
Junior^b | 407 | 0 | 1 | | | 0.2 | 166 (40.8) | 241 (59.2) |
Senior^b | 407 | 0 | 1 | | | 0.2 | 145 (35.6) | 262 (64.4) |
*Being domain* | | | | | | | | |
Underrepresented racial/ethnic minority | 407 | 0 | 1 | | | 0.2 | 200 (49.1) | 207 (50.9) |
First-generation student | 396 | 0 | 1 | | | 2.9 | 127 (32.1) | 269 (67.9) |
LGBQ+ | 401 | 0 | 1 | | | 1.7 | 66 (16.5) | 335 (83.5) |
Man^c | 408 | 0 | 1 | | | 0 | 145 (35.6) | 263 (64.4) |
Other gender^c | 408 | 0 | 1 | | | 0 | 4 (1.0) | 404 (99.0) |
International student | 408 | 0 | 1 | | | 0 | 32 (7.8) | 376 (92.2) |
Pre-existing psychological condition^d | 367 | 0 | 1 | | | 10.0 | 76 (20.7) | 291 (79.3) |
*Relating domain* | | | | | | | | |
Social support | 397 | 1.33 | 7 | 5.47 | 1.06 | 2.7 | N/A | N/A |
*Thinking domain* | | | | | | | | |
Did Spring 2020 research | 408 | 0 | 1 | | | 0 | 310 (76.0) | 98 (24.0) |
Semesters of research experience | 406 | 0 | 8 | 2.97 | 2.13 | 0.5 | N/A | N/A |
SURE at home university | 407 | 0 | 1 | | | 0.2 | 106 (26.0) | 301 (74.0) |
*Functioning domain* | | | | | | | | |
COVID-19 adverse event experiences score | 406 | 0 | 9 | 2.51 | 1.84 | 0.5 | N/A | N/A |
**Dependent variables** | | | | | | | | |
*Feeling domain* | | | | | | | | |
Anxiety severity | 403 | 6 | 27 | 13.42 | 4.63 | 1.2 | N/A | N/A |
Depression severity | 404 | 8 | 35 | 17.25 | 5.31 | 1.0 | N/A | N/A |
*Striving domain* | | | | | | | | |
Applying to graduate school | 406 | 1 | 5 | 2.93 | 1.23 | 0.5 | N/A | N/A |
Getting in to graduate school | 407 | 1 | 5 | 3.1 | 1.12 | 0.2 | N/A | N/A |
Independent Variable.
The focal independent variable is SURE program status, which falls in the Functioning domain. We asked students, “Which statement best characterizes the current status of your summer research opportunity?” Response options included “Research opportunity was canceled” and “Research opportunity has changed, but is still running.” We coded this variable as 1 = program was canceled and 0 = research opportunity has changed (i.e., program was modified).
Control Variables
Having Domain: Academic Standing.
We used two variables: self-reported cumulative grade point average (GPA) and self-reported classification (i.e., junior, senior, and freshman/sophomore/unclassified [reference category]).
Being Domain: Social Demographic Characteristics and Pre-existing Mental Health Conditions.
We used five sociodemographic variables in this domain. We coded race/ethnicity into two categories (i.e., 1 = Hispanic and non-Hispanic Black, Native American, Native Hawaiian, Pacific Islander, multiracial, and other race; 0 = non-Hispanic White or Asian). We examined first-generation status based on the highest level of schooling for each parent, such that first-generation students (1 = yes; 0 = no) are those for whom neither parent earned a bachelor’s degree. We also included LGBQ+ status (1 = gay, bisexual, lesbian, pansexual, asexual, or other sexuality; 0 = not LGBQ+), gender (i.e., man; woman [reference category]; other gender [trans, genderqueer, gender nonconforming, and other]), and international student status (1 = yes; 0 = no). In the models predicting mental health, we also asked about pre-existing conditions to create a variable assessing lifetime presence of a psychological disorder (1 = psychological disorder [e.g., anxiety, depression, PTSD]; 0 = no).
Relating Domain: Social Support.
When predicting mental health outcomes, we included social support using the Multidimensional Scale of Perceived Social Support (MSPSS), initially designed for use with university students (Zimet et al., 1988; Dahlem et al., 1991). The MSPSS has demonstrated strong factorial validity and good internal and test–retest reliability (Zimet et al., 1988). Following others (Zimet et al., 1988; Dahlem et al., 1991), we calculated MSPSS scores by averaging responses to the 12 items, each scored on a seven-point Likert scale ranging from very strongly disagree (1) to very strongly agree (7); higher scores correspond to more social support. The scale had high internal consistency in this sample (Cronbach’s α = 0.898).
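As a concrete illustration, the scale score and the internal-consistency check can be sketched as follows. This is a minimal example with made-up responses; the column names are hypothetical, not from our survey instrument:

```python
import numpy as np
import pandas as pd

# Hypothetical responses: rows = students, columns = the 12 MSPSS items,
# each scored 1 (very strongly disagree) to 7 (very strongly agree).
rng = np.random.default_rng(0)
items = pd.DataFrame(rng.integers(1, 8, size=(100, 12)),
                     columns=[f"mspss_{i}" for i in range(1, 13)])

# Scale score: average across the 12 items; higher = more social support.
mspss_score = items.mean(axis=1)

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = df.shape[1]
    item_var = df.var(axis=0, ddof=1).sum()
    total_var = df.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

alpha = cronbach_alpha(items)
```

With real item responses, alpha near 0.9 (as in our sample) indicates high internal consistency; the random data here would not.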
Thinking Domain: Previous Research Experience.
We used three variables: the total number of semesters each student had conducted research before Spring 2020, counting each summer as one semester; an indicator for whether they conducted research during Spring 2020 (1 = yes; 0 = no), which we treated separately given the COVID-19 disruptions that semester; and whether the 2020 SURE program was at the student’s own home institution (i.e., the university a student attended during the academic year; = 1) or another institution (= 0).
Functioning Domain: Adverse Event Experiences.
We created an adverse event experiences score by summing responses (1 = yes; 0 = no) to items representing a potential impact of COVID-19, following Grineski et al. (2021). We asked students: “Have you ever had the following experiences during COVID-19?,” and the options included items including “difficulty traveling,” “experienced a salary cut,” and “took care of others who were sick.” See Grineski et al. (2021) for a complete list of items. At the time, there were no pre-existing survey items to assess adverse event experiences specifically caused by COVID-19 among college students, so we created these items based on a review of the gray literature and our experiences as faculty who worked closely with students during the pandemic. We revised the list of experiences iteratively with feedback from undergraduate and graduate students.
Dependent Variables
Feeling Domain: Mental Health.
We examined both anxiety and depression severity as dependent variables using pre-existing scales. For anxiety, we used the seven-item Generalized Anxiety Disorder Scale (GAD-7; Spitzer et al., 2006), one of the most widely used instruments for anxiety disorder screening (Cao et al., 2020) due to its criterion, construct, factorial, and procedural validity (Spitzer et al., 2006). We calculated the GAD-7 score by summing the seven items, each scored from 0 (not at all) to 3 (nearly every day). For depression, we used the nine-item Patient Health Questionnaire (PHQ-9; Spitzer et al., 1999), a self-administered depression scale; each of the nine criteria is scored from 0 (not at all) to 3 (nearly every day), and then scores are summed (Kroenke et al., 2001). Both scales had high internal consistency in this sample (Cronbach’s α = 0.913 for the GAD-7 and 0.891 for the PHQ-9) and have been validated in other college student samples (Keum et al., 2019; Sriken et al., 2021).
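Score construction for both scales is a simple item sum; a minimal sketch, assuming the 0–3 item coding described above (column names are hypothetical):

```python
import pandas as pd

# Hypothetical GAD-7 item responses for two students,
# each item scored 0 (not at all) to 3 (nearly every day).
df = pd.DataFrame({
    "gad_1": [0, 2], "gad_2": [1, 3], "gad_3": [0, 2], "gad_4": [2, 3],
    "gad_5": [0, 1], "gad_6": [1, 2], "gad_7": [0, 3],
})

gad_cols = [f"gad_{i}" for i in range(1, 8)]
# GAD-7 severity: sum of the seven items.
df["gad7"] = df[gad_cols].sum(axis=1)
# → gad7 is 4 for the first student and 16 for the second
```

The PHQ-9 is scored the same way over its nine items.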
Striving Domain: Graduate School Intentions.
We treat graduate school intentions as dependent variables. We asked each student how they thought their summer program’s status (i.e., the cancellation of, or the changes to, their summer research opportunity, as applicable) would affect their future plans in terms of the likelihood of applying to graduate school. We asked the same question about “getting in to graduate school.” Both measures were assessed on a five-point scale, which we reverse coded so that responses range from 1 (greatly increased the likelihood) to 5 (greatly reduced the likelihood).
Data Analysis
To answer the first research question, we used univariate statistics, including frequencies and means. To answer the second research question, we used multivariable generalized estimating equations (GEEs). We used IBM SPSS Statistics 25 (for Windows 10) to run all analyses. GEEs extend from the generalized linear model to adjust for clustering (Garson, 2012). We used the student’s home institution (n = 131) and major/discipline grouping (n = 6) to define clusters. We created the six categories of majors by recoding student responses to a list of 90 majors taken from the CIRP Freshman Survey (University of California at Los Angeles, 2020) into the following groupings: life sciences, engineering, health professions, social/behavioral sciences, math/computer science/physical science, and all other majors. The number of students in each cluster ranged from one to 44 students.
We ran four GEEs, one for each dependent variable. We included the independent variable and all of the control variables in the models predicting anxiety and depression severity. For the two graduate school intention variables, we used the same variables, except for social support and pre-existing mental health conditions, as those are mental health specific.
Due to missing data (see Table 2 for percent missing for each variable), we applied multiple imputation (MI) before running the GEEs. We used MI, because missing values across a set of variables can substantially reduce the sample size, statistical power, and precision and introduce bias if the values are not missing completely at random (Sterne et al., 2009). MI involved using a regression-based approach to create multiple sets of values for missing observations (Enders, 2010). We created 20 multiply imputed data sets, each with 200 iterations, and the imputed values at the maximum iteration were saved to the imputed data set (Enders, 2010). When analyzing MI data in multivariable models like GEEs, the standard errors take into account the uncertainty associated with the missing values (Rubin, 1987). To explore whether model results were sensitive to our use of MI, we also replicated the four GEEs, including only cases with complete data in a sensitivity analysis.
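We ran MI in SPSS; an analogous workflow can be sketched in Python. This is a simplified illustration of the general approach (regression-based imputation with posterior draws, then pooling across data sets), not a reproduction of the SPSS procedure:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Toy data matrix with ~10% of values missing.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
X[rng.random(X.shape) < 0.1] = np.nan

# Create 20 imputed data sets; sample_posterior=True draws imputed values
# from the posterior predictive distribution, so the sets differ (as in
# proper multiple imputation).
imputed_sets = [
    IterativeImputer(sample_posterior=True, random_state=m).fit_transform(X)
    for m in range(20)
]

# Pooling the point estimate (Rubin's rules): average the quantity of
# interest across the imputed data sets.
pooled_means = np.mean([d.mean(axis=0) for d in imputed_sets], axis=0)
```

In a full analysis, the model of interest is fit to each imputed data set and both estimates and their standard errors are pooled, with the latter incorporating between-imputation variance.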
GEE models use an intracluster correlation matrix that we specified as exchangeable, which assumes constant intracluster dependency (Garson, 2012). To refine fit, we tested normal, gamma, and inverse Gaussian distributions with logarithmic (log) and identity link functions (Garson, 2012). The best-fitting specification for all models was inverse Gaussian with a log link. Based on tolerance and variance inflation factor criteria, results from the GEEs are not affected by multicollinearity (Hair et al., 2013).
RESULTS
Research Question 1
Well-Being and Summer Research Program Cancellations.
Just under half of students (47.4%) who had been accepted into a SURE that did not run “as usual” had their programs canceled. The majority of those students whose programs were canceled had been planning to attend their SURE at another university (88.6%). In terms of losses students experienced due to SURE cancellations, 93.8% reported having lost the opportunity to learn new skills, and 92.7% reported having lost the opportunity to network with students and professors. Approximately 81% reported that they had lost their summer employment; 50.5% said the cancellation opened up new opportunities for the summer (e.g., summer classes); and 13.9% lost housing for the summer.
Well-Being and Modified Summer Research Programs.
More than half of the students whose SUREs did not run “as usual” reported that their programs moved forward with modifications. Of those students, 61% participated in a modified SURE at another university, while the rest had a modified SURE at their home institutions. In terms of how the SUREs were modified, 82.6% of students participated in a program where only remote research was allowed. More than half (58.5%) said that the program reduced the number of hours that students were expected to work. Under half (41.4%) saw a reduction in pay as compared with the initial offer. Approximately one-quarter (22.8%) lost subsidized housing due to the changes to their SUREs.
Table 3 reports students’ satisfaction with their modified SUREs. The average ratings fell between “neutral” (3) and “satisfied” (4) for the programs overall and for the different elements of the programs. Students were most satisfied with their relationships with their faculty mentors (68.4% satisfied) and least satisfied with opportunities to publish (32.7% dissatisfied) and present (28.2% dissatisfied) their research.
 | Mean^a | SD | % Dissatisfied^b | % Neutral | % Satisfied^c |
---|---|---|---|---|---|
Research experience in general | 3.41 | 0.987 | 19.6 | 31.3 | 49.1 |
Relationship with your faculty mentor | 3.89 | 1.001 | 8.4 | 23.2 | 68.4 |
Relationship with other lab/research members | 3.56 | 1.059 | 14.5 | 31.8 | 53.7 |
Opportunity to learn new research skills | 3.36 | 1.161 | 25.7 | 21.5 | 52.8 |
Opportunity to share your research through publications | 2.99 | 1.088 | 32.7 | 36.9 | 30.4 |
Opportunity to share your research through presentations | 3.18 | 1.110 | 28.2 | 29.1 | 42.7 |
Opportunity to clarify your future career path | 3.39 | 1.177 | 25.2 | 21.1 | 53.7 |
Provision of laptop or other research supplies to conduct research | 3.30 | 0.957 | 14.0 | 48.6 | 37.4 |
Research Question 2
SURE Status and Mental Health (the Feeling Domain).
Table 4A shows that students whose SUREs were canceled had reduced anxiety severity as compared with students with modified SUREs. Specifically, they had GAD-7 scores that were 8% lower than those of students with modified programs (p < 0.05). We examined this finding using estimated marginal means. When all other variables were held at their means, the average anxiety score for students with canceled SURE programs was 12.36 (anxiety score range: 6–27); that score increased to 13.49 when the program was modified instead. As shown in Table 4B, the association between cancellation and depression severity was only marginally significant (p < 0.10) but was in the same direction, such that cancellation was associated with reduced depression severity. Students with more adverse event experiences had greater anxiety (p < 0.001) and depression (p < 0.001) severity.
| | (A) Anxiety | | | | (B) Depression | | | | (C) Applying to graduate school | | | | (D) Graduate school acceptance | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | B | LCI, UCI | p | Exp(B) | B | LCI, UCI | p | Exp(B) | B | LCI, UCI | p | Exp(B) | B | LCI, UCI | p | Exp(B) |
| Intercept | 2.961 | 2.35, 3.57 | *** | 19.32 | 3.396 | 2.95, 3.84 | *** | 29.84 | 0.825 | 0.20, 1.45 | ** | 2.28 | 0.798 | 0.25, 1.35 | ** | 2.22 |
| Independent variable | | | | | | | | | | | | | | | | |
| F: SURE canceled | −0.081 | −0.16, −0.01 | * | 0.92 | −0.066 | −0.14, 0.01 | | 0.94 | 0.181 | 0.08, 0.28 | *** | 1.20 | 0.318 | 0.25, 0.39 | *** | 1.37 |
| Control variables | | | | | | | | | | | | | | | | |
| H: GPA | −0.055 | −0.20, 0.09 | | 0.95 | −0.095 | −0.20, 0.01 | | 0.91 | 0.080 | −0.08, 0.24 | | 1.08 | 0.068 | −0.07, 0.20 | | 1.07 |
| H: Junior^b | 0.021 | −0.07, 0.11 | | 1.02 | 0.006 | −0.07, 0.09 | | 1.01 | −0.168 | −0.28, −0.06 | ** | 0.85 | −0.091 | −0.18, 0.01 | * | 0.91 |
| H: Senior^b | 0.024 | −0.07, 0.12 | | 1.02 | −0.002 | −0.10, 0.10 | | 1.00 | −0.183 | −0.30, −0.07 | ** | 0.83 | −0.113 | −0.22, −0.01 | * | 0.89 |
| B: Underrepresented racial/ethnic minority | 0.031 | −0.03, 0.10 | | 1.03 | 0.131 | 0.06, 0.20 | *** | 1.14 | −0.095 | −0.19, −0.01 | * | 0.91 | −0.065 | −0.14, 0.01 | | 0.94 |
| B: First-generation student | 0.005 | −0.07, 0.08 | | 1.01 | 0.016 | −0.05, 0.08 | | 1.02 | 0.063 | −0.03, 0.16 | | 1.07 | 0.073 | −0.02, 0.16 | | 1.08 |
| B: LGBQ+ | 0.082 | −0.03, 0.19 | | 1.09 | 0.020 | −0.05, 0.09 | | 1.02 | 0.010 | −0.10, 0.12 | | 1.01 | 0.009 | −0.08, 0.10 | | 1.01 |
| B: Man^c | −0.107 | −0.18, −0.04 | ** | 0.90 | −0.035 | −0.10, 0.03 | | 0.97 | −0.019 | −0.12, 0.08 | | 0.98 | −0.034 | −0.12, 0.05 | | 0.97 |
| B: Other gender^c | −0.077 | −0.34, 0.19 | | 0.93 | −0.087 | −0.35, 0.17 | | 0.92 | −0.208 | −0.43, 0.02 | | 0.81 | −0.213 | −0.61, 0.18 | | 0.81 |
| B: International student | 0.046 | −0.07, 0.17 | | 1.05 | 0.025 | −0.08, 0.13 | | 1.03 | −0.137 | −0.32, 0.05 | | 0.87 | −0.053 | −0.20, 0.10 | | 0.95 |
| B: Pre-existing psychological condition | 0.187 | 0.08, 0.29 | *** | 1.21 | 0.118 | 0.03, 0.21 | * | 1.13 | | | | | | | | |
| R: Social support | −0.057 | −0.09, −0.02 | *** | 0.94 | −0.060 | −0.09, −0.03 | *** | 0.94 | | | | | | | | |
| T: Did Spring 2020 research | −0.036 | −0.12, 0.05 | | 0.96 | −0.074 | −0.16, 0.01 | | 0.93 | −0.059 | −0.18, 0.06 | | 0.94 | −0.036 | −0.14, 0.07 | | 0.96 |
| T: Semesters of research experience | 0.018 | −0.001, 0.04 | | 1.02 | 0.016 | −0.01, 0.03 | | 1.02 | 0.019 | −0.01, 0.04 | | 1.02 | 0.004 | −0.02, 0.03 | | 1.00 |
| T: SURE at home university | 0.102 | 0.02, 0.19 | * | 1.11 | 0.085 | 0.01, 0.16 | * | 1.09 | 0.102 | −0.01, 0.21 | | 1.11 | 0.056 | −0.06, 0.17 | | 1.06 |
| F: COVID-19 adverse event experiences score | 0.050 | 0.03, 0.07 | *** | 1.05 | 0.044 | 0.025, 0.06 | *** | 1.04 | −0.001 | −0.02, 0.02 | | 1.00 | 0.007 | −0.01, 0.03 | | 1.01 |
SURE Status and Graduate School Intentions (the Striving Domain).
Students with canceled SUREs perceived that their program status would more negatively affect their likelihood of applying to graduate school (Exp(B) = 1.2, p < 0.001, Table 4C) and getting into graduate school (Exp(B) = 1.4, p < 0.001, Table 4D) than did students who participated in modified SUREs. Recall that the dependent variables are reverse coded, such that positive coefficients reflect a reduced likelihood. We used estimated marginal means to predict the average “applying to graduate school” value when students’ programs were canceled versus modified, holding all other variables at their means. “Applying to graduate school” was measured on a scale of 1–5 (with higher scores signaling a reduction in intentions); the estimated mean value was 3.21 for students with canceled programs and 2.68 when programs were modified. For “graduate school acceptance,” assessed on the same scale, the average score was 3.62 for those from canceled programs and 2.63 for those with modified programs, when all other variables were at their means.
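The percent differences and effect sizes reported for the focal variable follow directly from exponentiating the log-link regression coefficients in Table 4. As an illustration (a sketch, not part of the original analysis), a short Python snippet reproduces the Exp(B) values for “SURE canceled”:

```python
import math

# Coefficients (B) for "SURE canceled" from Table 4. The outcome models use a
# log link, so exp(B) gives a multiplicative effect on the outcome (the Exp(B)
# column), and (exp(B) - 1) * 100 gives the percent difference.
b_anxiety = -0.081  # Model A: anxiety severity (GAD-7)
b_apply = 0.181     # Model C: applying to graduate school (reverse coded)
b_accept = 0.318    # Model D: graduate school acceptance (reverse coded)

for label, b in [("anxiety", b_anxiety),
                 ("applying", b_apply),
                 ("acceptance", b_accept)]:
    exp_b = math.exp(b)
    pct = (exp_b - 1) * 100
    print(f"{label}: Exp(B) = {exp_b:.2f}, percent difference = {pct:+.1f}%")
```

For anxiety, exp(−0.081) ≈ 0.92, i.e., roughly 8% lower scores for canceled programs; because the graduate school outcomes are reverse coded, the positive multipliers (≈1.2 and ≈1.4) indicate a greater perceived reduction in likelihood.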
Sensitivity Analysis.
When using only data for which we had complete cases, findings generally agreed with the MI results presented in Table 4. In terms of our focal independent variable, SURE program status, results were identical in terms of the direction of effect and statistical significance (p < 0.05) across the four models when we used and did not use MI. There were only small shifts in control variable findings between the MI results (Table 4) and the complete case results (described next). When predicting anxiety (n = 338), underrepresented racial/ethnic minority status became significant and positive, and SURE at home university lost significance. When predicting depression (n = 336), SURE at home university lost significance. When predicting applying to graduate school (n = 379), there were no differences. When predicting acceptance into graduate school (n = 380), senior lost significance. This suggests that results are not particularly sensitive to our use of MI, although the loss of several findings in the complete case analysis likely reflects how statistical power is improved through the use of MI (Zha and Harel, 2021).
DISCUSSION
Our analysis addressed two research questions. The first was: How were students affected by SURE program changes, including cancellations and modifications? The second was: How did SURE status (i.e., program cancellation vs. modification) impact students’ mental health and plans for attending graduate school? We answered these questions using data from a U.S. nationwide survey conducted during Summer 2020.
In terms of answering our first research question, we found that just under half of students accepted into 2020 SUREs that did not run “as usual” had their programs canceled. Program cancellations were more common among students who had planned to travel to another university for their SUREs than among those who planned to stay at their home universities, suggesting that programs targeting national cohorts may have been more likely to cancel than those targeting local cohorts. These students reported substantial losses, yet more than half of them reported that SURE cancellation opened up new opportunities, suggesting that they were making the best of a difficult situation. Students who participated in a modified SURE were somewhat satisfied with their programs (Table 3). These findings suggest that students were Functioning (i.e., engaging in educational experiences that expanded their horizons; Soutter et al., 2014) relatively well, or at least not poorly, given challenging circumstances.
In terms of the second research question, we found that participation in modified SUREs was associated with greater anxiety severity (Table 4A), which ran counter to our hypothesis. Elements of modified programs could certainly be anxiety inducing and thus harmful in the Feeling domain of well-being. This domain is important to well-being, as it captures the emotional component of the concept (Soutter et al., 2014). The vast majority of students with modified SUREs were doing only remote research, which was not their original plan or the original plan of their mentors. A study of undergraduate researchers at Washington University in St. Louis, conducted in May and June of 2020, revealed that altered communication due to remote mentoring relationships was a major change in undergraduate students’ research experiences in the early months of the pandemic. Students indicated either that communication from their mentors was slower or nonexistent, which posed a major challenge to their research progress, or that, while they communicated regularly with their mentors over email or videoconference, they missed speaking with other research team members (Speer et al., 2021). Housing and financial challenges among students with modified SUREs may have added to their anxiety. A full one-fifth of students in our study were “dissatisfied” with their modified programs, which implies that their experiences in programs could have added to their distress. However, we cannot rule out that doing research itself via any SURE is to some degree anxiety inducing (apart from any elements unique to modified programs), as we assume that students in the canceled group did not do research during that summer. Either way, findings suggest that program directors should expect anxiety symptoms among participants in modified programs, regardless of the source.
How high were the levels of anxiety (and depression) symptomology experienced by these students in relative terms? There is no published baseline for anxiety and depression severity among undergraduate researcher populations pre-COVID-19. Therefore, we must compare our data with statistics on U.S. college students pre-COVID-19; while not a perfect comparison, undergraduate researchers are included within this broader group. A U.S. national study of college students using the same scales as ours found that students scored 7.69 on the GAD-7 (anxiety) and 8.45 on the PHQ-9 (depression) during the 2017–2018 school year (Duffy et al., 2019). The average values here were 13.42 and 17.25 for anxiety and depression, respectively (Table 2), nearly double those baseline values. This dramatic difference likely reflects the pandemic’s negative impact on college student mental health (Cao et al., 2020; Son et al., 2020), rather than differences between researcher and non-researcher student populations. While students with canceled programs experienced significantly less anxiety and marginally less depression than those in modified programs, the average student in the sample suffered from moderate to moderately severe anxiety and depression, regardless of program cancellation or modification.
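As a quick arithmetic check on this comparison, the ratios of the sample means to the pre-pandemic national baselines can be computed directly (values taken from Duffy et al., 2019 and Table 2 of this study; the sketch below is illustrative only):

```python
# Pre-pandemic national means (Duffy et al., 2019) vs. this sample (Table 2).
baseline = {"GAD-7 (anxiety)": 7.69, "PHQ-9 (depression)": 8.45}
sample = {"GAD-7 (anxiety)": 13.42, "PHQ-9 (depression)": 17.25}

for scale in baseline:
    ratio = sample[scale] / baseline[scale]
    print(f"{scale}: {ratio:.2f}x the pre-pandemic baseline")
```

The GAD-7 mean is about 1.75 times the baseline and the PHQ-9 mean about 2.04 times, supporting the characterization of the sample values as nearly double pre-pandemic levels.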
Also in response to the second research question, we found that students with canceled programs reported the belief that their programs’ status would negatively influence their likelihood of applying to and being accepted into graduate school to a greater degree than students with modified programs (Table 4, C and D), as we had hypothesized. This Striving finding aligns with other research showing that being highly affected by COVID-19 reduced students’ graduate school intentions during the pandemic (Morales et al., 2021). Our findings may reflect students’ awareness of how undergraduate research participation facilitates the transition to graduate school (Gilmore et al., 2015; Hernandez et al., 2018). We can infer that these students believed that participation in modified programs versus cancellation during Summer 2020 was advantageous in terms of gaining acceptance to graduate school. The difference in effect size for the program status variable between the “gaining acceptance” and “applying” (i.e., larger for “gaining acceptance”) models suggests that students with canceled programs were more worried about how program cancellation would impact decisions made about their application by admissions committees than they were about their own likelihood of applying.
While acceptance decisions are out of students’ hands, students are in control of the decision to apply to graduate school. Thus, the belief we found among students with canceled programs (relative to those with modified programs) that they would be less likely to apply to graduate school is particularly concerning (even though the effect size is smaller), as it suggests that these students’ Striving was reduced and that they were more likely to disadvantage themselves by not submitting graduate school applications in the first place. This reflects how canceled programs harmed student well-being in the Striving domain, which captures the ability to stay motivated to achieve goals in the face of adversity (Soutter et al., 2014).
The findings presented in this paper allow us to respond to Burns et al.’s (2020) call to use research to understand which provisions would be most suitable within a higher education context during COVID-19 to enhance student well-being. Our results, when paired with a small but growing body of research on modified SURE participants (Yang Yowler et al., 2021; Erickson et al., 2022) and virtual mentorship (Speer et al., 2021), suggest that remote SUREs are a reasonable and workable option under conditions wherein face-to-face contact is difficult. Given that many programs transitioned online in Summer 2020 with very little lead time, it stands to reason that online undergraduate research programs can be even better when designed with more generous lead times.
Research on student well-being in SUREs during COVID-19 is important, as it is likely that COVID-19 and other disruptions will be part of the SURE landscape for the foreseeable future. While some guidance exists for how to run remote SUREs (Qiang et al., 2020; Erickson et al., 2022), this analysis complements those studies by clarifying the impacts of summer program cancellation on students and by providing additional evidence that remote programs are a reasonable path forward. More generally, our findings, informed by the SWBM, speak to the importance of researchers, faculty mentors, and program directors taking a student wellness–centered approach when pivoting due to unforeseen circumstances. These circumstances can occur on a more macro-scale, such as in response to COVID-19 or a specific disaster, and on the micro-scale of the individual student (e.g., a family emergency, caregiving responsibilities, or individual health challenges). Being attuned to student wellness is particularly important for informing a discourse around how researchers, mentors, and program directors should interact with mentees as whole humans who need flexibility and empathy. A perspective grounded in well-being can inform undergraduate research programs that are inclusive and equitable in order to more fully support diverse students as knowledge creators.
Limitations
First, we asked students to reflect on SUREs in July 2020, when those doing modified programs were still engaged in the programs, so their perspectives could have changed by the ends of their programs. We also asked students about the SURE that they were initially accepted into when COVID-19 hit. We did not ask whether they had switched to another program at a later date. We conducted the survey in July 2020, due to concerns about recall bias regarding their pandemic experiences. Second, we focused on a limited number of short-term outcomes. Third, while we designed the survey to capture multiple genders, small numbers of other gender students limited analyses of this group. Fourth, there are unmeasured confounders occurring at the same time as the pandemic (e.g., political turmoil, heightened national consciousness of systemic racism) that could have influenced outcomes. Fifth, our response rate was 54%, and we lack data on why just under half of the students did not participate in the study. Possible reasons include not checking email or feeling too overwhelmed with pandemic-related challenges. Sixth, while our sample includes students from a wide variety of institutional contexts across the United States, it is not necessarily nationally representative. Generally speaking, the universities providing the most participants were research-intensive second-tier state universities, flagship state universities, and private universities. The sample also includes students with relatively extensive research experience, as they averaged three semesters of research (with a median of two semesters) before Spring 2020. It is possible that SURE program changes could have impacted those with less research experience to a greater degree than those with more, but our sample does not permit these analyses due to small counts of students with little research experience.
Finally, we do not know whether certain social groups were more sensitive to the effects of program status on the outcomes than others. For example, students with pre-existing mental health conditions had higher GAD-7 (anxiety) and PHQ-9 (depression) scores (Table 4). It is not clear how influential these differences were on associations between program status and anxiety and depression severity. Because only 32 students reported pre-existing conditions, we could not conduct a multivariable sensitivity analysis. Looking at bivariate comparisons stratified by pre-existing conditions (unpublished data), students (with and without pre-existing conditions) in modified programs had higher anxiety and depression severity scores than their counterparts from canceled programs. Differences in severity scores between programs with different status were larger for those with pre-existing conditions, suggesting they may have been more sensitive to the effects of program modification.
Recommendations
Our findings suggest three recommendations for undergraduate research program directors, staff, and faculty to enhance student well-being, all of which are also applicable to academic-year research experiences.
Address Potential Causes of Mental Health Challenges.
Recognizing that modified undergraduate research programs are likely to be anxiety inducing, undergraduate research programs can use orientation sessions to address potential challenges with remote and hybrid research contexts and to set expectations. More broadly, addressing anxiety among undergraduate researchers conducting research in any modality is important, as this mental health condition widely affects college students today. Sharing information about university counseling services with all research participants is also essential to promote well-being. Faculty members, including research mentors, are well positioned to share mental health resources with students as well (Chirikov et al., 2020). Program directors can even provide those materials directly to faculty, who may not know where to turn for such information (Kalkbrenner et al., 2021). While faculty mentors are not mental health professionals, they have an important role to play by being attentive to mental health issues, empathetic, and connectors who link students with resources on campus. A national survey of more than 14,000 faculty revealed that more than half (63%) were not engaged in referring their students to mental health services, although the vast majority (87%) thought doing so was part of a faculty member’s role (Albright and Schwartz, 2017). When faculty work with small numbers of students (as is the case in undergraduate research), they feel more comfortable broaching mental health issues with students and referring them to services (Kalkbrenner et al., 2021).
Programs can also sponsor trainings in self-care and well-being for trainees. It is important to be cognizant of the high levels of depression and anxiety symptoms among undergraduate researchers, apart from any additional impact of modified research experiences. Research programs and faculty mentors must prepare to support growing numbers of trainees with mental health challenges, as more and more college students experience mental health problems (Hunt and Eisenberg, 2010; Xiao et al., 2017).
Be an Advocate for Student-Centered Adjustments to Graduate Admission Processes.
To address concerns about how the pandemic will influence students’ Striving via matriculation to graduate school, we recommend that faculty mentors advocate for adjustments to graduate admissions at their universities to account for the effects of COVID-19. These adjustments could include waiving Graduate Record Examination requirements, if they are still in place; recognizing that those attending college in places with strict COVID-19 protocols were kept out of in-person research for some time; waiving application fees due to economic barriers; and acknowledging that letters of reference from faculty mentors might be less personal due to a predominance of virtual interactions during 2020 and 2021.
Use Experiences Learned during COVID-19 as a Springboard to Broaden Participation.
Satisfaction data suggest that there is room for improvement in modified SUREs, which is not surprising, given how quickly they were developed for Summer 2020. In the future, programs can do a better job of promoting student Functioning from the outset by ensuring that students have access to needed laptops and research supplies. Actions at the university level have facilitated student access to laptops. Some institutions have pivoted to requiring that all entering students have a laptop that meets specific requirements (e.g., the College of Information Sciences and Technology at the University of Pennsylvania). Others have provided laptops directly to students at no charge thanks to donor support (e.g., Marietta College, Ohio) or federal COVID-19 relief funds (e.g., Cheyney University, Pennsylvania). The president of Cheyney, the nation’s oldest HBCU, recognized that the laptops were an essential resource for students engaging in undergraduate research, among other activities (Cheyney University, 2022).
To improve interactions between mentors and mentees in an online environment and promote well-being in the Relating domain, students like platforms such as Slack, GroupMe, WhatsApp, or Discord to facilitate informal communication (Erickson et al., 2022), and they prefer videoconferencing, rather than email, when communicating with mentors (Speer et al., 2021). These platforms are useful for facilitating informal social interaction among researchers, even if they also meet in person. SURE students in a National Institute of Environmental Health Sciences R25 program at the University of Utah have used WhatsApp to communicate with one another, even after COVID-19 restrictions eased in 2021. Professional development curricula, which are a staple of SUREs, can be delivered via teleconference and via an online framework to help students find more clarity in their career paths. This is currently happening at the University of Utah, where the Office of Undergraduate Research Education Series consists of live in-person and virtual (Zoom) workshops, as well as “on demand” (prerecorded) workshops. The literature already provides guidelines for how to best structure online/remote research experiences for undergraduates. These include borrowing strategies from distance-learning resources (Qiang et al., 2020) and streamlining engagement to focus on remote data analysis, literature reviews and science writing, and science journal clubs (Chandrasekaran, 2020). Erickson et al. (2022) also provide a detailed list of recommendations for remote SUREs from Research Experiences for Undergraduates (REU) students within 10 areas that we recommend reviewing. This “forced innovation” in SUREs could yield a “pandemic dividend” (i.e., long-term, unintended benefits emerging from the pandemic), as others have recognized with online educational innovation during COVID-19 (Yang and Huang, 2020).
Above all, we can conceive of modified SUREs developed on the fly during COVID-19 as a pilot for future online SUREs. While there is some resistance to this at the faculty level (Kamariah and Azlina, 2022), remote/online research opportunities can provide a model for creating research opportunities for students who were previously unable to participate in SUREs (e.g., nontraditional students with dependents who cannot quit their jobs and move across the country for 10 weeks). If programs are willing to operate under a hybrid model in the future, a silver lining of the pandemic could be expanded SURE opportunities for a greater diversity of students.