
A Meta-analysis of University STEM Summer Bridge Program Effectiveness

    Published online: https://doi.org/10.1187/cbe.20-03-0046

    Abstract

    University science, technology, engineering, and math (STEM) summer bridge programs provide incoming STEM university students additional course work and preparation before they begin their studies. These programs are designed to reduce attrition and increase the diversity of students pursuing STEM majors and STEM career paths. A meta-analysis of 16 STEM summer bridge programs was conducted. Results showed that program participation had a medium-sized effect on first-year overall grade point average (d = 0.34) and first-year university retention (odds ratio [OR] = 1.747). Although this meta-analytic research reflects a limited amount of available quantitative academic data on summer STEM bridge programs, this study nonetheless provides important quantitative inroads into much-needed research on programs’ objective effectiveness. These results underscore the importance of thoughtful experimental design and suggest how further research might guide STEM bridge program development to increase the success and retention of matriculating STEM students.

    A META-ANALYSIS OF UNIVERSITY STEM SUMMER BRIDGE PROGRAM EFFECTIVENESS

    Over the past few decades, many federal agencies (e.g., the National Science Foundation, the National Institutes of Health) have called for an increase in the overall number of workers in science, technology, engineering, and math (STEM), particularly from underrepresented minorities (for the purposes of this paper, we refer to Black or African-American, Hispanic or Latinx, and Native American students as underrepresented minority students; National Science Board, 2015). STEM retention is a national concern, because there is a shortage of STEM workers qualified to engage with the next generation of technological and scientific advancements (National Science Board, 2015). Research suggests that almost a third of matriculating STEM students leave STEM by the end of their first year in college (National Science Board, 2018). In part, leaving STEM fields may be a function of students exploring a wide range of interests and majors in college; however, systemic factors (e.g., lack of campus resources oriented toward diverse students) also may impact the success of underrepresented minority students, who leave STEM majors and careers at higher rates than White and Asian students. Although the overall STEM dropout rate is high, the underlying differences in subgroup dropout rates are remarkable: of underrepresented minority students matriculating as STEM majors, only about one-fifth will ultimately go on to earn a STEM degree, compared with one-third of White students and nearly half of Asian students (Higher Education Research Institute, 2010). Underrepresented minority students are also more likely than White or Asian-American students to be the first in their immediate families to attend college and are less likely to have the economic or social support at home compared with later-generation college students (Jack, 2019). In response to these concerns, U.S. colleges and universities have developed a wide array of approaches to decrease STEM major attrition, including implementing STEM-specific summer bridge programs for matriculating first-year students. Although specific program goals and factors addressed vary (Ashley et al., 2017), STEM bridge programs are educational interventions designed to increase graduation rates and diversity in STEM majors and postsecondary careers (Sablan, 2014; Ashley et al., 2017).

    Despite the need for data and analyses informing the effectiveness of summer bridge programs, limited empirical research is available, and much of this research is in the form of highly descriptive accounts, qualitative results, and literature reviews (Sablan, 2014; Kitchen et al., 2018) rather than systematic and quantitative evaluations of bridge program success (Gullatt and Jan, 2003). To our knowledge, no meta-analysis has been conducted on STEM bridge programs. In the current paper, we examine the objective academic impact of STEM bridge program participation to reinforce and extend other informative work such as Ashley et al.’s (2017) systematic review of STEM bridge programs’ goals, student characteristics, research designs, and program success. We limit our analysis to academic outcomes associated with STEM retention and grade point average (GPA). Although we acknowledge that an array of outcomes is important and interesting to examine (e.g., motivation, STEM interest, self-efficacy), our relatively narrow focus is mostly a function of the outcomes currently examined in primary research studies. Increasing our understanding of the effectiveness of STEM bridge programs can provide insight into where future program directors might implement or improve features within their own programs to make them more effective. We also discuss ideas for future research on STEM bridge programs. For the purposes of this paper, we include the biological sciences (except majors specific to applied health science), physical sciences, mathematics, and computer science as “STEM” majors. For clarity of focus, and because primary research in these areas is limited, we have excluded consideration of social sciences such as psychology and anthropology.

    STEM Bridge Programs

    Increasing retention and diversity in STEM degree programs through program interventions may be an effective method to increase the number of STEM workers in the United States (President’s Council of Advisors on Science and Technology, 2012). More specifically, program interventions may bolster the success of students in terms of STEM retention by supplementing high school experiences and exposing students to resources at colleges and universities designed to support student success (Zuo et al., 2018). In particular, an academically challenging high school experience, especially in math and science (National Academy of Sciences, 2010), is beneficial for STEM students to succeed in college (Benbow and Arjmand, 1990). Students from underrepresented minority groups are more likely to miss out on academically challenging high school experiences, because high schools in low–socioeconomic status areas, where students from these backgrounds are often overrepresented (Estrada et al., 2016), are less likely to offer math classes higher than algebra II, to have laboratory STEM activities and equipment, and to employ teachers well qualified to teach STEM classes (Campbell et al., 2002; Peske and Haycock, 2006). As such, increasing retention and diversity in STEM requires augmenting student understanding of academically challenging content and providing meaningful support before students enter college.

    University STEM bridge programs are on-campus STEM interventions designed to increase STEM enrollment and retention (Wilson et al., 2012). STEM bridge programs provide intensive instruction in one or more STEM topics (Tsui, 2007) and expose students to realistic college expectations for STEM course work (Kezar, 2000). Bridge programs also often expose students to, and engage them in, other resources available at universities, such as tutoring, access to research opportunities, intensive advising, and mentorship programs (Maton et al., 2009). These resources and activities have multiple institutional goals, including improving the high school-to-college transition; providing a supportive campus community and climate; teaching students the importance and value of using college resources; and supporting students’ diverse backgrounds, needs, and perspectives (Wheatland, 2001). In addition to these institutional goals for students, common STEM-specific bridge program goals address student skills, attitudes, and their approach to work, including raising students’ confidence in their academic ability, developing problem-solving skills, increasing STEM career awareness and intentions, and augmenting math preparation (Yelamarthi and Mawasha, 2008).

    STEM Content Instruction.

    STEM bridge programs offer course work in one or more STEM topics, though whether this is introductory-level, remedial, or more advanced STEM course work varies by program (Ashley et al., 2017). Many bridge programs have the explicit goal of filling knowledge gaps and combating the “weeding out” experience in introductory-level gateway courses (Massey, 1992), because first-year experiences in STEM critically inform students’ decisions about whether to remain in or leave their STEM majors (Gainen and Willemsen, 1995).

    Tutoring.

    Many STEM bridge programs offer individual or group tutoring sessions. Going beyond in-class instruction, tutors can answer questions and correct student mistakes in understanding, and they can otherwise provide further in-depth explanation to increase student comprehension (Dioso-Henson, 2012). Required tutoring may be beneficial to students even when they do not request it, because students often underestimate how much academic help will benefit their performance (Hodges and White, 2001).

    Research Opportunities.

    Undergraduate research experience in STEM can involve working in applied or academic settings and with some combination of researchers, graduate and postdoctoral students, and faculty. These experiences allow students to identify, conceptualize, and execute various forms of correlational and experimental designs, as well as collect and analyze data, addressing basic science questions or real-world problems (Eagan et al., 2010). Research experiences may offer underrepresented minority students exposure to applied STEM subjects for the first time (Bauer and Bennett, 2003; Moore, 2006). The motivational, knowledge-based, and skill-based effects of obtaining research experience are significant and have been linked to greater STEM major retention (Gregerman et al., 1998), higher graduate school entrance rates, and enhanced pursuit of a STEM career (Zydney et al., 2002).

    Campus Orientation.

    Bridge programs provide exposure to the campus as well as information on campus resources, which may foster students’ sense of belonging to the university (i.e., the extent to which a student feels accepted at and fits into a college environment and major; Ostrove and Long, 2007). Campus orientation may be particularly important for first-generation college students, many of whom may need to be introduced to college not only academically, but also on informational, social, emotional, and cultural levels (McKenna and Lewis, 1986). Similarly, providing information about student organizations that may be relevant to underrepresented minority students may further promote a sense of belonging due to shared experiences, cultures, and networking opportunities (Torres, 2000).

    Faculty Mentoring.

    Mentorship is the process by which senior professionals support and advise less-experienced students or employees on their career plans (Hill et al., 1989). Students from all backgrounds have cited poor support from STEM faculty as a major reason for leaving STEM (Seymour and Hewitt, 1997), and lack of meaningful connections with STEM professors was a major theme in a qualitative analysis of STEM student attrition (Hong and Shull, 2010). This may be due in part to a common STEM classroom culture of professors expecting most students to struggle and a certain number of students to fail (Luppino and Sander, 2015). STEM bridge programs have the potential to create an environment designed to build closer relationships with professors, who then provide social and instructional support to participants (Ashley et al., 2017; Cooper et al., 2018). In turn, such positive faculty interactions can increase student science identity and STEM graduate degree intentions (Aikens et al., 2017), and STEM retention, GPA, and self-efficacy (Christe, 2013).

    Peer Mentoring and Tutoring.

    Many STEM bridge programs provide peer tutoring and mentoring. With peer tutoring, students receive tutoring from and give tutoring to their fellow students (Goodlad and Hirst, 1989). Outcomes of peer tutoring, such as retention of course material, often compare favorably with faculty tutoring (Moust and Schmidt, 1994). Peer mentoring can provide more immediate mentorship availability and accessibility than faculty mentorship, as well as rapport with, social connections to, and role modeling from people who have been on a similar academic journey (Budny et al., 2010). Peer mentoring in bridge programs can help incoming students develop social support networks, think critically, make informed academic choices (Brawer, 1996), and earn higher grades (Rodger and Tremblay, 2003). STEM applications in the practical setting (e.g., Stanich et al., 2018) suggest that peer mentors themselves benefit from mentoring in that they learn STEM material through teaching, given that teaching others is a form of active learning.

    Bridge Program Elements to Support Underrepresented Minority Students.

    Many modern STEM bridge programs seek to increase the social capital and support networks of underrepresented minority students (Arendale and Lee, 2018) and create greater diversity in STEM, understanding that students from these groups are more likely to face greater barriers to college and STEM fields, both socially (Stolle-McAllister, 2011) and academically (Wilson, 2000). Stereotype threat, or the psychosocial anxiety individuals may experience when they are concerned they will be judged based on the negative stereotypes about a group with which they identify (Steele and Aronson, 1995), may be especially salient: perceptions of stereotype threat by underrepresented minority STEM students have been linked to increased attrition to non-STEM majors (Beasley and Fischer, 2012). Bridge programs may be useful in addressing stereotype threat, because they can provide opportunities to gain STEM-related mastery experiences (Hernandez et al., 2013), which research has shown predicts STEM self-efficacy (e.g., Honicke and Broadbent, 2016; Dorfman and Fortus, 2019). Programs that offer diverse peer mentors may also be impactful, because diversity across peer mentoring in multiple STEM fields predicts higher diversity and successful graduation rates for underrepresented minority STEM students (Fox et al., 2009). Finally, bridge programs that address student cultures, such as by helping them identify prosocial connections with STEM topics and the impact they could make on their larger communities, may help students successfully integrate within the bridge program and the university (Estrada et al., 2016).

    Prior Research on STEM Bridge Programs

    A wide range of student outcomes, both STEM-specific and more general, are evaluated within and across STEM bridge programs. Because we were most interested in relatively objective outcomes related to student performance, and due to the limits of the primary studies in this area, we did not consider attitudinal outcomes such as science motivation, science interest, and bridge program satisfaction. Rather, we focused on outcomes such as STEM major retention (e.g., Smith, 2017), STEM graduation rates (e.g., Kopec and Blair, 2014), math assessment scores (e.g., Ami, 2001), and class-specific GPAs (e.g., chemistry; Graham et al., 2016). Other outcomes considered in STEM bridge program research are general (non–STEM specific) academic outcomes, including time to graduate (e.g., Whalin et al., 2017), overall GPA (e.g., Graham et al., 2013), and university retention (e.g., Wischusen et al., 2011). Although more distal outcomes, such as STEM retention and STEM graduation rates, may be most important in evaluating whether programs are meeting the ultimate goal of increasing STEM participation in the workforce, from our review, these are among the least common outcomes reported in published work. Further, research on STEM bridge programs does not generally conform to standard experimental design requirements that augment internal validity (e.g., the random assignment of students to control vs. experimental conditions; Estrada et al., 2016) or even quasi-experimental designs comparing bridge intervention and control conditions without random assignment. Each program also has unique implementation issues, as well as a unique profile of student and institutional characteristics, further complicating a quantitative review.

    As a result, few studies reported STEM-specific outcomes usable for meta-analytic purposes. Using a power analysis that accounted for high levels of heterogeneity, we estimated that at least 11 effects would be necessary for a meta-analysis to exceed a statistical power of 0.70 (and 13 to exceed 0.80) in detecting a small meta-analytic mean effect (d = 0.20; see Borenstein et al., 2011). We therefore examine first-year GPA and first-year university retention, which were the only outcomes we considered that met the minimum threshold of 11 effects. We also limit our studies to those that report results on these outcomes for a comparable control group.
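
    A minimal sketch of this power calculation in R, following the general logic of Borenstein et al. (2011). The per-group sample size (n = 60) and the assumption that between-study variance equals within-study variance ("high" heterogeneity) are illustrative choices on our part, not values reported in the primary studies:

```r
# Approximate power to detect the mean effect in a random-effects meta-analysis
meta_power <- function(d, k, n1, n2, tau2_ratio = 1, alpha = 0.05) {
  v      <- (n1 + n2) / (n1 * n2) + d^2 / (2 * (n1 + n2)) # within-study var of d
  v_star <- (v + tau2_ratio * v) / k                      # var of the pooled mean
  lambda <- d / sqrt(v_star)                              # noncentrality parameter
  z_crit <- qnorm(1 - alpha / 2)
  pnorm(lambda - z_crit) + pnorm(-lambda - z_crit)        # two-tailed power
}

meta_power(d = 0.20, k = 11, n1 = 60, n2 = 60)  # ~0.73 (exceeds 0.70)
meta_power(d = 0.20, k = 13, n1 = 60, n2 = 60)  # ~0.80
```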

    First-Year Overall GPA.

    STEM research has linked students’ early overall GPA with STEM retention (Cromley et al., 2016). For example, in an analysis of ∼1200 STEM first-generation college students, Dika and D’Amico (2016) found that first-year GPA predicted STEM retention after three semesters. In a sample of 1925 college students, first-semester GPA was a moderate predictor of whether students ultimately received a STEM degree (Crisp et al., 2009). In a study of 137 freshman engineering students, higher first-year GPA predicted whether students would be retained in engineering into their second year in the program (Burtner, 2004). Based on the previously discussed aspects of bridge programs, we expect bridge participation to positively impact first-year GPA.

    Hypothesis 1: Bridge program participants will outperform control group participants on first-year university GPA.

    First-Year University Retention.

    Although we did not find enough studies that reported first-year STEM retention rates to use in our meta-analysis, first-year university retention may be worth exploring as a criterion of program success. For example, ∼20% of a nationally representative sample of college students entering a 4-year institution as STEM majors in 2003 dropped out of college rather than switching to a non-STEM major (Chen, 2013). To the extent that a STEM bridge program can increase university retention, the program may be providing a net positive impact to students, even if they leave STEM.

    Hypothesis 2: Bridge program participants will outperform control group participants on first-year university retention.

    We also explored publication bias using publication type as a moderator, meaning we explored whether publication type affected the strength of the relationship between bridge program participation and student outcomes. We compared published peer-reviewed articles with unpublished dissertations and conference papers. In our literature search, we discovered that many conference papers on bridge programs were program descriptions with very few data reported; consequently, we expected that unpublished outlets—like conference proceedings and dissertations—would include smaller effects than peer-reviewed papers, which would be more likely to include significant and larger effects.

    Hypothesis 3a: Effect sizes reported in studies of bridge programs published in peer-reviewed journals will tend to be larger for first-year overall GPA than the effects published in dissertations and conference papers.

    Hypothesis 3b: Studies of bridge programs published in peer-reviewed journals will tend to report greater first-year university retention than those published in dissertations and conference papers.

    METHODS

    Search Strategy

    Using the PsycINFO, Academic Search Complete, Medline, and ERIC academic databases, we searched for articles with titles, subjects, abstracts, or keywords containing 1) “science,” “technology,” “engineering,” “biology,” “chemistry,” “physics,” “math,” “mathematics,” “calculus”; 2) “college,” “university,” “students,” “higher education”; 3) “summer,” “bridge”; and 4) “retention,” “attrition,” “GPA,” “grades,” “academic performance.” We excluded from our searches “elementary school” and “middle school,” as we were only interested in the high school-to-university transition. We also reviewed the programs referenced by Ashley et al.’s (2017) review of STEM summer bridge programs when they were not otherwise captured by our search process. Finally, we identified and contacted 17 researchers associated with a STEM bridge program that met the other inclusion requirements but for which we could not find quantitative academic data or data for a control group, and requested unpublished data. After reading study abstracts, we identified 114 articles for further analysis based on our inclusion criteria.

    Inclusion Criteria

    Two research assistants (H.P. and B.M.) independently read the identified articles to determine whether they met the study’s inclusion criteria. B.C.B. made the final determination about whether articles met the inclusion criteria in cases of discrepancy between the research assistants. Articles were examined for further coding if the program 1) took place in the summer, on-campus, before students’ first year of university; 2) covered at least one STEM topic (non-STEM topics in addition to STEM topics were permissible); 3) reported at least one objective academic outcome (such as GPA or retention); and 4) reported results of a control group that was more narrowly defined than just the rest of the university (e.g., non–underrepresented minority STEM majors or STEM majors with weak academic backgrounds).

    Many bridge programs failed to meet our inclusion criteria, often because they did not report results from a similar control group. This is in line with Kulik et al. (1983), who found in their meta-analysis of college programs for high-risk students that only 60 (less than 12%) of the 504 articles the authors identified met their inclusion criteria, with a substantial portion failing to provide results for control groups or lacking appropriate control groups. Other studies were excluded because they reported only nonquantitative, subjective academic outcomes, such as qualitative data from focus groups with participants; self-reported survey data, such as perceived knowledge gained in a STEM topic or greater reported interest in a STEM topic; and measures of students’ satisfaction with the bridge program.

    Additionally, we intentionally excluded the Meyerhoff Scholars Program (Maton et al., 2012) from our analysis. The Meyerhoff program is a comprehensive STEM program that far surpasses an intervention with a STEM bridge program as its primary element (providing intensive, ongoing support for participants throughout all 4 years of university). Although we are limited by the information provided by other publications, no other bridge program in the primary studies included here describes a comprehensive program for ongoing student support, and thus we felt that the Meyerhoff program was qualitatively different. Notably, the Meyerhoff program (see Maton et al., 2012) is extremely successful, and including it in our meta-analysis would only strengthen the findings regarding STEM bridge program effectiveness.

    Coding Procedures

    We identified 25 studies that met the inclusion criteria. For each qualifying article, we recorded the quantitative outcome(s). After coding all reported outcomes, we determined that only first-year university GPA and first-year university retention met our requirement of having 11 or more effect sizes to have a power of more than 0.70 (see Borenstein et al., 2011). The most common general academic outcomes found in the literature search that did not meet our minimum number of studies were 2-year and 3-year university retention (three studies each). STEM-specific outcomes were 1-year and 3-year STEM retention (six studies each). In total, 16 studies comprising 25 samples were used in the meta-analysis. Two research assistants independently coded the sample size of the bridge and control groups; the overall first-year GPA of each group, the first-year university retention rate of each group, or both; and whether the study was published in a peer-reviewed journal. In cases of discrepancy between the two research assistants (which occurred in three out of 16 cases), B.C.B. made the final determination on the appropriate coding.

    We also coded several program characteristics that research suggests may be important, although we are limited by the depth and description each publication or report provided. We only counted programs as including an element if it was explicitly stated in the publication, but it is conceivable that programs contained elements not described therein. Eleven of the 16 programs incorporated some sort of tutoring arrangement through peers or the university’s tutoring center, whether this was during the bridge program, after the school year began, or both. Ten programs described some sort of faculty or industry professional mentoring arrangement during the summer or afterward, though programs varied in whether these relationships were mandatory or optional. Nine programs provided students with research opportunities during the summer or afterward. Although we are limited in our analysis of these moderators due to the small number of studies that examine them, we provide a summary of individual program characteristics in Table 1 for the interested reader.

    TABLE 1. Coding sheet for program elements of STEM bridge programsa

    | ID | Program | Effect ID | Mentorship Dur | Mentorship Aft | Mentorship Type | Fee based | Tutoring Dur | Tutoring Aft | Research Dur | Research Aft | Cont |
    |---|---|---|---|---|---|---|---|---|---|---|---|
    | 1 | Arizona State University Women in Applied Science and Engineering (WISE) | 1a, 1b | Y | Y | Prof | Y | Y | N | Y | N | Other female students in the College of Engineering and Applied Sciences |
    | 2 | University of Alabama Integrated Engineering Math-Based Summer Bridge Program | 2a, 2b, 2c | Y | N | Prof | Y | Y | N | Y | N | Other first-year engineering majors with similar mathematics placement scores |
    | 3 | Arizona State University Minority Engineering Program (MEP) Summer Bridge Program | 3a, 3b | N | N | | N | N | N | N | N | Other URM students in the College of Engineering and Applied Sciences |
    | 4 | University of Cincinnati Emerging Ethnic Engineers Program | 4 | N | N | | N | Y | Y | N | Y | Matched sample of engineering nonparticipants based on SAT scores |
    | 5 | Morgan State University: Alliance for Minority Participation Summer Math Bridge Program and National Aeronautical Space Administration Morgan Engineering Enrichment Programb | 5 | N | N | | N | N | N | N | N | A random sample of other science, engineering, and math students |
    | 6 | Bowling Green State University (BGSU) Academic Investment in Math and Science (AIMS) | 6 | Y | Y | Prof | N | N | Y | N | Y | Matched sample based on high school academic profiles and demographics |
    | 7 | University of West Georgia Institutional STEM Excellence (UWise) | 7a, 7b | N | Y | Peers | N | Y | Y | Y | Y | Matched sample based on gender, year of entry, and SAT/ACT scores |
    | 8 | University of Missouri–Rolla Minority Engineering Program (MEP) | 8 | N | Y | Prof | N | Y | Y | N | N | Non-URM STEM students controlling for ACT math score and high school percentile rank |
    | 9 | University of Florida Engineering Freshman Transition Program (EFTP) | 9 | Y | Y | Peers | Y | N | N | N | N | Other first-year engineering majors |
    | 10 | University of Memphis Summer Mathematics Bridge Bootcamp | 10 | Y | N | Prof | Y | N | N | Y | Y | Other STEM majors who did not participate in any retention program, propensity score reweighting |
    | 11 | Jackson State University Summer Engineering Enrichment Program (SEEP) | 11a, 11b | N | N | | N | Y | N | N | N | Other first-year engineering majors |
    | 12 | Bridgewater State University STREAMS Summer Bridge Program | 12 | Y | N | Peers | N | N | N | Y | Y | Other first-year STEM majors |
    | 13 | Pennsylvania State University PreFirst Year Engineering & Science Program (PREF) | 13 | N | Y | Peers | N | Y | Y | N | N | Other URM engineering first-year students |
    | 14 | Middle Tennessee State University (MTSU) FirstSTEP | 14 | N | N | | N | N | Y | N | Y | Matched sample based on ACT math scores of other students enrolling in a precalculus course |
    | 15 | Cleveland State University Operation STEM | 15 | Y | Y | Prof | N | N | Y | N | N | Other students taking precalculus |
    | 16 | Saint Edward’s University Community for Achievement in Science, Academics and Research (CASAR) | 16a, 16b, 16c, 16d | N | N | | N | Y | Y | Y | Y | Other first-year STEM majors |

    aResearch reports are represented by ID number. Program, name of the bridge program; Effect ID, effect size identification code, which includes ID and effect identifier (i.e., when research reports included more than one effect, those effects are identified with lowercase letters); Dur, element offered during the summer bridge program; Aft, element offered after the summer bridge program; Prof, professional (e.g., STEM practitioners); Cont, control group characteristics.

    bStudy combined the two groups.

    The control groups used in the research studies included in this analysis are also described in Table 1. Five programs used all other STEM or engineering students as a control; seven programs used some sort of matched sample based on high school preparation, standardized test scores, and/or demographic background; three used more specific STEM demographic groups (two of underrepresented minority STEM students and one of female STEM students); and one program used all other students enrolled in precalculus. In four of these programs, students paid some amount to attend. In the remaining 12 programs, the program covered all costs (and in some cases provided stipends).

    Missing Data

    For studies that reported first-year overall GPA but did not report the SD of the GPA for the samples (seven of 12 studies), we imputed the SD using a weighted average of the square root of the variances reported in the other studies in the analysis (SD = 0.73 for program participants and SD = 0.60 for the control group).
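
    A minimal sketch of this imputation step; the `studies` data frame and its column names are illustrative, and sample-size weighting is our assumption (the weights are not specified above):

```r
# Impute missing GPA SDs with a weighted average of the SDs (square roots of
# the reported variances) from the studies that do report them
impute_sd <- function(sd, n) weighted.mean(sd, n, na.rm = TRUE)

studies$sd_bridge[is.na(studies$sd_bridge)] <-
  impute_sd(studies$sd_bridge, studies$n_bridge)    # 0.73 in the present data
studies$sd_control[is.na(studies$sd_control)] <-
  impute_sd(studies$sd_control, studies$n_control)  # 0.60 in the present data
```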

    Analyses

    All meta-analyses were between-group comparisons using random-effects models, which tend to provide more accurate results compared with their fixed-effects counterparts when study effects are heterogeneous (National Research Council, 1992; Hunter and Schmidt, 2000). Heterogeneity is a reasonable assumption in the current meta-analysis, given the wide variety of bridge programs. For the first-year overall GPA outcome, the meta-analyzed Cohen’s d was calculated with a random-effects model as the standardized mean difference in bridge participants’ GPA compared with the control group’s GPA (i.e., positive d values indicate higher average GPA for the bridge group). For first-year university retention, the log-odds ratio of participant versus control retention was calculated as the odds that a bridge student would be retained compared with a control group student on a logarithmic scale. The log-odds ratio creates greater symmetry of the distribution of the outcome measures and centers it on 0 (Sterne et al., 2001), which makes the data more amenable to analyses. We then converted the log-odds ratio to a standard odds ratio for easier interpretability of the practical significance of findings. All analyses were conducted in R statistical software using the metafor package, a frequently used statistical package to fit fixed-, mixed-, and random-effects models to meta-analyses (Viechtbauer, 2010).
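
    A minimal sketch of these two models using metafor; the data frames (`gpa`, `ret`) and column names are illustrative stand-ins for data coded from Table 2:

```r
# Random-effects meta-analyses with metafor (Viechtbauer, 2010)
library(metafor)

# First-year overall GPA: standardized mean difference (escalc's "SMD" applies
# the usual small-sample correction)
gpa_es  <- escalc(measure = "SMD",
                  m1i = gpa_bridge,  sd1i = sd_bridge,  n1i = n_bridge,
                  m2i = gpa_control, sd2i = sd_control, n2i = n_control,
                  data = gpa)
gpa_fit <- rma(yi, vi, data = gpa_es, method = "REML")  # random-effects model

# First-year retention: log-odds ratio of retained vs. not retained students
ret_es  <- escalc(measure = "OR",
                  ai = retained_bridge,  bi = n_bridge  - retained_bridge,
                  ci = retained_control, di = n_control - retained_control,
                  data = ret)
ret_fit <- rma(yi, vi, data = ret_es, method = "REML")
exp(coef(ret_fit))  # back-transform the pooled log-odds ratio to an odds ratio
```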

    RESULTS

    The 16 studies in this analysis yielded 25 different samples. Five studies were dissertations, six were conference papers, and the remaining five were published articles. Cumulatively, there were 4057 bridge program students and 26,516 control group students in this analysis. The median sample size of bridge participants was 75 (M = 122, SD = 167, interquartile range [IQR] = 30–101), and the median size of the control group was 168 (M = 967, SD = 2,051, IQR = 86–261). Many of these programs were at large public universities, where the pool of students deemed comparable to bridge participants was far larger than the relatively small number of program participants.

    Of these studies, there were 13 first-year overall GPA effects and 19 first-year university retention effects (because several studies provided separate results for different years or iterations of their bridge program, and some provided both GPA and retention data for a single sample). Table 1 shows other descriptive information of program elements. Table 2 shows descriptive information about each study and effect sizes used in the meta-analysis. The names of the programs and universities are listed in the table, rather than the citation, similar to the approach taken by other review articles of this nature (e.g., Estrada et al., 2016; Ashley et al., 2017). Tables 3 and 4 provide the results of all the analyses described.

    TABLE 2. Coding sheet for first-year overall GPA and first-year university retention for STEM bridge programsa

    | ID | Program | Effect ID | PR | N bridge | N cont | Sum of part | GPA: bridge | GPA: control | FYR: bridge | FYR: control |
    |---|---|---|---|---|---|---|---|---|---|---|
    | 1 | Arizona State University Women in Applied Science and Engineering (WISE) | 1a | Y | 103 | 49 | 152 | | | 94.0% | 75.0% |
    | | | 1b | | 43 | 159 | 202 | | | 84.0% | 79.0% |
    | 2 | University of Alabama Integrated Engineering Math-Based Summer Bridge Program | 2a | Y | 30 | 30 | 60 | | | 100% | 99.5% |
    | | | 2b | | 30 | 30 | 60 | | | 100% | 99.1% |
    | | | 2c | | 30 | 30 | 60 | | | 100% | 100% |
    | 3 | Arizona State University Minority Engineering Program (MEP) Summer Bridge Program | 3a | Y | 43 | 79 | 122 | | | 88.4% | 80.7% |
    | | | 3b | | 35 | 100 | 135 | | | 81.6% | 75.0% |
    | 4 | University of Cincinnati Emerging Ethnic Engineers Program | 4 | N | 127 | 127 | 254 | | | 71.7% | 72.4% |
    | 5 | Morgan State University: Alliance for Minority Participation Summer Math Bridge Program and National Aeronautical Space Administration Morgan Engineering Enrichment Programb | 5 | N | 81 | 419 | 500 | 2.81 | 2.48 | | |
    | 6 | Academic Investment in Math and Science (AIMS) program at Bowling Green State University (BGSU) | 6 | N | 88 | 88 | 176 | 3.10 | 2.60 | | |
    | 7 | University of West Georgia Institutional STEM Excellence (UWise) | 7a | N | 96 | 96 | 192 | 2.24 | 2.30 | 78.1% | 61.5% |
    | | | 7b | | 29 | 29 | 58 | 2.97 | 2.44 | 75.9% | 79.3% |
    | 8 | University of Missouri–Rolla | 8 | N | 87 | 114 | 201 | 2.84 | 2.51 | | |
    | 9 | University of Florida EFTP | 9 | N | 415 | 3751 | 4166 | 3.28 | 3.33 | | |
    | 10 | Summer Mathematics Bridge Bootcamp at University of Memphis | 10 | Y | 745 | 6906 | 7651 | 2.91 | 2.58 | 85.3% | 64.7% |
    | 11 | Summer Engineering Enrichment Program (SEEP) at Jackson State University | 11a | Y | 144 | 206 | 350 | | | 83.3% | 71.8% |
    | | | 11b | | 88 | 236 | 324 | | | 84.1% | 71.6% |
    | 12 | STREAMS Summer Bridge Program, Bridgewater State University | 12 | Y | 74 | 7265 | 7339 | | | 90.0% | 80.0% |
    | 13 | Pennsylvania State University PreFirst Year Engineering & Science Program (PREF) | 13 | N | 269 | 269 | 538 | 2.89 | 2.57 | | |
    | 14 | Middle Tennessee State University (MTSU) FirstSTEP | 14 | Y | 36 | 85 | 121 | 2.56 | 2.29 | | |
    | 15 | Cleveland State University Operation STEM | 15 | Y | 76 | 517 | 593 | | | 83.0% | 73.5% |
    | 16 | Saint Edward’s University: Community for Achievement in Science, Academics and Research (CASAR) | 16a | Y | 17 | 176 | 193 | 3.33 | 3.09 | 82.4% | 79.6% |
    | | | 16b | | 24 | 201 | 225 | 3.29 | 2.93 | 87.5% | 79.6% |
    | | | 16c | | 24 | 212 | 236 | 3.19 | 2.93 | 83.3% | 79.3% |
    | | | 16d | | 24 | 228 | 252 | 3.28 | 2.93 | 70.8% | 82.5% |

    aResearch reports are represented by ID number. Program, name of the bridge program; Effect ID, effect size identification code, which includes ID and effect identifier (i.e., when research reports included more than one effect, those effects are identified with lowercase letters); PR, program’s results published in a peer-reviewed publication; N bridge, number of students in the bridge program; N cont, number of students in the control group; Sum of part, total number of participants; FYR, first-year retention.

    bStudy combined the two groups.

    TABLE 3. Main effect analysesa

    | Analysis | k | Est | SE | z | p | 95% CI | τ | 95% CrI |
    |---|---|---|---|---|---|---|---|---|
    | First-year GPA | 12 | 0.34b | 0.09 | 3.66 | <0.001 | [0.16, 0.52] | 0.28 | [−0.23, 0.91] |
    | First-year retention | 19 | 1.747c | 0.13 | 4.23 | <0.001 | [1.35, 2.56] | 0.31 | [0.86, 3.57] |

    ak, the number of studies; Est, effect size; SE, standard error; z, z-test value; p, probability; 95% CI, 95% confidence interval; τ, tau; 95% CrI, 95% credibility interval.

    bEffect size: Cohen’s d.

    cEffect size: odds ratio.

    TABLE 4. Moderator analysesa

    | Analysis | k | Est | SE | z | p | 95% CI | τ |
    |---|---|---|---|---|---|---|---|
    | First-year GPAb | | | | | | | |
    | Publication type | 12 | 0.37 | 0.21 | 1.75 | 0.09 | [−0.04, 0.78] | 0.24 |
    | Published | 3 | 0.62 | 0.19 | | | [0.02, 1.23] | |
    | Unpublished | 9 | 0.26 | 0.09 | | | [−0.08, 0.44] | |
    | First-year retentionc | | | | | | | |
    | Publication type | 19 | 1.51 | 1.27 | 1.72 | 0.08 | [−0.04, 0.78] | 0.24 |
    | Published | 4 | 2.39 | 1.22 | | | [−0.25, 1.21] | |
    | Unpublished | 15 | 1.58 | 1.15 | | | [−0.08, 1.33] | |

    ak, the number of studies; Est, effect size; SE, standard error; z, z-test value; p, probability; 95% CI, 95% confidence interval; τ, tau.

    bEffect size: Cohen’s d.

    cEffect size: odds ratio.

    First-Year Overall GPA

    The main effect of bridge program participation on GPA was statistically and practically significant, supporting hypothesis 1. Cohen’s d (Cohen, 1988) was 0.34 (95% confidence interval [CI] = 0.16, 0.52, p < 0.0001, credibility interval [CrI] = −0.23, 0.91). For context, in education interventions, a minimum detectable effect size (Cohen’s d) of between 0.20 and 0.40 is frequently set as a benchmark for whether the program has made a practical impact (Lee and Munk, 2008). Overall, bridge students had higher first-year overall GPAs than control group students. Qualifying these effects, as expected, there was large heterogeneity in the sample; QE(11) = 437.82, p < 0.0001, τ = 0.28. A retrospective power analysis using this effect size (d = 0.34) found that this analysis was appropriately powered (P = 0.99) to detect differences of this magnitude (Harrer et al., 2019).
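
    As a rough sketch, this retrospective check can reuse the meta_power() function from the power-analysis sketch in the Methods (with the same illustrative inputs, which are our assumptions rather than reported values):

```r
# Retrospective power at the observed effect size, reusing meta_power() from
# the earlier sketch (illustrative per-group n and heterogeneity assumptions)
meta_power(d = 0.34, k = 12, n1 = 60, n2 = 60)  # ~0.99
```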

    We also examined the studies directly for publication bias to address hypothesis 3a. We found that, on average, peer-reviewed journal articles reported marginally larger positive effects for GPA outcomes than conference papers and dissertations (journal M = 0.62, other publications M = 0.26; p = 0.08, 95% CI = −0.04, 0.78); however, of the studies that reported GPA, only three were published in peer-reviewed journals, so the interpretability of this result is limited.
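
    A minimal sketch of this publication-type comparison as a metafor moderator model; `published` is an illustrative 0/1 indicator (1 = peer-reviewed journal) added to the effect-size data set from the earlier sketch:

```r
# Publication type as a moderator of the GPA effect
pub_fit <- rma(yi, vi, mods = ~ published, data = gpa_es, method = "REML")
summary(pub_fit)  # the `published` coefficient compares journals vs. other outlets
```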

    First-Year Retention

    For first-year university retention, we examined the log-odds ratio using a random-effects model. Odds ratios compare the differences in probabilities of an event happening (in this case, first-year retention) between two groups (e.g., bridge students and control group students). The model was significant, with an odds ratio of 1.747 (p < 0.0001, 95% CI = 1.35, 2.56, CrI = 0.86, 3.57) in favor of a retained student being in the bridge group, supporting hypothesis 2. In other words, the mean odds ratio would predict that bridge students are 64% more likely (i.e., the odds ratio divided by one plus the odds ratio) to be retained than control group students. To provide further context that the odds ratio does not account for, the first-year retention base rates in these studies were moderately high (the weighted average retention rate across both groups was 76.1%), but many of the bridge groups were relatively small (the median size was 75 students), meaning that some caution should be used in extrapolating these findings. As with first-year GPA, there was evidence of heterogeneity in these studies; QE(18) = 39.32, p = 0.002, τ = 0.31.
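
    For reference, a one-line snippet showing the conversion described above (the odds ratio divided by one plus the odds ratio):

```r
# Converting the pooled odds ratio to the percentage reported above
or <- 1.747
or / (1 + or)  # = 0.636, i.e., ~64%
```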

    We also examined these studies for evidence of publication bias, addressing hypothesis 3b. We found that journals were marginally more likely to report positive outcomes than studies published in conference papers and dissertations (p = 0.09, 95% CI = −0.06, 0.88). However, of the studies that reported retention, only four were published in peer-reviewed journals, limiting our ability to find evidence of upward bias.

    DISCUSSION

    We examined the overall effectiveness of university STEM bridge programs, operationalized as participants’ first-year overall GPA and first-year university retention. We found a medium-sized effect of bridge program participation on first-year overall GPA compared with a control group, as well as greater first-year retention relative to control group students. The fact that bridge program participation impacted students’ retention, which college retention models generally regard as the result of academic performance (Tinto, 1999), provides evidence of a longer-term impact of the bridge program beyond just increasing GPAs. One caveat, however, is that we cannot isolate the effect of bridge participation on student GPA and retention, because the studies included in this meta-analysis did not use random assignment and thus did not systematically control for student motivation, self-efficacy, interest in science, or other variables that might influence performance. That is, there is likely selection bias associated with the quasi-experimental approaches used in the studies included in this meta-analysis, and students who participate in bridge programs may differ from those who do not in some important ways that we cannot control. We also examined publication bias and found that peer-reviewed journal articles tended to report more positive outcomes and larger effects (for GPA and first-year retention) than conference papers and dissertations, although these differences were only marginally significant. This trend aligns with findings such as those in O’Boyle et al.’s (2017) management review, which found that published studies reported a ratio of supported to unsupported hypotheses more than twice as high as that in dissertations, presumably because peer-reviewed publications are more likely to report significant results. However, the small number of studies published in peer-reviewed journals limits the implications of our findings, meaning that further exploration of the extent of publication bias in STEM bridge program research is necessary.

    Finally, we have provided descriptive information on bridge programs to give researchers and practitioners a general overview of elements of past STEM bridge programs, although it is possible that some programs used elements the authors did not describe in the publications. We found that more than half of the programs in this meta-analysis provided students course tutoring, mentoring arrangements, and research opportunities, although the combination of services provided varied by program, as did the timing when students were offered these services (i.e., during or after the summer bridge program). These findings suggest that many, if not most, STEM bridge programs attempt to incorporate some of the elements research would suggest are most influential for STEM academic success and retention. In all three cases (tutoring, mentoring, and research opportunities), the number of programs that did not include these elements was too low to reasonably use in a quantitative analysis.

    Limitations

    This meta-analysis provides empirical summaries across all available studies meeting our inclusion criteria. We made every attempt to be comprehensive, and we can say with some confidence that the wide array of bridge program studies we meta-analyzed is representative of what is available in the literature. Even so, the heterogeneity of STEM bridge programs and the range of outcomes they report, as well as the relatively underspecified methodologies that many studies employ, limit the ability of the current meta-analysis to yield generalizable conclusions about the effect of any particular future bridge program intervention. Given the tension between program heterogeneity and our desire to report the available evidence, our meta-analysis included only a relatively small subset of studies that met reasonable standards for research design. With a larger sample size, we would be able to test our hypotheses and examine publication bias with increased confidence (Simonsohn et al., 2014). Additional research in this area would also potentially broaden the array of outcomes beyond those examined here.

    Implications for Practitioners and Program Administrators

    Evaluating the effectiveness of bridge programs is a complex task. To have the strongest design, bridge program administrators should strive to ensure both internal validity (the confidence with which one can say that the results obtained from participation are the true result of the intervention) and external validity (the applicability of the bridge program in being able to provide generalizable conclusions that other bridge program directors may be able to draw upon; Gay and Airasian, 2000). In the following sections, we expand on our recommendations for program administrators and researchers examining the effectiveness of bridge programs.

    Tracking Additional Outcomes.

    Exploring other research questions beyond those in this meta-analysis would require tracking students beyond the yearlong time frame we report here, as well as ideally tracking STEM-specific outcomes. However, some universities do not require (and in some cases do not allow) students to declare majors until a certain point in their college careers (often at the end of the second year), which warrants additional consideration in terms of exploring how to operationalize early STEM retention and STEM performance. One option for program administrators is to collect data about students’ current major intentions upon matriculation and compare their intentions against their formally declared majors later in their academic careers. This approach would offer program administrators a way to account for the possibility of students’ intentions changing between accepting a university’s admission offer and beginning a bridge program, providing a more accurate accounting of the effect of a bridge program on retention. Tracking student engagement with the university, faculty, and peers during and after a bridge program might also allow researchers to better understand bridge students’ experiences at a university, how their experiences differ from those of nonparticipants, and how bridge participation might impact student engagement (Brewer, 2019).
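
    One minimal sketch of this intention-tracking idea; the `students` data frame, its columns, and the `stem_majors` list are hypothetical:

```r
# Compare majors intended at matriculation with majors formally declared later
stem_majors <- c("Biology", "Chemistry", "Physics", "Mathematics", "Computer Science")
intenders   <- students[students$intended_major %in% stem_majors, ]  # intended STEM
retained    <- intenders$declared_major %in% stem_majors             # still STEM later
tapply(retained, intenders$bridge, mean)  # STEM retention by bridge participation
```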

    We also note that the majority of studies included in this meta-analysis were conducted at relatively large, PhD-granting institutions (see Tables 1 and 2). Thus, there is an opportunity to better study the effectiveness of bridge programs across a broader array of institution types. There may be some barriers to this endeavor, however. Two-year colleges may not offer specialized academic tracks, and student retention through a bachelor’s degree would be difficult to track. However, these institutions could examine the effectiveness of bridge programs on STEM course work and GPA, as well as declared major if students transfer to 4-year institutions. As other researchers have discussed (e.g., the review of Latinx STEM transfer interventions by Martin et al., 2018), 2-year institutions might coordinate with 4-year institutions to track transfer student success through the bachelor’s degree. This research could be particularly valuable in understanding whether bridge programs decrease transfer shock, which occurs when transferring students’ academic performance declines at their new 4-year institutions relative to their 2-year institution performance (Hills, 1965). Transfer students in STEM majors may experience greater transfer shock than transfer students in other majors (Lakin and Elliott, 2016), highlighting the importance of a continued focus on bridging academic STEM preparation gaps. Interventions such as mandatory learning communities for transfer students might reduce attrition when students transfer to a 4-year college or university (e.g., Scott et al., 2017). Moreover, students transferring from 2-year colleges into STEM classes and majors at 4-year institutions may also face unfavorable stereotypes, held by both faculty and peers, about the ability and success of transfer students in STEM courses (Reyes, 2011). Despite these barriers, transfer students from 2-year colleges tend to be more committed to a specific major and career path than first-year university students (Aulck and West, 2017). Bridge programs at 4-year institutions might also be designed to better support the needs of transfer students.

    Bridge programs at smaller, 4-year liberal arts institutions could also be better studied. Students at these institutions tend not to declare majors until later in their college careers, making STEM retention hard to gauge. Although traditional liberal arts colleges tend not to offer professional, vocational, or applied majors (including STEM majors such as engineering; Roche, 2010), they tend to produce a greater percentage of graduates who eventually receive doctoral degrees in STEM fields than larger universities do (Cech, 1999). A liberal arts bridge program might be especially beneficial for students from underrepresented minority groups and students with weaker academic backgrounds, as liberal arts colleges may be able to offer STEM students a strong science and math foundation and educational environment (through smaller class sizes; Wolniak et al., 2004), although potentially at the expense of extensive research opportunities. In sum, examining the effectiveness of bridge programs for supporting success at 2-year and smaller 4-year institutions is a much-needed area of future research.

    Mixed-Methods Analyses.

    Mixed-methods research uses one or more studies to both qualitatively and quantitatively explore the same underlying phenomenon (Leech and Onwuegbuzie, 2009). Qualitative research can enrich researchers’ understanding of the impact of an intervention and uncover contextual factors that might influence student outcomes beyond just the direct effect of participating in the bridge program (Miller et al., 2020). In the context of STEM bridge programs, qualitative research on variables such as sense of belonging to one’s major and science, math, or engineering identity might be able to supplement and enrich quantitative analyses such as this meta-analysis. Although constructs related to STEM attitudes such as career aspirations can be assessed quantitatively (e.g., Beier et al., 2018), qualitative data (e.g., gathered through focus groups, survey responses, qualitative analyses of interviews) can enrich our understanding of these constructs. Qualitative research can also be incorporated into the findings of existing quantitative analyses (e.g., quantitative bridge program evaluation) to capture changes in bridge program students’ experiences and to assess whether program participation had a differential impact on students of different backgrounds (e.g., underrepresented minority students; see Tomasko et al., 2016).

    STEM bridge program goals vary between individual programs (see Ashley et al., 2017), and research benefits when researchers precisely define their hypotheses in the context of the program’s goals. For instance, researchers analyzing the impact of a bridge program goal to produce more STEM graduates should consider whether they also want to study the career intentions of these graduates, and whether these students intend to or ultimately enter a STEM field. They should also decide how to measure these goals. For instance, a program that is ultimately interested in determining whether participation was effective at increasing STEM interest (e.g., Thompson and Consi, 2008) might measure STEM career intentions, identity as a scientist, or sense of belonging to a STEM community, which might all be better predictors of students’ attitudes and intentions than STEM GPA or graduation major.

    Research Design Considerations.

    A full review of quasi-experimental designs useful in educational environments is beyond the scope of this paper (although see Campbell and Stanley, 1967). Nonetheless, we offer some ideas most relevant to our review. First, although randomized experimental designs are generally the “gold standard” for experimental research (Rogers and Révész, 2019), students usually opt into bridge programs, making random assignment impossible and selection bias likely. Therefore, it is important to consider the factors that could impact students’ self-selection into a program. For example, the cost to attend the program might play a major role in influencing students’ decisions to participate. Students who feel reasonably prepared for STEM course work might be less willing to pay for a summer program, but they might have participated if the program were free or provided a stipend. If this assumption is true, the academic impact of STEM bridge programs might be understated, because bridge students would likely be initially weaker in STEM preparation than control students. There may also be group differences in student self-efficacy, STEM interest, or other psychological characteristics, depending on whether programs are free, offer stipends, or are fee based.

    Matched sampling attempts to overcome the confounding that occurs when initial group differences are not accounted for (Campbell and Stanley, 1967) and, by approximating true experimental randomization (Stuart and Rubin, 2008), may offer the most confidence in drawing conclusions about the effect of a program on student outcomes while controlling for self-selection. Many studies we reviewed in our literature search compared results with a matched sample of similar students based on some operationalization of STEM preparedness, such as standardized test scores or high school performance (e.g., Gilmer, 2007; Bradford et al., 2019). Other studies used non-bridge underrepresented minority STEM students or, in the broadest cases, all other STEM students as control group students (e.g., Kopec and Blair, 2014). Matched sampling analyses can be improved by using covariates that are not affected by a student participating in the bridge program (e.g., students’ demographic backgrounds, high school preparation) to build propensity scores, which attempt to match students on these covariates and reduce bias produced by confounding variables (Powell et al., 2020).
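
    A minimal sketch of propensity-score matching with the MatchIt package; the data frame and covariates are illustrative pre-program characteristics, not variables from any study reviewed here:

```r
# Match bridge participants to similar nonparticipants on pre-program covariates
library(MatchIt)

m_out <- matchit(bridge ~ hs_gpa + act_math + first_gen + urm,
                 data = students, method = "nearest", distance = "glm")
summary(m_out)                # check covariate balance after matching
matched <- match.data(m_out)  # extract the matched sample
t.test(first_year_gpa ~ bridge, data = matched)  # compare outcomes in matched groups
```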

    Researchers might also consider increasing the internal validity of their studies by providing control group students a different treatment than the bridge program intervention (Campbell and Stanley, 1967), such as access to different classes or resources, rather than using no-treatment controls. This approach would permit researchers to examine the effectiveness of different elements of the bridge program rather than the program in its entirety. Another way to increase internal validity would be to use multiple means of assessing constructs (i.e., using academic, psychological, and other STEM constructs such as career intentions) in both the treatment and control conditions pre and post intervention (i.e., a pretest–posttest control group design), which is considered one of the most robust approaches for quasi-experimental designs (Campbell and Stanley, 1967). Finally, time-series designs, in which data are collected at multiple time points pre and post intervention to reveal the impact of the intervention beyond underlying group trends (Grimshaw et al., 2000), can increase the strength of conclusions drawn about the impact of bridge program participation. Moreover, because bridge programs may be unable to increase sample sizes regardless of the outcome of any power analysis, it is important to make and report post hoc calculations to understand whether studies are powered adequately to detect expected effects.
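
    As one concrete sketch, the pretest–posttest control group design mentioned above is commonly analyzed with ANCOVA; all variable names here are illustrative:

```r
# ANCOVA for a pretest-posttest control group design: regress the posttest on
# group membership, adjusting for the pretest score
fit <- lm(post_score ~ pre_score + bridge, data = students)
summary(fit)  # the `bridge` coefficient is the pretest-adjusted program effect
```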

    Future Directions

    Progress in bridge program research and evaluation can identify the effectiveness of a program, allowing comparisons of results against one another (in meta-analyses, within institution over time, or otherwise) and ensuring that researchers will have enough statistical power to detect significant and material effects of bridge program participation wherever those effects exist. University-specific gateway courses and class performance may be more straightforward for administrators to track, but these outcomes are among the least generalizable to other universities, which have different professors, class syllabi, and student populations. Although a discussion of classroom-level teaching practices is beyond the scope of this paper, incorporating the science of learning to design the most effective instruction methods to cover difficult STEM course work over a brief summer session is critical (National Academies of Sciences, Engineering, and Medicine, 2018). Future meta-analyses or institutional partnerships that allow for multilevel analyses across institutions could code for this natural variability (e.g., various classroom instruction styles, class syllabi, or other student characteristics) if institutions make this information available. Extending the outcomes examined in this research to include attitudes (e.g., STEM identity, belongingness, career aspirations) as well as performance outcomes would be valuable. Large-scale comparative studies could also be designed to identify which elements within the bridge program affect which outcomes.

    Similarly, reporting objective academic results, as well as those of a control group, for relevant STEM outcomes (e.g., STEM major retention, final STEM GPA) would allow many more studies to be included in future meta-analyses, providing more robust findings on program effectiveness (see the sketch below). If a program does not have an easily accessible reference group to serve as a control, program administrators could compare the effect of participation with a group of STEM students as similar as possible to bridge participants by coding for and incorporating preexisting differences, such as high school GPA, incoming Advanced Placement credit in STEM classes, and quantitative standardized test scores (ACT or Scholastic Aptitude Test [SAT]), in both within-study analyses and meta-analyses.
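
    The sketch below illustrates what such reporting makes possible: given study-level effect sizes and sample sizes, a random-effects model (here, the common DerSimonian–Laird estimator) pools them into a single estimate. All numbers are invented for illustration; dedicated tools such as the metafor package in R (Viechtbauer, 2010) implement the same computation with many refinements.

```python
import numpy as np

# Hypothetical study-level results: Cohen's d and per-group sample sizes.
d = np.array([0.45, 0.20, 0.55, 0.10, 0.38])
n1 = np.array([35, 60, 25, 80, 45])   # bridge participants
n2 = np.array([35, 60, 25, 80, 45])   # control students

# Sampling variance of d (large-sample approximation).
v = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

# Fixed-effect weights and the Q statistic for heterogeneity.
w = 1 / v
d_fixed = np.sum(w * d) / np.sum(w)
Q = np.sum(w * (d - d_fixed) ** 2)

# DerSimonian-Laird estimate of between-study variance tau^2.
k = len(d)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects weights incorporate tau^2; then pool.
w_re = 1 / (v + tau2)
d_re = np.sum(w_re * d) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled d = {d_re:.2f}, "
      f"95% CI [{d_re - 1.96 * se_re:.2f}, {d_re + 1.96 * se_re:.2f}]")
```

    Every additional study that reports means, standard deviations, and sample sizes for both participants and a comparison group adds one row to such an analysis, which is why consistent reporting matters as much as any individual result.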

    Future research could also explore underrepresented minority-focused STEM bridge programs, which constitute ∼50% of STEM bridge programs (Ashley et al., 2017). Examining whether these programs are more effective for underrepresented minority STEM students compared with more general STEM bridge programs would be valuable. Further research could also examine content differences between these two types of bridge programs. For instance, programs focused on underrepresented minority students might offer informational and social resources targeted toward the needs of this specific group of students. This line of research is especially pertinent given the importance of inclusive STEM instruction. More generally, all STEM bridge programs should strive to define diverse students' learning outcomes using a strengths-based, or asset-based, pedagogical approach rather than one focused on students' perceived deficits (Johnson, 2019). Understanding that student-centered interventions (such as bridge programs) alone have not been enough to equalize STEM retention rates across groups, higher education researchers have identified increased institutional support as also necessary to build a culture of inclusive diversity and support the success of students who have been historically excluded from science based on their racial and ethnic backgrounds (termed “persons excluded because of their ethnicity or race,” or PEERs; Asai, 2020).

    Researchers should also attend to the definition of STEM relative to underrepresentation. Women major in the biological and health sciences at a significantly greater rate than in other STEM majors (Dika and D’Amico, 2016). Similarly, students from underrepresented minority groups and female students have the highest graduation rates in biological and health fields (Lewis et al., 2009). As a result, the study of “PEMC” majors (physical sciences, engineering, mathematics, and computer science; i.e., the physical rather than the life sciences, and computer science specifically rather than broader technology studies) may become the highest priority in interventions to ensure access across gender and race (Dika and D’Amico, 2016). Correspondingly, STEM bridge programs might also define their targeted STEM students more narrowly. Future research on differing academic performance and rates of attrition by STEM subfield (especially on whether engineering and non-engineering STEM students have different intervention needs) may be useful, particularly because many STEM bridge programs are specific to engineering students (e.g., Allen, 2001; Gleason et al., 2010). Engineering, which encompasses how scientific and engineering principles are combined and applied to solve problems (Kieran and O’Neill, 2009), is the STEM field with the greatest underrepresentation of both female and underrepresented minority students (Dika and D’Amico, 2016). It is distinct from other STEM majors (e.g., the natural sciences) in the extent to which students’ quantitative skills and confidence in quantitative ability predict academic success (Veenstra et al., 2009). To increase student diversity, future STEM bridge programs might be designed around these predictors of engineering students’ success.

    Finally, more research is required on student progression to graduate-level STEM education and STEM careers as outcomes. Many researchers and policy makers discuss the importance of producing more STEM researchers and professionals, who often require education beyond a bachelor’s degree. However, STEM bridge programs rarely track graduate school enrollment rates (Ashley et al., 2017), and virtually none that we know of track STEM careers. Providing early opportunities for research experience may make students more competitive for graduate programs and STEM careers. Researchers should also explore how bridge programs expose students to STEM as an applied practice, what students learn from the STEM research experiences embedded in these programs, and the STEM decisions students subsequently make.

    CONCLUSION

    STEM bridge programs serve an important goal of increasing STEM major retention, particularly for students who have faced barriers to successful STEM degree completion. However, despite the expense of these programs, the field has lacked systematic analysis of program effectiveness, as well as any consensus on criteria for success. To our knowledge, this is the first systematic quantitative review of the effectiveness of STEM bridge programs. We found that STEM bridge programs positively affected first-year student retention and performance. However, we were constrained in our analysis due to the limited outcomes many of the primary studies reported. Further research in this area would benefit from researchers and bridge program administrators continuing to examine a broad array of student outcomes and improving their study designs. We hope that this meta-analysis will serve others as a useful foundation for future inquiry into how to improve STEM bridge programs and augment the performance of the STEM students who would benefit the most from additional support.

    ACKNOWLEDGMENTS

    We gratefully acknowledge Bryce Miller and Hannah Park, former undergraduate research assistants in the Adult Skills and Knowledge Lab, for their assistance with coding. This project was unfunded; however, it was supported by a National Science Foundation research assistantship (NSF Award #1565032) held by B.C.B. as a graduate student. An earlier version of this study was presented as a poster at the 30th Association for Psychological Science Annual Convention in San Francisco, CA, in 2018.

    REFERENCES

  • Aikens, M. L., Robertson, M. M., Sadselia, S., Watkins, K., Evans, M., Runyon, C. R., ... & Dolan, E. L. (2017). Race and gender differences in undergraduate research mentoring structures and research outcomes. CBE—Life Sciences Education, 16(2), 1–12. https://doi.org/10/ggq342
  • Allen, L. (2001). An evaluation of the University of Missouri–Rolla minority engineering program 7-week summer bridge program (Unpublished doctoral dissertation). University of Missouri–Columbia.
  • Ami, C. G. (2001). The effects of a four-week summer bridge program. Albuquerque: University of New Mexico.
  • Arendale, D. R., & Lee, N. L. (2018). Bridge programs. In Flippo, R. F., & Bean, T. W. (Eds.), Handbook of college reading and study strategy research (pp. 281–292). New York, NY: Routledge.
  • Asai, D. J. (2020). Race matters. Cell, 181(4), 754–757. https://doi.org/10/ghprj5
  • Ashley, M., Cooper, K. M., Cala, J. M., & Brownell, S. E. (2017). Building better bridges into STEM: A synthesis of 25 years of literature on STEM summer bridge programs. CBE—Life Sciences Education, 16(4), 1–18. https://doi.org/10/ghbrjt
  • Aulck, L., & West, J. (2017). Attrition and performance of community college transfers. PLoS ONE, 12(4), e0174683. https://doi.org/10/f93j3p
  • Bauer, K. W., & Bennett, J. S. (2003). Alumni perceptions used to assess undergraduate research experience. Journal of Higher Education, 74(2), 210–230. https://doi.org/10.1080/00221546.2003.11777197
  • Beasley, M. A., & Fischer, M. J. (2012). Why they leave: The impact of stereotype threat on the attrition of women and minorities from science, math and engineering majors. Social Psychology of Education, 15(4), 427–448. https://doi.org/10/f4kn2q
  • Beier, M. E., Kim, M. H., Saterbak, A., Leautaud, V., Bishnoi, S., & Gilberto, J. M. (2018). The effect of authentic project-based learning on attitudes and career aspirations in STEM. Journal of Research in Science Teaching, 56, 3–23. https://doi.org/10.1002/tea.21465
  • Benbow, C. P., & Arjmand, O. (1990). Predictors of high academic achievement in mathematics and science by mathematically talented students: A longitudinal study. Journal of Educational Psychology, 82(3), 430–441. https://doi.org/10/b92psn
  • Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2011). Introduction to meta-analysis. Hoboken, NJ: Wiley.
  • Bradford, B., Beier, M. E., Wolf, M., McSpedon, M., & Taylor, M. (2019, June). STEM bridge program participation predicts first- and second-semester math performance. 2019 ASEE Annual Conference & Exposition, Tampa, FL.
  • Brawer, F. B. (1996). Retention–attrition in the nineties (No. ED393510). ERIC Clearinghouse for Community Colleges. Retrieved from https://eric.ed.gov/?id=ED393510
  • Brewer, H. C. (2019). Predicting Latino community college student success: A conceptual model for first-year retention (Doctoral dissertation). Seton Hall University Dissertations and Theses (ETDs). Retrieved March 30, 2021, from https://scholarship.shu.edu/dissertations/2625
  • Budny, D., Paul, C., & Newborg, B. B. (2010). Impact of peer mentoring on freshmen engineering students. Journal of STEM Education: Innovations and Research, 11(5), 9–24.
  • Burtner, J. (2004, October). Critical-to-quality factors associated with engineering student persistence: The influence of freshman attitudes. Proceedings of the 34th Annual Frontiers in Education Conference, Savannah, GA.
  • Campbell, D. T., & Stanley, J. C. (1967). Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.
  • Campbell, P. B., Jolly, E., Hoey, L., & Perlman, L. K. (2002). Upping the numbers: Using research-based decision making to increase diversity in the quantitative disciplines (RIE No. ED463714). Groton, MA: Campbell-Kibler Associates.
  • Cech, T. R. (1999). Science at liberal arts colleges: A better education? Daedalus, 128(1), 195–216.
  • Chen, X. (2013). STEM attrition: College students’ paths into and out of STEM fields (Statistical Analysis Report NCES 2014–001). Washington, DC: National Center for Education Statistics, Institute of Education Sciences.
  • Christe, B. (2013). The importance of faculty-student connections in STEM disciplines: A literature review. Journal of STEM Education: Innovations & Research, 14(3), 22–26.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. New York, NY: Routledge.
  • Cooper, K. M., Ashley, M., & Brownell, S. E. (2018). Breaking down barriers: A bridge program helps first-year biology students connect with faculty. Journal of College Science Teaching, 47(4), 60–70.
  • Crisp, G., Nora, A., & Taggart, A. (2009). Student characteristics, pre-college, college, and environmental factors as predictors of majoring in and earning a STEM degree: An analysis of students attending a Hispanic serving institution. American Educational Research Journal, 46(4), 924–942. https://doi.org/10/ccgmth
  • Cromley, J. G., Perez, T., & Kaplan, A. (2016). Undergraduate STEM achievement and retention: Cognitive, motivational, and institutional factors and solutions. Policy Insights from the Behavioral and Brain Sciences, 3(1), 4–11. https://doi.org/10/ghbrg6
  • Dika, S. L., & D’Amico, M. M. (2016). Early experiences and integration in the persistence of first-generation college students in STEM and non-STEM majors. Journal of Research in Science Teaching, 53(3), 368–383. https://doi.org/10/ghbrd3
  • Dioso-Henson, L. (2012). The effect of reciprocal peer tutoring and non-reciprocal peer tutoring on the performance of students in college physics. Research in Education, 87(1), 34–49. https://doi.org/10/ghbrdz
  • Dorfman, B.-S., & Fortus, D. (2019). Students’ self-efficacy for science in different school systems. Journal of Research in Science Teaching, 56(8), 1037–1059. https://doi.org/10/gfwbp5
  • Eagan, K., Hurtado, S., & Chang, M. J. (2010, November). What matters in STEM. Proceedings of the 2010 Annual Meeting of the Association for the Study of Higher Education, Indianapolis, IN (pp. 1–34).
  • Estrada, M., Burnett, M., Campbell, A. G., Campbell, P. B., Denetclaw, W. F., Gutiérrez, C. G., ... & Zavala, M. (2016). Improving underrepresented minority student persistence in STEM. CBE—Life Sciences Education, 15(3), 1–10. https://doi.org/10/ghbrgr
  • Fox, M. F., Sonnert, G., & Nikiforova, I. (2009). Successful programs for undergraduate women in science and engineering: Adapting versus adopting the institutional environment. Research in Higher Education, 50(4), 333–353. https://doi.org/10/chr9rd
  • Gainen, J., & Willemsen, E. W. (1995). Fostering student success in quantitative gateway courses. San Francisco, CA: Jossey-Bass.
  • Gay, L., & Airasian, P. W. (2000). Educational research: An introduction (6th ed.). Englewood Cliffs, NJ: Prentice Hall.
  • Gilmer, T. C. (2007). An understanding of the improved grades, retention and graduation rates of STEM majors at the Academic Investment in Math and Science (AIMS) program of Bowling Green State University (BGSU). Journal of STEM Education, 8(1), 11–21.
  • Gleason, J., Boykin, K., Johnson, P., Bowen, L., Whitaker, K. W., Micu, C., ... & Slappey, C. (2010). Integrated engineering math-based summer bridge program for student retention. Annual Conference Proceedings: Advances in Engineering Education, 4, 1–8. Retrieved March 30, 2021, from https://eric.ed.gov/?id=EJ1076158
  • Goodlad, S., & Hirst, B. (1989). Peer tutoring: A guide to learning by teaching. New York, NY: Nichols Publishing.
  • Graham, K. J., McIntee, E. J., & Armbrister, P. M. (2013, April). NSF S-STEM scholarship and support mechanisms: A cohort-based summer bridge program in chemistry. Proceedings of the 245th American Chemical Society National Meeting, New Orleans, LA.
  • Graham, K. J., McIntee, E. J., Raigoza, A. F., Fazal, M. A., & Jakubowski, H. V. (2016). Activities in an S-STEM program to catalyze early entry into research. Journal of Chemical Education, 94(2), 177–182. https://doi.org/10/f9wbrd
  • Gregerman, S. R., Lerner, J. S., von Hippel, W., Jonides, J., & Nagda, B. A. (1998). Undergraduate student-faculty research partnerships affect student retention. Review of Higher Education, 22(1), 55–72. https://doi.org/10/ghbrjf
  • Grimshaw, J., Campbell, M., Eccles, M., & Steen, N. (2000). Experimental and quasi-experimental designs for evaluating guideline implementation strategies. Family Practice, 17(1), 11–16. https://doi.org/10/bdpsj2
  • Gullatt, Y., & Jan, W. (2003). How do pre-collegiate academic outreach programs impact college-going among underrepresented students? Pathways to College Network. Retrieved March 30, 2021, from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.483.7094&rep=rep1&type=pdf
  • Harrer, M., Cuijpers, P., & Ebert, D. (2019). Doing meta-analysis in R: A hands-on guide. https://doi.org/10/gghwcp
  • Hernandez, P. R., Schultz, P. W., Estrada, M., Woodcock, A., & Chance, R. C. (2013). Sustaining optimal motivation: A longitudinal analysis of interventions to broaden participation of underrepresented students in STEM. Journal of Educational Psychology, 105(1). https://doi.org/10/gghhxg
  • Higher Education Research Institute. (2010). Degrees of success: Bachelor’s degree completion rates among initial STEM majors. Los Angeles: University of California.
  • Hill, S. E. K., Bahniuk, M. H., & Dobos, J. (1989). The impact of mentoring and collegial support on faculty success: An analysis of support behavior, information adequacy, and communication apprehension. Communication Education, 38(1), 15–33. https://doi.org/10/cnmvgn
  • Hills, J. R. (1965). Transfer shock: The academic performance of the junior college transfer. Journal of Experimental Education, 33(3), 201–215. https://doi.org/10/dqmf
  • Hodges, R., & White, W. G., Jr. (2001). Encouraging high-risk student participation in tutoring and supplemental instruction. Journal of Developmental Education, 24(3), 2–12.
  • Hong, B. S., & Shull, P. (2010). A retrospective study of the impact faculty dispositions have on undergraduate engineering students. College Student Journal, 44(2), 266–279.
  • Honicke, T., & Broadbent, J. (2016). The influence of academic self-efficacy on academic performance: A systematic review. Educational Research Review, 17, 63–84. https://doi.org/10/gd6ft9
  • Hunter, J. E., & Schmidt, F. L. (2000). Fixed effects vs. random effects meta-analysis models: Implications for cumulative research knowledge. International Journal of Selection and Assessment, 8(4), 275–292. https://doi.org/10/djx9zx
  • Jack, A. A. (2019). The privileged poor: How elite colleges are failing disadvantaged students. Cambridge, MA: Harvard University Press.
  • Johnson, K. M. S. (2019). Implementing inclusive practices in an active learning STEM classroom. Advances in Physiology Education, 43(2), 207–210. https://doi.org/10/ggvcpm
  • Kezar, A. (2000). Summer bridge programs: Supporting all students (No. ED442421). ERIC Clearinghouse on Higher Education. Retrieved March 30, 2021, from https://eric.ed.gov/?id=ED442421
  • Kieran, P., & O’Neill, G. (2009). Peer-assisted tutoring in a chemical engineering curriculum: Tutee and tutor experiences. Journal of Peer Learning, 2, 40–67.
  • Kitchen, J. A., Sadler, P., & Sonnert, G. (2018). The impact of summer bridge programs on college students’ STEM career aspirations. Journal of College Student Development, 59(6), 698–715. https://doi.org/10/gfrcmh
  • Kopec, R. L., & Blair, D. A. (2014, August). Community for achievement in science, academics, and research: The CASAR project. Proceedings of the 6th Annual First Year Engineering Experience Conference, College Station, TX.
  • Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2006). What matters to student success: A review of the literature. National Symposium on Postsecondary Student Success. Retrieved March 30, 2021, from http://www.alearningboxblog.com/uploads/5/8/0/2/58020745/kuh_team_report.pdf
  • Kulik, C.-L. C., Kulik, J. A., & Shwalb, B. J. (1983). College programs for high-risk and disadvantaged students: A meta-analysis of findings. Review of Educational Research, 53(3), 397–414. https://doi.org/10/brssmv
  • Lakin, J. M., & Elliott, D. C. (2016). STEMing the shock: Examining transfer shock and its impact on STEM major and enrollment persistence. Journal of the First-Year Experience & Students in Transition, 28(2), 9–31.
  • Lee, H., & Munk, T. (2008). Using regression discontinuity design for program evaluation. In Proceedings of the 2008 Joint Statistical Meeting (pp. 3–7). Alexandria, VA: American Statistical Association.
  • Leech, N. L., & Onwuegbuzie, A. J. (2009). A typology of mixed methods research designs. Quality & Quantity, 43(2), 265–275. https://doi.org/10.1007/s11135-007-9105-3
  • Lewis, J. L., Menzies, H., Nájera, E. I., & Page, R. N. (2009). Rethinking trends in minority participation in the sciences. Science Education, 93(6), 961–977. https://doi.org/10/dgvpk7
  • Luppino, M., & Sander, R. (2015). College major peer effects and attrition from the sciences. IZA Journal of Labor Economics, 4(1), 4. https://doi.org/10.1186/s40172-014-0019-8
  • Martin, J. P., Choe, N. H., Halter, J., Foster, M., Froyd, J., Borrego, M., & Winterer, E. R. (2018). Interventions supporting baccalaureate achievement of Latinx STEM students matriculating at 2-year institutions: A systematic review. Journal of Research in Science Teaching, 56(4), 440–464.
  • Massey, W. E. (1992). A success story amid decades of disappointment. Science, 258(5085), 1177–1180. https://doi.org/10/b7ww4t
  • Maton, K. I., Pollard, S. A., McDougall Weise, T. V., & Hrabowski, F. A. (2012). Meyerhoff Scholars Program: A strengths-based, institution-wide approach to increasing diversity in science, technology, engineering, and mathematics. Mount Sinai Journal of Medicine, 79(5), 610–623. https://doi.org/10/ghbrdh
  • Maton, K. I., Sto Domingo, M. R., Stolle-McAllister, K. E., Zimmerman, J. L., & Hrabowski, F. A. (2009). Enhancing the number of African Americans who pursue STEM PhDs: Meyerhoff Scholarship Program outcomes, processes, and individual predictors. Journal of Women and Minorities in Science and Engineering, 15(1), 15–37. https://doi.org/10/bj9557
  • McKenna, P. G., & Lewis, V. (1986). Tapping potential: Ten steps for retaining underrepresented students. Journal of College Student Personnel, 27(5), 452–453.
  • Miller, C. J., Smith, S. N., & Pugatch, M. (2020). Experimental and quasi-experimental designs in implementation research. Psychiatry Research, 283, 112452. https://doi.org/10/ggkmjv
  • Moore, J. L. (2006). A qualitative investigation of African American males’ career trajectory in engineering: Implications for teachers, school counselors, and parents. Teachers College Record, 108(2), 246. https://doi.org/10/b4gvk8
  • Moust, J. H., & Schmidt, H. G. (1994). Facilitating small-group learning: A comparison of student and staff tutors’ behavior. Instructional Science, 22(4), 287–301. https://doi.org/10/bpk7tc
  • National Academies of Sciences, Engineering, and Medicine. (2018). How people learn II: Learners, contexts, and cultures. Washington, DC: National Academies Press.
  • National Academy of Sciences. (2010). Expanding underrepresented minority participation: America’s science and technology talent at the crossroads and the expansion of the science and engineering workforce pipeline. Washington, DC: National Academies Press.
  • National Research Council. (1992). Combining information: Statistical issues and opportunities for research. Washington, DC: National Academies Press. https://doi.org/10.17226/20865
  • National Science Board. (2015). Revisiting the STEM workforce: A companion to science and engineering indicators 2014 (NSB-2015-10). Retrieved March 30, 2021, from https://www.nsf.gov/nsb/publications/2015/nsb201510.pdf
  • National Science Board. (2018). Major switching among first-time postsecondary students beginning 4-year colleges and universities in 2011–12: 2013–14 (Table B-12). Science and Engineering Indicators 2018. Retrieved March 30, 2021, from https://www.nsf.gov/statistics/2018/nsb20181/data/tables
  • O’Boyle, E. H., Banks, G. C., & Gonzalez-Mulé, E. (2017). The chrysalis effect: How ugly initial results metamorphosize into beautiful articles. Journal of Management, 43(2), 376–399. https://doi.org/10/gf2k59
  • Ostrove, J. M., & Long, S. M. (2007). Social class and belonging: Implications for college adjustment. Review of Higher Education, 30(4), 363–389. https://doi.org/10/ghbrjm
  • Peske, H. G., & Haycock, K. (2006). Teaching inequality: How poor and minority students are shortchanged on teacher quality. Education Trust. Retrieved March 30, 2021, from https://edtrust.org/wp-content/uploads/2013/10/TQReportJune2006.pdf
  • Powell, M. G., Hull, D. M., & Beaujean, A. A. (2020). Propensity score matching for education data: Worked examples. Journal of Experimental Education, 88(1), 145–164. https://doi.org/10/ggf5gz
  • President’s Council of Advisors on Science and Technology. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics (No. ED541511). Washington, DC: U.S. Government Office of Science and Technology.
  • Reyes, M.-E. (2011). Unique challenges for women of color in STEM transferring from community colleges to universities. Harvard Educational Review, 81(2), 241–263. https://doi.org/10/gg45cn
  • Roche, M. W. (2010). Why choose the liberal arts? Notre Dame, IN: University of Notre Dame Press.
  • Rodger, S., & Tremblay, P. F. (2003). The effects of a peer mentoring program on academic success among first year university students. Canadian Journal of Higher Education, 33(3), 1–17.
  • Rogers, J., & Révész, A. (2019). Experimental and quasi-experimental designs. In The Routledge handbook of research methods in applied linguistics. New York, NY: Routledge.
  • Sablan, J. R. (2014). The challenge of summer bridge programs. American Behavioral Scientist, 58(8), 1035–1050. https://doi.org/10.1177/0002764213515234
  • Scott, T. P., Thigpin, S. S., & Bentz, A. O. (2017). Transfer learning community: Overcoming transfer shock and increasing retention of mathematics and science majors. Journal of College Student Retention: Research, Theory & Practice, 19(3), 300–316. https://doi.org/10/ghbhg9
  • Seymour, E., & Hewitt, N. M. (1997). Talking about leaving: Why undergraduates leave the sciences. Boulder, CO: Westview.
  • Simonsohn, U., Nelson, L. D., & Simmons, J. P. (2014). P-curve: A key to the file-drawer. Journal of Experimental Psychology: General, 143(2), 534–547. https://doi.org/10/gffnn9
  • Smith, P. E. (2017, June). A multi-program approach to student retention and success. Paper presented at the 2017 ASEE Annual Conference & Exposition, Columbus, OH.
  • Stanich, C. A., Pelch, M. A., Theobald, E. J., & Freeman, S. (2018). A new approach to supplementary instruction narrows achievement and affect gaps for underrepresented minorities, first-generation students, and women. Chemistry Education Research and Practice, 19(3), 846–866. https://doi.org/10/ggk5xn
  • Steele, C. M., & Aronson, J. (1995). Stereotype threat and the intellectual test performance of African Americans. Journal of Personality and Social Psychology, 69(5), 797–811. https://doi.org/10/cj4g2w
  • Sterne, J. A. C., Egger, M., & Smith, G. D. (2001). Investigating and dealing with publication and other biases in meta-analysis. British Medical Journal, 323(7304), 101–105.
  • Stolle-McAllister, K. (2011). The case for summer bridge: Building social and cultural capital for talented black STEM students. Science Educator, 20(2), 12–22.
  • Stuart, E. A., & Rubin, D. B. (2008). Best practices in quasi-experimental designs: Matching methods for causal inference. In Osborne, J. W. (Ed.), Best practices in quantitative methods. Thousand Oaks, CA: Sage.
  • Thompson, M., & Consi, T. (2008). Engineering outreach through college pre-orientation programs: MIT Discover Engineering. Journal of STEM Education, 8(3), 75–82.
  • Tinto, V. (1999). Taking retention seriously: Rethinking the first year of college. NACADA Journal, 19(2), 5–9. https://doi.org/10/ghbrg9
  • Tomasko, D. L., Ridgway, J. S., Waller, R. J., & Olesik, S. V. (2016). Association of summer bridge program outcomes with STEM retention of targeted demographic groups. Journal of College Science Teaching, 45(4), 90–99. https://doi.org/10.2505/4/jcst16_045_04_90
  • Torres, A. (2000). Rethinking the model. In Campbell, G., Denes, R., & Morrison, C. (Eds.), Access denied: Race, ethnicity, and the scientific enterprise (pp. 219–221). New York, NY: Oxford University Press.
  • Tsui, L. (2007). Effective strategies to increase diversity in STEM fields: A review of the research literature. Journal of Negro Education, 76(4), 555–581.
  • Veenstra, C. P., Dey, E. L., & Herrin, G. D. (2009). A model for freshman engineering retention. Advances in Engineering Education, 1(3), 1–33.
  • Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36(3), 1–48. https://doi.org/10/gckfpj
  • Whalin, R., Pang, Q., Lowe, L. S. N., & Latham, J. H. (2017, June 24). Assessment of a summer bridge program: Seven years and counting. Paper presented at the Minorities in Engineering Division Technical Session 2, 2017 ASEE Annual Conference & Exposition, Columbus, OH. Retrieved March 30, 2021, from https://peer.asee.org/27631
  • Wheatland, J. A., Jr. (2001). The relationship between attendance at a summer bridge program and academic performance and retention status of first-time freshman science, engineering, and mathematics students at Morgan State University, an historically Black university (Unpublished dissertation). Morgan State University, Baltimore, MD.
  • Wilson, R. (2000). Barriers to minority success in college science, mathematics, and engineering programs. In Campbell, G., Denes, R., & Morrison, C. (Eds.), Access denied: Race, ethnicity, and the scientific enterprise (pp. 193–206). New York, NY: Oxford University Press.
  • Wilson, Z. S., Holmes, L., deGravelles, K., Sylvain, M. R., Batiste, L., Johnson, M., ... & Warner, I. M. (2012). Hierarchical mentoring: A transformative strategy for improving diversity and retention in undergraduate STEM disciplines. Journal of Science Education and Technology, 21(1), 148–156. https://doi.org/10/c8vz99
  • Wischusen, S. M., Wischusen, E. W., & Pomarico, S. M. (2011). Impact of a short pre-freshman program on retention. Journal of College Student Retention: Research, Theory & Practice, 12(4), 429–441. https://doi.org/10/fqdvrp
  • Wolniak, G., Seifert, T. A., & Blaich, C. F. (2004). A liberal arts education changes lives: Why everyone can and should have this experience. Liberal Arts Online, 4(3).
  • Yelamarthi, K., & Mawasha, P. (2008). A pre-engineering program for the under-represented, low-income and/or first generation college students to pursue higher education. Journal of STEM Education, 9(3), 5–15.
  • Zuo, C., Mulfinger, E., Oswald, F. L., & Casillas, A. (2018). First-generation college student success. In Feldman, R. S. (Ed.), The first year of college: Research, theory, and practice on improving the student experience and increasing retention (pp. 55–90). New York, NY: Cambridge University Press. https://doi.org/10.1017/9781316811764.004
  • Zydney, A. L., Bennett, J. S., Shahid, A., & Bauer, K. W. (2002). Impact of undergraduate research experience in engineering. Journal of Engineering Education, 91(2), 151–157. https://doi.org/10/ghbrgz