
Building Better Bridges into STEM: A Synthesis of 25 Years of Literature on STEM Summer Bridge Programs

    Published Online: https://doi.org/10.1187/cbe.17-05-0085

    Abstract

    Summer bridge programs are designed to help transition students into the college learning environment. Increasingly, bridge programs are being developed in science, technology, engineering, and mathematics (STEM) disciplines because of the rigorous content and lower student persistence in college STEM compared with other disciplines. However, to our knowledge, a comprehensive review of STEM summer bridge programs does not exist. To provide a resource for bridge program developers, we conducted a systematic review of the literature on STEM summer bridge programs. We identified 46 published reports on 30 unique STEM bridge programs that have been published over the past 25 years. In this review, we report the goals of each bridge program and whether the program was successful in meeting these goals. We identify 14 distinct bridge program goals that can be organized into three categories: academic success goals, psychosocial goals, and department-level goals. Building on the findings of published bridge reports, we present a set of recommendations for STEM bridge programs in hopes of developing better bridges into college.

    INTRODUCTION

    Summer bridge programs are typically developed to facilitate students’ transition into college and improve student academic success. These programs are often multiweek intensive experiences that occur in the weeks before a student’s first year of college or a transfer student’s entry into a 4-year institution. Such programs are commonly referred to as summer bridge programs, boot camps, summer programs, or college prep programs. Over the past decade, there have been many national calls to increase the retention of students in science, technology, engineering, and mathematics (STEM) and to enhance the diversity of STEM professionals (National Academy of Sciences, National Academy of Engineering, and Institute of Medicine, 2010; American Association for the Advancement of Science [AAAS], 2011; President’s Council of Advisors on Science and Technology, 2012; Chen, 2013). As such, there has been increasing interest in developing bridge programs for STEM disciplines to prepare students to meet the challenging academic demands of these degree programs and to address factors that may contribute to the attrition of students in STEM (Seymour and Hewitt, 1997).

    Bridge programs take substantial time, effort, and resources due to their immersive nature, so a question that often arises is, What is the impact of bridge programs on students? Surprisingly, there are few review articles of bridge programs, and the ones that exist only summarize a subset of bridge programs (e.g., Kezar, 2000; Sablan, 2014). To our knowledge, there has been no comprehensive review of the literature on STEM bridge programs. To meet this need, we summarize the literature on bridge programs that target students who are entering or are interested in entering STEM majors. We describe characteristics of existing bridge programs, identify the goals of these programs, and highlight to what extent these programs report success in achieving these goals. We also identify gaps in the current literature on bridge programs and provide a set of recommendations for the STEM community to build better bridges into college for incoming students. We hope that this summary of the current landscape of STEM bridge programs can serve as a useful resource for bridge program developers. Further, we hope this review will stimulate thoughtful consideration of how to improve existing bridge programs and better evaluate the impact of bridge programs on students.

    METHODS

    We conducted a literature search for published reports that describe STEM bridge programs. In addition to the published literature, we searched the “gray literature,” or non–peer reviewed literature, primarily in the form of unpublished dissertations and conference proceedings. Many bridge programs have not led to a peer-reviewed publication in a journal, so we chose to include reports that have not been peer reviewed in a journal (e.g., conference abstracts). We did this so that we could provide readers with the maximum amount of information about current bridge programs and avoid focusing only on programs that were most likely to produce favorable or statistically significant results (Rothstein et al., 2005). However, we recognize that there are likely many bridge programs for which no reports exist, and as such, these programs are not included in our review. Specifically, programs developed at master’s colleges and universities, baccalaureate colleges, and community colleges may be underrepresented in this review compared with programs at PhD-granting research institutions, because faculty at research institutions have greater incentives to publish their findings. Between November 2016 and March 2017, we searched three online databases (Google Scholar, ERIC, and Arizona State University [ASU] library One Search) using an array of terms, including: bridge, bridge program, summer bridge, summer bridge program, summer program, boot camp, summer boot camp, summer college program, college preparation program, and summer college preparation program. For consistency, we refer to all such programs as “bridge programs” in this review. Further, we used a snowball approach to find additional programs that were referenced in the papers on bridge programs (Biernacki and Waldorf, 1981). We included STEM bridge programs from 1992 (the earliest report we could find) to 2016. The search yielded 46 published reports on 30 unique programs; although we have tried to be as comprehensive as possible, we acknowledge that we may have missed some published reports.

    To identify characteristics of each STEM bridge program, two authors (M.A. and K.M.C.) independently reviewed each bridge report and recorded the subject of the program, the type of institution that offered the program, whether the bridge program was designed for first-year students or transfer students, the length of the program, and whether the program targeted a specific population of students as recorded in the report. The reviewers compared their findings and came to consensus about any discrepancies.

    To identify the goals of each STEM bridge program, one author (M.A.) reviewed each bridge report; noted any goal of the STEM bridge program as described in the report; and recorded how, if at all, the bridge report indicated that the program goal was measured and whether the program was successful at reaching that goal. A rubric was created to describe each program goal and whether bridge reports indicated that 1) the program succeeded in meeting the program goal, 2) the program did not succeed in meeting the program goal, or 3) the program goal was not measured in the report (see Supplemental Table 1 for a copy of the rubric). In this review, we include any goal that was reported by at least three bridge programs (10% of all programs). It is important to note that, in determining whether a program was successful at achieving its goals, we relied on the interpretations of the data made by the authors of the bridge reports; however, we do highlight some possible issues with methodologies and conclusions. As such, some of these findings about the positive impacts of bridge programs should be interpreted with caution. To establish coding reliability, another author (J.M.C.) reviewed a subset of eight STEM bridge reports and used the rubric to code which goals each program reported and whether the bridge report indicated that the program was successful in meeting those goals. The two authors (M.A. and J.M.C.) compared their coding, discussed any discrepancies, and revised the rubric. This process was repeated on two additional subsets of eight reports until the authors’ consensus estimate was greater than 95% (Stemler, 2004). The two authors then separately reviewed every program and, using the revised rubric, recorded which goals were reported, which goals were measured, and whether the program reported successfully meeting each goal. The reviewers compared their final codes, achieving a consensus estimate of 99%, and came to consensus about any final discrepancies.
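    As a concrete illustration of the consensus-estimate calculation described above, the sketch below computes simple percent agreement between two coders (Stemler, 2004). This is a minimal sketch in Python, not the authors’ actual analysis; the coding-sheet layout and column names are hypothetical.

        # Minimal sketch of a consensus estimate (percent agreement) between
        # two coders; hypothetical coding sheet, not the authors' actual data.
        import pandas as pd

        def consensus_estimate(codes_a: pd.Series, codes_b: pd.Series) -> float:
            """Percent of items on which the two coders agree exactly."""
            if len(codes_a) != len(codes_b):
                raise ValueError("Coders must rate the same set of items")
            return float((codes_a.values == codes_b.values).mean() * 100)

        # One row per (program, goal) pair; codes follow the rubric:
        # "S" = success reported, "U" = lack of success, "*" = not measured.
        sheet = pd.DataFrame({
            "coder_MA":  ["S", "*", "S", "U", "*", "S"],
            "coder_JMC": ["S", "*", "S", "U", "S", "S"],
        })
        agreement = consensus_estimate(sheet["coder_MA"], sheet["coder_JMC"])
        print(f"Consensus estimate: {agreement:.1f}%")  # below 95% -> revise rubric and recode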

    PART 1. CURRENT LANDSCAPE OF STEM BRIDGE PROGRAMS

    To our knowledge, there have been 46 published reports on 30 unique STEM bridge programs from 1992, when the earliest report was published, through 2016. Twenty-six of these publications were in peer-reviewed journals (Figure 1A). Over the past 25 years, there has been an increase in the number of published reports on STEM bridge programs, including a rise in the number of peer-reviewed publications in recent years (Figure 1B). STEM bridge programs have been developed for students entering any STEM field (n = 10, 33%), but there are also programs designed for students entering or interested in a specific STEM major; engineering-specific programs are the most common (n = 15, 50%). There are only two chemistry-specific programs (College of Saint Benedict and Saint John’s University FoCuS program and Monmouth College’s SOFIA program), one biology-specific program (Louisiana State University’s [LSU] BIOS program), one geoscience-specific program (Kapiolani Community College’s HaKilo program), and one nursing-specific program (University of Cincinnati’s Leadership 2.0 program) with at least one published report (Figure 2A). STEM bridge programs with published reports have been developed at public doctoral-granting institutions (n = 23, 77%), private doctoral-granting institutions (n = 2, 7%), private master’s degree–granting institutions (n = 2, 7%), and private bachelor’s degree–granting institutions (n = 3, 10%; Figure 2B). Most STEM bridge programs with published reports focus on incoming first-year students (n = 28, 93%); however, two programs focus on students who transfer from 2-year institutions to 4-year institutions (7%; Figure 2C). Bridge programs vary widely in length, lasting from 3 days to 8 weeks; one online bridge program allows students to participate for as long as they want between when they attend new student orientation and when they begin college classes (Figure 2D). These differences among programs highlight that there is no set formula for creating a STEM bridge program, which makes it difficult to draw conclusions about which aspects of bridge programs may lead to positive outcomes.

    FIGURE 1.

    FIGURE 1. Reports on bridge programs published from 1992 to 2016. (A) There have been 46 reports published on STEM bridge programs, the earliest of which was published in 1992. Twenty-six reports (57%) have been published in peer-reviewed journals, and 20 reports (43%) have been published in other formats (e.g., conference papers, theses). (B) There has been an increase in the number of bridge reports published since 1992, with the majority of the peer-reviewed reports published since 2006.

    FIGURE 2.

    FIGURE 2. Characteristics of all STEM bridge programs. (A) STEM bridge programs are geared toward a variety of disciplines. While many STEM bridge programs are open to students across all STEM disciplines (33%), half of the programs are specific to engineering students (50%). (B) STEM bridge programs are offered at private doctoral-granting, private master’s degree–granting, and private bachelor’s degree–granting institutions, although they are primarily implemented at public, doctoral-granting institutions (77%). (C) The majority of STEM bridge programs are designed for first-year students who are entering a 4-year university directly from high school (93%); only two programs (7%) are designed for students transferring from a 2-year college into a 4-year university. (D) The length of STEM bridge programs varies widely, from 3 days to 8 weeks, with one online bridge program having no set length, because students are able to participate for as long as they like between when they attend orientation and the first day of classes. (E) STEM bridge programs target students from varying backgrounds: 50% of programs specified targeting URM students, 20% specified targeting women, 20% specified targeting academically underprepared students, and 10% specified targeting highly academically prepared students.

    Of all the bridge programs with published reports, some explicitly welcome any incoming student (n = 3, 10%) or do not specify a specific target population (n = 4, 13%). More commonly, programs target underrepresented minority (URM) students (n = 15, 50%; e.g., Ohio State University’s PREFACE Program), female students (n = 6, 20%; e.g., Bowling Green State University’s AIMS program), academically underprepared students (n = 6, 20%; e.g., Syracuse University’s Summer Bridge Program), or highly academically prepared students (n = 3, 10%; e.g., Monmouth College’s SOFIA program; Figure 2E). URM students and female students are often preferentially recruited into STEM bridge programs because they are historically underrepresented in STEM disciplines (National Science Foundation [NSF], 2017). Students with low academic preparation are often at most risk for attrition from college, particularly during the first year of college (Chen, 2013), so programs may focus on these students to improve overall retention rates. Notably, some programs focus on only high academic–ability URM students (e.g., University of Maryland Baltimore County’s [UMBC] Meyerhoff Scholars Program), but no programs with published reports explicitly report focusing only on low academic–ability URM students. A list of the 30 STEM bridge programs with characteristics of each program is shown in Table 1.

    TABLE 1. Individual bridge program characteristicsa

    Program | Subject of program | Institution type | First-year (F) or transfer (T) students | Length | Target population of students | Number of published reports | Reports in peer-reviewed journals | Reports in other formats
    Bowling Green State University: Academic Investment in Math and Science (AIMS) Summer Bridge Program | Across STEM | Public, doctoral-granting university | F | 5 weeks | URM, females | 1 | Gilmer, 2007 | –
    Claremont Colleges: Summer Science Immersion Program (SScIP) | Across STEM | Private, bachelor’s degree–granting institution | F | 1 week | URM, females | 1 | Bhattacharya and Hansen, 2015 | –
    College of Saint Benedict and Saint John’s University: Future Chemists Scholarships and Support (FoCuS) Program | Chemistry | Private, bachelor’s degree–granting institution | F | 7 weeks | URM, females, highly academically prepared | 2 | Graham et al., 2017b | Graham et al., 2013
    Kapiolani Community College (KCC): School of Ocean and Earth Science and Technology (SOEST) Summer Bridge: HaKilo | Geoscience | Public, doctoral-granting university | T | 1 week | URM | 1 | Bruno et al., 2016 | –
    Louisiana State University (LSU): Biology Intensive Orientation for Students (BIOS) Program | Biology | Public, doctoral-granting university | F | 5 days | Open to anyone | 3 | Wischusen and Wischusen, 2007; Wischusen et al., 2011; Wheeler and Wischusen, 2014 | –
    Massachusetts Institute of Technology: Discover Engineering | Engineering | Private, doctoral-granting university | F | 4–5 days | Did not specify | 1 | Thompson and Consi, 2007 | –
    Middle Tennessee State University: Precollege summer bridge program | Across STEM | Public, doctoral-granting university | F | 10 days | Academically underprepared | 1 | Raines, 2012 | –
    Monmouth College: Summer Opportunities for Intellectual Activities (SOFIA) program | Chemistry | Private, bachelor’s degree–granting institution | F | 3 weeks | Highly academically prepared | 2 | Moore et al., 2016 | Moore, 2013
    Ohio State University: National Science Foundation Science and Engineering Talent Expansion Program (OSTEP) and Pre-Freshman and Co-operative Education (PREFACE) program | Across STEM | Public, doctoral-granting university | F | 6 weeks | URM | 2 | Tomasko et al., 2016 | Tomasko et al., 2013
    Syracuse University: Summer Bridge Program | Engineering | Private, doctoral-granting university | F | 6 weeks | Academically underprepared | 1 | Doerr et al., 2014 | –
    University of Alabama: Engineering Math Advancement Program (E-MAP) | Engineering | Public, doctoral-granting university | F | 5 weeks | Did not specify | 2 | Gleason et al., 2010 | Boykin et al., 2010
    University of California–Santa Barbara: Expanding Pathways in Science, Engineering, and Mathematics (EPSEM) | Across STEM | Public, doctoral-granting university | T | 2 weeks | URM | 1 | Lenaburg et al., 2012 | –
    University of Cincinnati: Leadership 2.0 program | Nursing | Public, doctoral-granting university | F | 6 weeks | URM | 1 | Pritchard et al., 2016 | –
    University of Maryland Baltimore County (UMBC): Meyerhoff Scholars Program | Across STEM | Public, doctoral-granting university | F | 6 weeks | URM, highly academically prepared | 8 | Hrabowski and Maton, 1995; Maton et al., 2000, 2009, 2012, 2016; Stolle-McAllister, 2011; Stolle-McAllister et al., 2011; Summers and Hrabowski, 2006 | –
    University of Memphis: STEM Talent Expansion Program (STEP) Summer Mathematics Bridge Bootcamp | Across STEM | Public, doctoral-granting university | F | 2 weeks | Did not specify | 2 | Russomanno et al., 2010; Windsor et al., 2015 | –
    University of Wisconsin Milwaukee: College of Engineering and Science (CEAS) Summer Bridge Program | Engineering | Public, doctoral-granting university | F | 4 weeks | Academically underprepared | 1 | Reisel et al., 2012 | –
    Arizona State University (ASU): Women in Applied Sciences and Engineering (WISE) | Engineering | Public, doctoral-granting university | F | 5 days | Females | 2 | – | Fletcher et al., 2001a,b
    ASU: Minority Engineering Program (MEP) Summer Bridge Program | Engineering | Public, doctoral-granting university | F | 10 days | URM | 2 | – | Reyes et al., 1998, 1999
    Morgan State University: Alliance for Minority Participation Math Bridge Program and National Aeronautical Space Administration Morgan Engineering Enrichment Program | Across STEM | Public, doctoral-granting university | F | 4 weeks | URM | 1 | – | Wheatland, 2000
    Pennsylvania State University: Pre-First Year Engineering and Science (PREF) program | Engineering | Public, doctoral-granting university | F | 6 weeks | URM | 1 | – | Persaud and Freeman, 2005
    Purdue University: Mathematics Summer Bridge Program | Engineering | Public, doctoral-granting university | F | 1 week | URM | 1 | – | Budny, 1992
    Saint Edward’s University: Community for Achievement in Science, Academics and Research (CASAR) | Across STEM | Private, master’s degree–granting institution | F | 1 week | URM | 1 | – | Kopec and Blair, 2014
    Southern Illinois University–Carbondale: Success week | Engineering | Public, doctoral-granting university | F | 3 days | Did not specify | 1 | – | Chevalier et al., 2001
    Texas A&M: Personalized Precalculus Program (PPP) | Engineering | Public, doctoral-granting university | F | 3 weeks | Academically underprepared | 1 | – | Nite et al., 2015
    University of Florida: Engineering Freshman Transition Program (EFTP) | Engineering | Public, doctoral-granting university | F | 6 weeks | Open to anyone | 1 | – | Citty and Lindner, 2012
    University of Missouri–Saint Louis and Washington University: McDonnell Douglas Access to Engineering | Engineering | Public, doctoral-granting university | F | 8 weeks | URM, females | 1 | – | Shields et al., 1996
    University of New Mexico: Summer Bridge Program | Across STEM | Public, doctoral-granting university | F | 4 weeks | URM, females | 1 | – | Ami, 2001
    University of North Carolina at Charlotte: Engineering Boot Camp | Engineering | Public, doctoral-granting university | F | No set length | Open to anyone | 1 | – | Harkins, 2016
    University of Portland: Summer Bridge | Engineering | Private, master’s degree–granting institution | F | 6 weeks | Academically underprepared | 1 | – | Cairncross et al., 2015
    Wayne State University: Summer Bridge Program | Engineering | Public, doctoral-granting university | F | 8 weeks | Academically underprepared | 1 | – | Grimm, 2005

    aThere are 30 STEM bridge programs with published reports. Most programs target first-year students (F) who are entering a 4-year university from high school, though some programs target transfer students (T) transferring from a 2-year college into a 4-year university. Nine of the programs have multiple reports published about them; the two programs with the most reports, UMBC’s Meyerhoff Scholars Program and LSU’s BIOS program, make up 24% of the total publications on bridge programs, with 11 published reports between them.

    bPublished electronically in December of 2016 and therefore included in the review.

    PART 2. GOALS, OUTCOMES, AND SUCCESSES OF STEM BRIDGE PROGRAMS

    What are the goals of these STEM bridge programs, and how, if at all, do programs measure how successful they are at achieving such goals? We have identified program goals articulated by at least three bridge programs and categorized them into three main areas: academic success goals, psychosocial goals, and department-level goals. Many of the goals of bridge programs are well aligned with educational theories centered on undergraduate student success and retention. For example, Tinto’s theory of college student departure suggests that students must experience social integration (e.g., form relationships with peers and faculty) and academic integration (e.g., achieve good grades) to maximize their chances of graduating from college (Tinto, 1975, 1987).

    We have also identified research designs and assessment strategies that evaluators use to assess bridge student outcomes. While evaluators commonly rely on quantitative methodologies such as surveys or tests to collect data, qualitative measures such as interviews or focus groups can also be used to provide insight into students’ thought processes and experiences. Bridge program evaluators tend to use a pre–post research design or a comparison group design to assess student gains. In a pre–post design, bridge evaluators measure students’ academic success or affect at the beginning and end of the program. However, the specific measures used in pre–post designs (e.g., one item from a survey, a previously validated survey, student interviews) and the extent to which evaluators use statistical analyses to determine the effect size and significance of bridge student gains vary widely. Another way to assess the outcomes of bridge programs is to compare the gains of bridge students with the gains of a comparison group of students who did not participate in the program. However, because students usually self-select into bridge programs and often need to meet specific criteria to be eligible for the program, evaluators are usually unable to conduct randomized experiments. Thus, evaluators use quasi-experimental designs and can reduce bias by controlling for the variation between bridge students and the comparison group of students. However, the extent to which evaluators control for potentially confounding variables and the analyses they use vary extensively. For example, some evaluators compare bridge students with a larger group of students with similar demographics, while other evaluators choose to compare bridge students with a matched-pair group, with individual non–bridge students matched with individual bridge students using characteristics such as race/ethnicity, gender, and high school grade point average (GPA). The number of covariates used in the match, the criteria for determining whether the bridge and non–bridge students are appropriately matched, and the technique that evaluators use to create matched-pair groups (e.g., propensity score matching, using statistical tests to compare group averages) also vary extensively.
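    To make the matched-pair designs described above concrete, the sketch below pairs each bridge participant with the nonparticipant who has the closest propensity score, estimated from covariates such as gender, URM status, and high school GPA. This is a minimal sketch on simulated data, not code from any bridge report; all variable names are hypothetical, and a real evaluation would also check covariate balance after matching.

        # Minimal propensity-score-matching sketch on simulated data.
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 400
        hs_gpa = rng.normal(3.2, 0.4, n).clip(0.0, 4.0)
        female = rng.integers(0, 2, n)
        urm = rng.integers(0, 2, n)
        # Simulated self-selection: lower-GPA and URM students are more likely
        # to enroll in the hypothetical bridge program.
        p_enroll = 1 / (1 + np.exp(2.5 * (hs_gpa - 3.2) - 0.8 * urm))
        bridge = (rng.random(n) < p_enroll).astype(int)
        first_year_gpa = (0.6 * hs_gpa + 0.15 * bridge
                          + rng.normal(0, 0.4, n)).clip(0.0, 4.0)
        df = pd.DataFrame({"bridge": bridge, "female": female, "urm": urm,
                           "hs_gpa": hs_gpa, "first_year_gpa": first_year_gpa})

        # 1. Estimate each student's propensity to participate.
        X = df[["female", "urm", "hs_gpa"]]
        df["pscore"] = (LogisticRegression(max_iter=1000)
                        .fit(X, df["bridge"]).predict_proba(X)[:, 1])

        # 2. Greedy 1:1 nearest-neighbor matching without replacement.
        controls = df[df["bridge"] == 0].copy()
        pairs = []
        for i, row in df[df["bridge"] == 1].iterrows():
            if controls.empty:
                break
            j = (controls["pscore"] - row["pscore"]).abs().idxmin()
            pairs.append((i, j))
            controls = controls.drop(j)

        # 3. Compare outcomes: naive group difference vs. matched difference.
        b_idx, c_idx = map(list, zip(*pairs))
        naive = (df.loc[df["bridge"] == 1, "first_year_gpa"].mean()
                 - df.loc[df["bridge"] == 0, "first_year_gpa"].mean())
        matched = (df.loc[b_idx, "first_year_gpa"].mean()
                   - df.loc[c_idx, "first_year_gpa"].mean())
        print(f"Naive GPA difference:   {naive:+.2f}")
        print(f"Matched GPA difference: {matched:+.2f}")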

    In the following sections, we outline the goals for STEM bridge programs included in this review, how programs measured these goals, and whether programs reported that they are successful at meeting those goals. We highlight differences in how goals were measured and possible limitations of the study designs where applicable. These findings are summarized in Table 2. We acknowledge that these goals may not be representative of all bridge programs, because not all bridge programs have published reports, so our conclusions are limited to what we can say about the published reports.

    TABLE 2. Bridge programs report a number of goals that can be categorized into three areas: academic success goals, psychosocial goals, and department-level goalsa

    Goal key. Academic success goals: (1) Remediation; (2) Improve content knowledge; (3) Maximize GPA; (4) Increase research participation; (5) Increase retention; (6) Increase graduation rates. Psychosocial goals: (7) Increase interest in the major; (8) Sense of belonging; (9) Sense of preparedness; (10) Self-efficacy; (11) Network with students; (12) Network with faculty. Department-level goals: (13) Recruit students to the major; (14) Enhance diversity of the major.

    Program | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14

    Programs with reports in peer-reviewed journals
    Bowling Green State University: Academic Investment in Math and Science (AIMS) Summer Bridge Program | * | * | S | S | S | S | * | * | * | * | * | * | * | *
    Claremont Colleges: Summer Science Immersion Program (SScIP) | * | * | * | * | * | * | * | * | * | * | * | * | * | *
    College of Saint Benedict and Saint John’s University: Future Chemists Scholarships and Support (FoCuS) Program | * | * | S | S | * | * | * | * | * | * | * | * | * | *
    Kapiolani Community College (KCC): School of Ocean and Earth Science and Technology (SOEST) Summer Bridge: HaKilo | * | * | * | * | * | * | S | * | * | S | * | * | S | S
    Louisiana State University (LSU): Biology Intensive Orientation for Students (BIOS) Program | * | * | S | * | S | * | * | * | * | S | S | * | * | *
    Massachusetts Institute of Technology: Discover Engineering | * | * | * | * | * | * | S | * | * | * | S | S | S | S
    Middle Tennessee State University: Precollege summer bridge program | * | S | S | * | S | * | * | * | S | U | * | * | * | *
    Monmouth College: Summer Opportunities for Intellectual Activities (SOFIA) program | * | * | * | * | * | * | * | * | * | * | * | * | * | *
    Ohio State University: National Science Foundation Science and Engineering Talent Expansion Program (OSTEP) and Pre-Freshman and Co-operative Education (PREFACE) program | * | * | U | * | S | * | * | S | S | * | * | * | * | *
    Syracuse University: Summer Bridge Program | * | S | S | * | * | * | * | * | * | * | * | * | * | *
    University of Alabama: Engineering Math Advancement Program (E-MAP) | S | * | U | * | S | * | * | * | * | * | * | * | * | *
    University of California–Santa Barbara: Expanding Pathways in Science, Engineering, and Mathematics (EPSEM) | * | * | * | * | * | * | S | * | * | * | S | S | * | *
    University of Cincinnati: Leadership 2.0 program | * | * | S | * | S | * | S | S | S | * | S | S | * | *
    University of Maryland Baltimore County (UMBC): Meyerhoff Scholars Program | * | * | S | S | S | S | * | S | * | S | S | S | * | S
    University of Memphis: STEM Talent Expansion Program (STEP) Summer Mathematics Bridge Bootcamp | * | S | S | S | S | * | S | * | S | S | S | * | * | *
    University of Wisconsin Milwaukee: College of Engineering and Science (CEAS) Summer Bridge Program | S | * | * | * | * | * | * | * | * | * | * | * | * | *

    Programs with no reports in peer-reviewed journals
    Arizona State University (ASU): Women in Applied Sciences and Engineering (WISE) | * | * | * | * | S | * | * | * | * | * | * | * | * | S
    ASU: Minority Engineering Program (MEP) Summer Bridge Program | * | * | S | * | S | * | * | * | * | * | * | * | * | S
    Morgan State University: Alliance for Minority Participation Math Bridge Program and National Aeronautical Space Administration Morgan Engineering Enrichment Program | * | * | S | * | S | * | * | * | * | * | * | * | * | *
    Pennsylvania State University: Pre-First Year Engineering and Science (PREF) program | * | * | S | * | * | S | * | * | * | * | * | * | * | *
    Purdue University: Mathematics Summer Bridge Program | S | * | S | * | * | * | * | * | * | * | * | * | * | *
    Saint Edward’s University: Community for Achievement in Science, Academics and Research (CASAR) | * | * | * | * | S | S | * | * | * | * | * | * | * | *
    Southern Illinois University–Carbondale: Success week | * | * | * | * | S | * | * | * | * | * | * | * | * | *
    Texas A&M: Personalized Precalculus Program (PPP) | S | * | U | * | * | * | * | * | * | * | * | * | * | *
    University of Florida: Engineering Freshman Transition Program (EFTP) | * | * | S | * | S | * | * | * | * | * | * | * | * | *
    University of Missouri–Saint Louis and Washington University: McDonnell Douglas Access to Engineering Program | * | * | * | * | * | * | * | * | * | * | * | * | * | *
    University of New Mexico: Summer Bridge Program | S | * | * | * | * | * | * | * | * | * | * | * | * | *
    University of North Carolina at Charlotte: Engineering Boot Camp | * | * | S | * | S | * | * | * | * | * | * | * | * | *
    University of Portland: Summer bridge | S | * | S | * | * | * | * | * | * | S | S | * | * | *
    Wayne State University: Summer Bridge Program | * | * | * | * | * | * | * | * | * | * | * | * | * | *

    a✓, published report on the program explicitly stated the program goal; –, program goal was not explicitly stated in a publication; S, program reported success in meeting the goal; U, program reported lack of success in meeting the goal; *, program did not report measuring the goal.

    Academic Success Program Goals

    Remediation: Providing Students with Foundational Knowledge in a STEM Domain.

    Incoming students may not be academically prepared for the level of difficulty of college STEM course work, so some bridge programs try to remediate students to meet the requirements for entry into the major (Chen, 2013). Most STEM bridge programs with this goal target academically underprepared students and often use a placement test or incoming Scholastic Aptitude Test (SAT) or ACT score to determine whether students are eligible for the program. Many programs use a pre–post research design to measure remediation by having students take a placement exam before or at the beginning of the program and at the end of the program. Some programs determined they were successful if students placed into a higher-level course based on their placement exam score after participating in the program. This remediation often focused on basic math skills, so students would take a math placement test at the end of the program to measure whether they were better prepared to enter their STEM majors.

    Five STEM bridge programs had remediation as a goal of their program; all five programs assessed student remediation and concluded that they were successful in remediating students (Ami, 2001; Boykin et al., 2010; Gleason et al., 2010; Reisel et al., 2012; Cairncross et al., 2015; Nite et al., 2015). One additional program reported successfully remediating students, but was explicit that remediation was not an original goal of the program (Budny, 1992).

    Improving Student Content Knowledge in a Discipline.

    First-year college STEM courses are notoriously difficult, and students often struggle with the sheer amount of information presented in introductory courses (AAAS, 2011; Brownell et al., 2014). To give students a head start on the content they will encounter during their first semester in college, some bridge programs strive to increase student content knowledge in a particular STEM discipline. Eleven bridge programs reported improving student content knowledge as a goal of the program, and three programs reported measuring student content knowledge. All three used a pre–post assessment of math ability. One of these programs gave students a previously validated pre–post concept inventory on rate of change and found that students achieved significantly higher scores at the end of the bridge program (Doerr et al., 2014). Another program gave students a pre- and posttest with questions taken from the math section of the ACT and showed that bridge participants increased test scores by more than 40% (Raines, 2012). A third bridge program gave students a math pre- and posttest that the program developers created themselves; they found that students’ algebra and trigonometry scores increased by 15% and 14%, respectively (Russomanno et al., 2010).
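    In practice, a pre–post content knowledge analysis of this kind is a paired comparison of the same students’ scores at two time points. The sketch below, using hypothetical scores rather than data from any of the cited reports, shows one common version: a paired t-test along with the percent change in the mean score.

        # Minimal pre-post analysis sketch; all scores are hypothetical.
        import numpy as np
        from scipy import stats

        pre = np.array([52, 61, 47, 58, 66, 43, 55, 60])    # % correct before
        post = np.array([68, 70, 60, 71, 74, 59, 63, 72])   # % correct after

        res = stats.ttest_rel(post, pre)  # paired: same students, two time points
        pct_gain = 100 * (post.mean() - pre.mean()) / pre.mean()
        print(f"Mean pre = {pre.mean():.1f}, post = {post.mean():.1f} "
              f"({pct_gain:+.1f}%); paired t = {res.statistic:.2f}, "
              f"p = {res.pvalue:.3f}")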

    Maximizing Student GPA.

    One way to measure student academic success is to examine students’ GPAs, which are typically obtained by accessing institutional data. Seven bridge programs reported maximizing the GPAs of participating students, relative to students who did not participate, as an explicit goal of the program; five of the seven programs measured student GPAs. An additional 13 programs did not explicitly state maximizing bridge students’ GPAs as a goal, but measured GPA as a way to determine student academic success.

    Fifteen of the 18 programs that measured students’ GPAs (83%) showed that their programs were successful in maximizing bridge student GPAs. It is important to note that programs used different ways of measuring gains in GPA. One program used propensity score matching to compare bridge students with students with similar characteristics (including gender, race/ethnicity, and prior academic ability) who did not participate in the bridge program (Windsor et al., 2015). Another program matched bridge participants to current and historical students who did not participate in the bridge program yet had similar characteristics (including gender, race/ethnicity, and prior academic ability; Maton et al., 2000). One program controlled for both gender and a measure of prior academic ability between bridge students and comparison students who did not participate in the program (Wischusen and Wischusen, 2007). Four additional programs did not use matched pairs, but did control for a measure of prior academic ability of students who did and did not participate in the bridge program (Wheatland, 2000; Gilmer, 2007; Gleason et al., 2010; Doerr et al., 2014).

    However, seven programs compared students who completed the bridge program with students in other courses without any control for prior academic ability (Budny, 1992; Citty and Lindner, 2012; Cairncross et al., 2015; Nite et al., 2015; Pritchard et al., 2016; Tomasko et al., 2016; Graham et al., 2017), and two programs did not use a comparison group at all (Raines, 2012; Harkins, 2016). Because prior academic ability is often a strong predictor of future GPA, it is important to control for a measure of prior academic ability to examine the impact of a program. It is also important to note that students volunteered to participate in all of these programs, so there may be a volunteer effect that is leading to some of these gains (Brownell et al., 2013), because most programs compared students who volunteered to participate in the program with students who did not volunteer to participate. However, one program did compare bridge participants with “declined” students (students who planned to participate, but ended up deciding to attend another institution) to reduce volunteer-bias effects (Maton et al., 2000).
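    The sketch below illustrates on simulated data why controlling for prior academic ability matters: when lower-ability students are more likely to enroll (as in many bridge programs targeting underprepared students), a naive comparison of mean GPAs can mask a real program effect that a regression adjusting for prior ability recovers. This is an illustrative sketch with hypothetical variable names, not an analysis from any of the cited reports.

        # Minimal covariate-adjustment sketch on simulated data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 500
        hs_gpa = rng.normal(3.1, 0.5, n).clip(0.0, 4.0)
        # Lower-GPA students are more likely to join the hypothetical program.
        bridge = (rng.random(n) < 1 / (1 + np.exp(3.0 * (hs_gpa - 3.1)))).astype(int)
        # True simulated effect of the program on college GPA: +0.15.
        college_gpa = (0.6 * hs_gpa + 0.15 * bridge
                       + rng.normal(0, 0.4, n)).clip(0.0, 4.0)
        df = pd.DataFrame({"college_gpa": college_gpa, "bridge": bridge,
                           "hs_gpa": hs_gpa})

        naive = df.groupby("bridge")["college_gpa"].mean().diff().iloc[-1]
        model = smf.ols("college_gpa ~ bridge + hs_gpa", data=df).fit()
        print(f"Naive difference (no control): {naive:+.2f}")
        print(f"Adjusted bridge effect:        {model.params['bridge']:+.2f} "
              f"(p = {model.pvalues['bridge']:.3f})")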

    Increase Research Participation.

    Five STEM bridge programs reported increasing students’ participation in research as a goal. Four of these programs measured students’ participation in research, and all reported successfully meeting this goal. Three of these programs reported the number of their bridge students who went on to participate in undergraduate research (Gilmer, 2007; Russomanno et al., 2010; Windsor et al., 2015; Graham et al., 2017), and another program reported the number of former bridge students who went on to conduct PhD-level research (Summers and Hrabowski, 2006; Maton et al., 2009, 2012, 2016; Stolle-McAllister, 2011; Stolle-McAllister et al., 2011). Of note, most of these programs did not report a change in the percentage of students conducting research after the bridge program was implemented and did not use a comparison group of students. One program did compare rates of successful application to undergraduate research programs between bridge students and students who did not participate in the program, but it did not control for a measure of academic ability between the two groups (Russomanno et al., 2010). Controlling for academic ability would strengthen these results, because academic ability has been shown to predict who is selected into undergraduate research programs (Russell, 2006a,b; Hurtado et al., 2014). The program that reported whether students went on to conduct PhD-level research did use a comparison group of students who had declined attending the program.

    Increase Student Retention.

    Student retention in a major can be defined in a number of different ways: retention through one semester, one year, multiple years, or up until graduation. Retention is usually determined by accessing institutional data on student degree progression. Twenty-one STEM bridge programs had a goal of improving student retention in college, and 12 of these programs measured student retention. An additional three programs reported measuring student retention, though they did not report retention as a goal of their program. Out of the 15 programs that measured retention, all reported having positive impacts on student retention.

    There was variation in the measures for how long students were retained. Three of these programs measured retention at every year after students participated in the program until they graduated (Chevalier et al., 2001; Boykin et al., 2010; Gleason et al., 2010; Tomasko et al., 2013, 2016), and one program measured retention percentages of different cohorts of students at a single time point (Maton et al., 2000). Nine programs measured retention 1 year after the bridge program (Reyes et al., 1998, 1999; Wheatland, 2000; Fletcher et al., 2001a,b; Citty and Lindner, 2012; Raines, 2012; Windsor et al., 2015; Kopec and Blair, 2014; Harkins, 2016; Pritchard et al., 2016), one program measured retention each semester after the bridge program through students’ fourth semester in college (Wischusen and Wischusen, 2007; Wischusen et al., 2011), and one program measured retention after the first year, the third semester, the fifth semester, and the seventh semester (Gilmer, 2007).

    There was also variation in the comparison groups used to evaluate bridge student retention. One program used matched pairs, individually matching students in the bridge program to students with similar characteristics who did not participate (Windsor et al., 2015). One program compared bridge participants with current and historical nonparticipants with similar characteristics (including gender, race/ethnicity, and prior academic ability; Maton et al., 2000). Two programs controlled only for average academic ability between the group of students who participated in the bridge program and the group who did not (Gilmer, 2007; Boykin et al., 2010; Gleason et al., 2010). However, three programs compared bridge students with non–bridge students without controlling for variables, such as gender, race/ethnicity, or prior academic ability, that could predict retention differences between the two groups (Chevalier et al., 2001; Citty and Lindner, 2012; Pritchard et al., 2016). Finally, one program used no comparison group, instead comparing bridge participants’ retention with the first-year retention rate of the university as a whole (Raines, 2012).
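    Because retention is a binary outcome, one common way to implement the controls discussed above is a logistic regression predicting retention from bridge participation plus student covariates. The sketch below fits such a model on simulated data; the variable names are hypothetical, and the code is illustrative rather than drawn from any of the cited evaluations.

        # Minimal retention-analysis sketch: logistic regression on simulated data.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 600
        df = pd.DataFrame({
            "bridge": rng.integers(0, 2, n),     # 1 = participated
            "female": rng.integers(0, 2, n),
            "urm": rng.integers(0, 2, n),
            "hs_gpa": rng.normal(3.0, 0.5, n).clip(0.0, 4.0),
        })
        # Simulated retention: prior ability and participation both matter.
        logit_p = -2.0 + 0.9 * df["hs_gpa"] + 0.4 * df["bridge"]
        df["retained"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        model = smf.logit("retained ~ bridge + female + urm + hs_gpa",
                          data=df).fit(disp=False)
        coef = model.params["bridge"]
        print(f"Bridge coefficient: {coef:.2f} "
              f"(odds ratio = {np.exp(coef):.2f}, p = {model.pvalues['bridge']:.3f})")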

    Increase Student Graduation Rate from College.

    Bridge program developers often stated increased graduation rates as a distinct goal from retention, so we considered it a separate category. Graduation rates are typically obtained from accessing institutional data. Twelve bridge programs had a goal of increasing graduation rates, and four of the programs measured it (Maton et al., 2000, 2009, 2012; Persaud and Freeman, 2005; Summers and Hrabowski, 2006; Gilmer, 2007; Kopec and Blair, 2014). All four of these programs reported successfully increasing graduation rates of bridge students. Notably, one of these programs focused on higher academic–ability students from underrepresented backgrounds, while the other three focused on underrepresented students of no specific academic ability.

    Psychosocial Program Goals

    Increase Interest in the Major.

    One reason that college students choose to leave STEM majors is because of a lack of interest in the discipline (Seymour and Hewitt, 1997). Thus, bridge programs often aim to increase student interest in the major in hopes that increased interest will lead to increased recruitment and persistence. Six STEM bridge programs stated that their goal was to increase student interest in the major. Only two of these programs reported measuring student interest, and both demonstrated success in increasing student interest (Thompson and Consi, 2007; Bruno et al., 2016). Three additional bridge programs measured student interest and reported increasing student interest, though this was not an explicit goal of those programs (Russomanno et al., 2010; Lenaburg et al., 2012; Pritchard et al., 2016).

    Two of the programs that measured bridge student interest in the major used a single-item pre–post measure (Lenaburg et al., 2012; Bruno et al., 2016); two other programs used postprogram surveys to measure student interest. One of these programs used a single-item measure of student interest in the major on the postprogram survey (Russomanno et al., 2010); for the other program, it is not clear from the published paper how many items were used to measure interest (Thompson and Consi, 2007). Another program measured student interest by interviewing nursing students after the program and coding the interviews for student interest; 92% of students were more interested in nursing after the program (Pritchard et al., 2016).

    Improving Student Sense of Belonging.

    Student sense of belonging to a community has been shown to influence student academic motivation, well-being, and academic achievement (reviewed in Anderman and Freeman, 2004; Trujillo and Tanner, 2014). Seven bridge programs cited sense of belonging as a goal, and three of those programs assessed student sense of belonging after the program (Maton et al., 2000, 2012, 2016; Stolle-McAllister, 2011; Stolle-McAllister et al., 2011; Tomasko et al., 2013, 2016; Pritchard et al., 2016). Two of these programs conducted one-on-one or focus group interviews with bridge students, analyzed the interviews for themes of belonging, and used survey items to identify a sense of community (Maton et al., 2000, 2012; Stolle-McAllister, 2011; Stolle-McAllister et al., 2011; Pritchard et al., 2016). The third program assessed students’ written responses to open-ended questions about the impact of the program and coded them for evidence of a sense of belonging (Tomasko et al., 2013, 2016). This program also gave students a pre–post survey that measured constructs related to sense of belonging, such as social support on campus, feelings of connectedness, and being part of a group. All three programs reported that students developed a sense of belonging by participating in the program.

    Increasing Student Sense of Preparedness.

    A student’s sense of preparedness relates to how well a student perceives that he or she is prepared for a degree program. Bridge programs could help prepare students for college, especially if students enter college from high schools that may not have adequately prepared them for the rigors of college. One program examined preparedness for precalculus (Raines, 2012), another program assessed students’ sense of preparedness regarding future STEM courses (Pritchard et al., 2016), and two programs focused on preparedness for college in general (Russomanno et al., 2010; Tomasko et al., 2013, 2016).

    While no programs explicitly reported sense of preparedness as a goal, these four programs did report measuring students’ sense of preparedness, and all reported having a positive impact. Three programs measured this through a single postprogram survey question (Russomanno et al., 2010; Raines, 2012; Pritchard et al., 2016), while the fourth program used a pre–post survey design, although only the postprogram score was provided in the published report (Tomasko et al., 2013, 2016).

    Increasing Student Self-Efficacy.

    Self-efficacy is a student’s perception of his or her ability to complete a task (Bong and Skaalvik, 2003) and has been suggested to be important for retention in science (Seymour and Hewitt, 1997). It is distinct from a sense of preparedness: self-efficacy is the confidence a student has in being able to complete a task, not the extent to which he or she feels prepared for an academic experience. Four programs reported increasing student self-efficacy as a goal, and two of those programs measured student self-efficacy (Bruno et al., 2016; Maton et al., 2016). Additionally, four programs measured self-efficacy even though they did not explicitly state increasing student self-efficacy as a goal (Wischusen and Wischusen, 2007; Russomanno et al., 2010; Raines, 2012; Wheeler and Wischusen, 2014). One program measured student research self-efficacy using six items from a scientific self-efficacy scale, which probes student confidence in performing common research practices such as applying technical skills or generating a research question (Maton et al., 2016). However, this program did not use a pre–post design; it reported students’ research self-efficacy scores but did not measure how the program influenced them. Another program used the self-efficacy subscale of the Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich and De Groot, 1990), which measured students’ perceptions of their ability to succeed in their introductory biology course, in a pre–post design. Surprisingly, students’ self-efficacy decreased over the duration of the program (Wheeler and Wischusen, 2014). While a decrease in self-efficacy could be interpreted as a failure of the program, some literature suggests that overestimating one’s self-efficacy can lower the probability of completing a task and lower student motivation, so a student may benefit from calibrating his or her self-efficacy (Schunk and Pajares, 2009). The authors suggested that such calibration may have occurred and interpreted the decrease as an unintended benefit of the program.

    Networking with Students.

    Building relationships with peers can help students feel that they are part of the college community. Conceptualized as social integration in Tinto’s theory of college student departure (Tinto, 1975), student relationships can lead to an enhanced sense of belonging, sense of community, and retention in a degree program (Tinto, 1987; Cabrera et al., 1993; Gerdes and Mallinckrodt, 1994). Ten STEM bridge programs reported a goal of increasing student relationships. Five of these programs measured this goal using qualitative methods (Maton et al., 2000; Thompson and Consi, 2007; Stolle-McAllister, 2011; Stolle-McAllister et al., 2011; Wischusen et al., 2011; Lenaburg et al., 2012; Pritchard et al., 2016), and one measured it with a postprogram survey on which students rated their agreement with a question about networking with other students (Russomanno et al., 2010). One additional program that did not have networking with students as a goal conducted post–bridge focus group interviews and found that students reported networking with their peers during the bridge program (Cairncross et al., 2015).

    Networking with Faculty.

    Building relationships with faculty can be particularly helpful to students in navigating STEM disciplines, because students sometimes perceive STEM faculty as “chilly” and “unapproachable” (Seymour and Hewitt, 1997). There is evidence to suggest that few college students develop relationships with instructors (Snow, 1973; Pascarella, 1980; Lamport, 1993; Kuh and Hu, 2001). As an example, only 15% of students at a large research institution reported that an instructor knew their names in a typical high-enrollment biology course (Cooper et al., 2017b). Eight STEM bridge programs had a goal of students getting to know faculty. Of the four programs that measured this goal, two programs measured this by coding for themes of students networking with faculty from one-on-one or focus group interview data (Maton et al., 2000; Pritchard et al., 2016), while the other two programs issued a postprogram survey with questions about the extent to which students met and networked with faculty (Thompson and Consi, 2007; Lenaburg et al., 2012). All four programs showed positive gains in student reports of networking with faculty.

    Department-Level Goals

    Recruit Students to the Major.

    Some STEM departments used the bridge program as a way to attract more students to a specific major and advertised the bridge program as a way for students to learn more about the major. Seven STEM bridge programs had a goal of using the program to recruit students, and the two programs that measured this goal were both successful in recruiting more students to the major. One program measured recruitment by counting the total number of students the bridge program recruited (Bruno et al., 2016), and the other program reported an increase in the number of students in the major after the bridge program was implemented (Thompson and Consi, 2007).

    Enhance Diversity in the Major.

    The demographics of the current scientific workforce do not reflect those of the general public, so there have been efforts aimed at increasing the diversity of STEM majors (NSF, 2017). Diversity in STEM has been proposed to be important for ameliorating socioeconomic disparities, increasing the total number of talented scientists, and minimizing bias in scientific findings by having individuals from diverse backgrounds counteract one another’s biases (Intemann, 2009). Although more than half of the STEM bridge programs targeted a group underrepresented in STEM (e.g., URM students or women), only 11 bridge programs (37%) explicitly stated that increasing diversity in a STEM major was a goal of the program. Three of the 11 bridge programs reported measuring diversity and stated that they were successful in enhancing it (Maton et al., 2000, 2012; Summers and Hrabowski, 2006; Stolle-McAllister et al., 2011; Bruno et al., 2016). Two additional programs also measured and reported success in enhancing diversity, though this was not an explicit goal of those programs (Fletcher et al., 2001a,b; Thompson and Consi, 2007). Success was determined by increased numbers of students from underrepresented groups in the major after implementation of the bridge program.

    PART 3. RECOMMENDATIONS FOR FUTURE DIRECTIONS

    While the STEM bridge literature is growing and the number of peer-reviewed journal publications has increased in recent years, we have identified some areas for improvement in the current literature on bridge programs. In the following sections, we outline a set of four recommendations for enhancing the quality of the literature on STEM bridge programs, so bridge program developers can make more informed and evidence-based decisions in creating and refining bridge programs.

    Recommendation 1: A Call to Document and Publish Bridge Program Descriptions, Goals, and Outcomes

    We encourage bridge developers and evaluators to publish their findings in peer-reviewed journals. The process of peer review will help strengthen the literature base for bridge programs so that developers can build on what other programs have learned. We chose to include both peer-reviewed and non–peer reviewed reports in our analysis because, of the 30 STEM bridge programs we identified, only 16 generated peer-reviewed publications. We also acknowledge that some bridge programs have no published reports at all, so they are missing from our analysis. This highlights the need for bridge developers and evaluators to publish on successful bridge programs currently absent from the peer-reviewed literature.

    It is possible that bridge developers and evaluators are not publishing their findings because most of their evaluation is focused internally on improving their programs. Yet, in order to establish evidence for successful bridge programs nationally, it is important for these findings to be documented in peer-reviewed journals that others can easily access. A possible solution to this problem is to move away from only internal evaluation of bridge programs and instead use design-based research (Anderson and Shattuck, 2012). Design-based research explores the impact of interventions that are implemented in educational settings, such as summer bridge programs, to both inform the process of designing and revising such programs and to contribute to our broader understanding of education theory (Anderson and Shattuck, 2012).

    Recommendation 2: Development of a Program Is Iterative, So Report Lessons Learned from Prior (Unsuccessful) Iterations to Guide the Development of More Successful Future Programs

    Design-based research recognizes the complexity of interventions such as bridge programs and acknowledges that design and evaluation are iterative processes that inform each other (Anderson and Shattuck, 2012). There are many moving parts in the development of a successful bridge program, and it is quite possible that the first iteration of a program will not meet its program goals. Much can be learned from formative assessment of bridge programs, particularly pilot versions, so that changes can be made to improve the experience for the first full cohort of students. It is important for bridge developers to realize that, with a limited literature base that is not comprehensive of all types of students or institutions, the first iteration of a program is unlikely to go smoothly. For example, we could not find a single published example of a summer bridge program focused on underrepresented students who are academically underprepared, even though we have met multiple people at conferences who are interested in starting bridge programs for this specific population of students. Without prior evidence to build on, bridge program developers may find that what works well for academically prepared underrepresented students does not work as well for academically underprepared underrepresented students. However, many insights can be gained from “failed” programs with small pilot groups of students, especially if bridge developers can identify specific aspects of the program that did or did not work well. We encourage bridge developers to publish lessons learned from early pilots of bridge programs with enough detail that others may be able to avoid the same issues.

    Unlike some teaching innovations that are relatively low cost and easy to implement, bridge programs tend to require significant time, funding, and resources to develop and implement. The lack of current information available, combined with the zeal for developing these bridge programs, means that we are likely developing new programs without building on the experiences and advice of others. Given the current funding pressures and limited resources on campuses, particularly at public institutions, perhaps we should prioritize funding to programs that are using evidence to design and revise their programs.

    Generally positive gains were reported from bridge programs, and while we do not doubt that positive gains can result from bridge programs, we worry that some of these reported positive gains were not significant and/or that negative results may not have been reported. To maximize the effectiveness of STEM bridge programs, we need to learn from both the successes and failures of these programs. The bridge community would greatly benefit from examples of programs that were not successful and ideas about why they may not have been successful. Further, there are two programs (LSU’s BIOS and UMBC’s Meyerhoff Scholars Program) that make up almost 50% of the peer-reviewed journal publications on STEM bridge programs. Although BIOS and the Meyerhoff Scholars Program are very different in their target populations (open to anyone compared with targeting high academic–ability URM students, respectively) and length of time (5 days compared with 6 weeks), there may be particular elements of these programs that do not generalize to other STEM bridge programs. Thus, we need additional examples in peer-reviewed journals of bridge programs that take place in different contexts with different students. We can learn a lot from the elements of programs that do not lead to positive results as well as from elements that are successful, especially if we know enough about the details of the programs. All of this information may help other bridge developers design their own programs.

    Recommendation 3: Report More Information about the Details of Implementing Bridge Programs

    When we surveyed the literature on bridge programs, we were struck by how little information was presented about how to run the day-to-day activities of a bridge program. How much do these programs cost? How are students recruited and selected into the program? What is the average size of the program, who staffs the program, and who develops the curriculum? A team developing a bridge program would benefit from more information about program logistics. While we recognize that there are constraints (e.g., word limits) for papers and concerns about relevance to the main points of a paper, we would encourage bridge developers to include this type of information in the supplemental material of a paper, on a linked website, or in a stand-alone curriculum description of the program. For more information about some of the details of the bridge programs in this review, please see individual program websites listed in Supplemental Table 2.

    Costs and Resources.

Costs involved in running bridge programs can include funds to support curriculum development, stipends/salaries for staff, housing/food for students, materials for the program, and transportation. We had difficulty finding information about the costs of programs, including overall program costs, costs per student, and where the funds to run a program come from, although some reports do provide specific information (e.g., Wischusen and Wischusen, 2007). Some programs may be funded by their institutions, and if an institution is interested in increasing retention rates, then the focus of the program may be on academically underprepared students who are predicted not to be retained through their first year of college. Alternatively, programs may be underwritten by national funding agencies, whose priorities may influence the demographics of students recruited to a bridge program. For example, funding from the National Science Foundation is awarded in part based on broader impacts, which often focuses recruitment on groups that are underrepresented or underserved in science. Some programs may depend on participants to fund the bridge program, although requiring students to pay to participate may limit the participation of students from low socioeconomic backgrounds, who are less likely to persist in college than their economically advantaged peers (Thayer, 2000).

    Bridge program developers would also benefit from learning about programs that use a combination of funding sources, especially if they receive initial funding from a national funding source and succeed in becoming self-sustaining. To our knowledge, there is little information about which programs have been able to become self-sustaining after national funding runs out. Some programs seem to extend their programming by combining national grant funding with private donor funding, while others have their institutions take on the financial responsibility for their programs. It is worth noting that different institutions have varying levels of institutional support for applying for grants or developing relationships with potential donors, so conclusions about how best to fund and sustain programs are likely specific to institution type. Nevertheless, bridge program developers would likely benefit from learning how current programs are sustained after initial national funding is depleted.

    Recruitment and Selection.

Recruiting students to bridge programs can be difficult because, by definition, prospective participants have not yet attended the 4-year institution, which complicates communication. More information about how programs recruit students would be useful to future bridge program developers (e.g., do they coordinate with admissions offices, or do they hire staff who are in charge of recruitment?). Additionally, if a program cannot accommodate all students who are eligible and willing to participate, then the bridge report should describe how participants are selected. This information can be especially important when interpreting results, because selecting for students with certain characteristics (e.g., high motivation or commitment to the major) may influence outcome measures (e.g., retention), and programs would need to control for these characteristics to determine whether the intervention itself had an impact on a particular outcome; a sketch of one such adjustment follows.
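To make this concrete, below is a minimal simulated sketch (in Python, using pandas and statsmodels) of how selection on a characteristic such as motivation can masquerade as a program effect, and how covariate adjustment can address it. All variable names and values are invented for illustration and do not come from any published bridge report.

```python
# Illustrative sketch only: how selection on student characteristics can
# masquerade as a program effect, and how covariate adjustment addresses it.
# All variables are simulated; none come from a published bridge report.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

hs_gpa = rng.uniform(2.0, 4.0, n)          # incoming high school GPA
motivation = rng.normal(3.5, 0.8, n)       # hypothetical motivation score

# Selection into the (simulated) bridge program favors motivated students.
bridge = (motivation + rng.normal(0, 0.5, n) > 3.8).astype(int)

# Simulated retention depends on GPA and motivation but NOT on the program,
# so any naive bridge-retention association here is pure selection.
p_retained = 1 / (1 + np.exp(-(-6 + 1.2 * hs_gpa + 0.8 * motivation)))
retained = (rng.random(n) < p_retained).astype(int)

df = pd.DataFrame({"retained": retained, "bridge": bridge,
                   "hs_gpa": hs_gpa, "motivation": motivation})

# Naive retention gap between participants and nonparticipants...
print(df.groupby("bridge")["retained"].mean())

# ...versus a logistic regression that adjusts for the selection variables;
# in this simulation the bridge coefficient should be near zero.
model = smf.logit("retained ~ bridge + hs_gpa + motivation", data=df).fit()
print(model.summary())
```

Regression adjustment is only one option; matched comparison groups or propensity-based approaches can serve the same purpose, and no adjustment can rescue a comparison when the key selection variables go unmeasured, which is why reports should describe how participants were selected.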

    Size.

From the published literature, it was often difficult to ascertain the size of programs, because sizes were not reported or because programs scaled up over time and reports did not break down enrollment by year. However, bridge program developers would benefit from learning about programs that support a similar number of students, because scaling up comes with its own set of challenges (Elias et al., 2003; Klingner et al., 2003).

    Curriculum Development.

    Bridge programs that have the same goals as a previously published bridge program may want to adapt previously developed curricular materials. If programs develop unique activities or exercises for students, especially ones that have been shown to influence student cognitive or affective gains, it would be helpful to share them by publishing in an outlet such as CourseSource, a website devoted to evidence-based teaching resources for biology education (CourseSource, 2017). The bridge community would benefit from creating and building upon resources that can be easily shared.

    Follow-up and Concurrent Interventions.

All of these bridge programs take place during the summer before students enter college. However, 40% of the programs have a follow-up component that stretches into students’ first semester, first year, or even their entire college careers (see Supplemental Table 2 for information about the follow-up for the 30 bridge programs). Thus, the length of summer bridge programs outlined in Figure 2 may be misleading if a program continues to support students for an additional 4 years. While some bridge programs reported this follow-up, the level of detail varies: sometimes the follow-up consists of a few social events spread over the first year of college; other times it is a weekly class with a set curriculum. Thus, the impact of these bridge programs may be the result of more than the summer experience. Further, institutions may implement multiple strategies to improve first-year student retention, including first-year mentoring programs (Rodger and Tremblay, 2003) or first-year learning communities (Hotchkiss et al., 2006), which can make it difficult to disaggregate how bridge programs themselves influence student success. Information about bridge follow-up and other concurrent interventions would be useful to include in bridge reports or in supplemental material, both to give bridge developers a greater level of information about what aspects of a program may have contributed to specific gains and to help inform decisions about how long to support bridge students.

    Recommendation 4: Align Bridge Goals and Measured Outcomes

From our analysis of the peer-reviewed published reports on STEM bridge programs, we have identified a misalignment between the stated goals of programs and the outcomes that are measured. In some peer-reviewed publications, programs state goals that they do not report measuring, while in others, programs report outcomes that were never explicitly stated as goals. It is possible that program evaluators measured a goal but did not report an unsuccessful outcome, that the data needed to measure a goal were not available at the time, or that the authors felt the data were not relevant to that particular publication. However, it is also possible that some programs did not align the assessment of their programs with their intended goals.

    This disconnect between stated goals and outcomes may indicate that some STEM bridge programs are not being backward designed. Backward design is a model for course or program development that recommends first outlining desired goals, then determining acceptable evidence, and finally planning experiences and instruction (Wiggins and McTighe, 1998). Backward design has primarily been recommended for lecture courses. Recently, we have made specific recommendations to use backward design when designing course-based undergraduate research experiences and independent research experiences because of disconnects between goals and measured outcomes in the literature on undergraduate research (Cooper et al., 2017c). It appears as though similar disconnects may be happening for bridge programs, and a backward design approach could help program developers ensure that the activities in their program are being designed to meet their larger program goals.

    Specifically, bridge developers may benefit from first selecting the goals for their programs. Theoretical frameworks, such as Tinto’s theory of college student departure, can guide developers in selecting short-term goals (e.g., social integration) that are theorized to lead to accomplishing long-term goals (e.g., increased retention in the major). Different institutions (e.g., community colleges compared with research universities) may have different challenges for incoming students and thus may want to consult the literature to identify what short-term goals may be most likely to lead to accomplishment of long-term goals given the unique characteristics of their institution.

Once goals are selected, bridge developers can decide what constitutes acceptable evidence that a goal has been met. This may take the form of a pre–post assessment design, wherein students’ progress toward a particular goal is measured at the beginning and end of the program, or a matched-pair design, wherein bridge student outcomes are compared with the outcomes of students who did not enroll in the bridge program or of students from a previous year when the bridge program did not exist (see the illustrative sketch after this paragraph). Bridge programs are often voluntary, so any comparison of participants with nonvolunteers should be viewed with caution, as there are likely motivational differences among these students (Brownell et al., 2013). Once bridge developers have selected goals and determined acceptable evidence, they can use the STEM education literature to plan student experiences and instruction that are most likely to lead to program goals. If bridge developers observe an unexpected but desirable result from their bridge program (e.g., Cooper et al., 2017a), then they could build these outcomes into future versions of the program and use backward design to assess which specific elements of the program led to that particular result. As such, backward design becomes a highly iterative process in which novel outcomes can lead to a revision of the goals for the next year’s bridge program.
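As a minimal illustration of these two designs (with invented survey scores, not data from any program in this review), a pre–post comparison and a comparison-group contrast for a single short-term goal, such as science self-efficacy, might be analyzed along the following lines.

```python
# Illustrative only: analyzing a pre-post assessment of one short-term goal
# (e.g., self-efficacy measured on a survey before and after the program).
# Scores below are invented; a real analysis needs a validated instrument
# and an adequate sample.
from scipy import stats

pre  = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.4]   # per-student pre-program scores
post = [3.6, 3.0, 3.9, 3.2, 3.5, 3.4, 3.1, 3.8]   # same students, post-program

# Paired test: did students shift on the measure over the program?
t_paired, p_paired = stats.ttest_rel(post, pre)

# Comparison-group test: bridge students vs. students who did not attend
# (remembering that volunteer bias can inflate such contrasts; Brownell
# et al., 2013).
comparison = [3.2, 2.9, 3.3, 3.0, 3.1, 2.8, 3.4, 3.0]
t_between, p_between = stats.ttest_ind(post, comparison)

print(f"pre-post:       t = {t_paired:.2f}, p = {p_paired:.3f}")
print(f"vs. comparison: t = {t_between:.2f}, p = {p_between:.3f}")
```

Even a simple analysis like this forces a program to define, up front, which instrument and which comparison will count as evidence for each goal, which is the heart of backward design.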

Backward design can also help bridge developers be more critical about how they assess the impact of their programs. A major theme from our literature review on STEM bridge programs is that outcomes are being measured in many different ways, some of which are not aligned with best practices and may not be the most direct or rigorous way to measure a specific outcome. While it is unsurprising that researchers in an emerging area use different approaches to assess bridge programs, discussions about what to consider when measuring certain outcomes, and about best practices for measuring them, would benefit those interested in improving bridge programs.

    CONCLUSIONS

    In this paper, we synthesized the literature on STEM bridge programs over the past 25 years. We found 46 total published reports on 30 unique bridge programs. While these programs report success in achieving certain goals and provide a foundation for bridge developers, there is much that still needs to be established about the impact of STEM bridge programs. We hope that this review will spark more research and conversations about the potential impact of bridge programs.

    ACKNOWLEDGMENTS

    We thank the developers of the bridge programs discussed in this article for providing an important foundation that future programs can build upon. We also thank the members of the ASU Biology Education Research Lab, especially Sai Tummala and Erin Shortlidge, for their contribution and feedback on earlier drafts of this paper.

    REFERENCES

  • American Association for the Advancement of Science (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved March 19, 2015, from http://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • Ami C. G. (2001). The effects of a four-week summer bridge program. Albuquerque: University of New Mexico.
  • Anderman L. H., Freeman T. (2004). Students’ sense of belonging in school. In Maehr M. L., Pintrich P. R. (Eds.), Advances in motivation and achievement: Vol. 13. Motivating students, improving schools: The legacy of Carol Midgley (pp. 27–63). Oxford, UK: Elsevier.
  • Anderson T., Shattuck J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16–25.
  • Bhattacharya B., Hansen D. E. (2015). Implementing a summer STEM bridge program. Peer Review, 17(2), 19–20.
  • Biernacki P., Waldorf D. (1981). Snowball sampling: Problems and techniques of chain referral sampling. Sociological Methods and Research, 10(2), 141–163. https://doi.org/10.1177/004912418101000205
  • Bong M., Skaalvik E. M. (2003). Academic self-concept and self-efficacy: How different are they really? Educational Psychology Review, 15(1), 1–40. http://dx.doi.org/10.1023/A:1021302408382
  • Boykin K., Raju D., Bonner J., Gleason J., Bowen L. (2010). Engineering math based bridge program for student preparation. In International Conference on Education, Training and Informatics: ICETI 2010, held April 6–9, 2010, in Orlando, FL (pp. 1–6).
  • Brownell S. E., Freeman S., Wenderoth M. P., Crowe A. J. (2014). BioCore guide: A tool for interpreting the core concepts of Vision and Change for biology majors. CBE—Life Sciences Education, 13(2), 200–211. http://dx.doi.org/10.1187/cbe.13-12-0233
  • Brownell S. E., Kloser M. J., Fukami T., Shavelson R. J. (2013). Context matters: Volunteer bias, small sample size, and the value of comparison groups in the assessment of research-based undergraduate introductory biology lab courses. Journal of Microbiology and Biology Education, 14(2), 176–182. http://dx.doi.org/10.1128/jmbe.v14i2.609
  • Bruno B. C., Wren J. L., Noa K., Wood-Charlson E. M., Ayau J., Soon S. L., Choy C. A. (2016). Summer bridge program establishes nascent pipeline to expand and diversify Hawai’i’s undergraduate geoscience enrollment. Oceanography, 29(2), 286–292. http://dx.doi.org/10.5670/oceanog.2016.33
  • Budny D. (1992). Mathematics bridge program. In Proceedings of the 1992 Frontiers in Education Twenty-Second Annual Conference, ASEE (pp. 48–52).
  • Cabrera A., Nora A., Castaneda M. (1993). College persistence: Structural equations modeling test of an integrated model of student retention. Journal of Higher Education, 64(2), 123–139. http://dx.doi.org/10.2307/2960026
  • Cairncross C., Jones S. A., Naegele Z., VanDeGrift T. (2015). Building a summer bridge program to increase retention and academic success for first-year engineering students. In American Society for Engineering Education 2015 Annual Conference, held June 14–17, 2015, in Seattle, WA (pp. 26.300.1–26.300.24).
  • Chen X. (2013). STEM attrition: College students’ paths into and out of STEM fields (Statistical Analysis Report NCES 2014-001). Washington, DC: National Center for Education Statistics.
  • Chevalier L., Chrisman B., Kelsey M. (2001). SUCCESS week: A freshmen orientation program at Southern Illinois University Carbondale College of Engineering. In International Conference on Engineering Education, held August 6–10, 2001, in Oslo, Norway (pp. 7E8-1–7E8-3).
  • Citty J. M., Lindner A. S. (2012). Dual model summer bridge programs: A new consideration for increasing retention rates. In American Society for Engineering Education 2012 Annual Conference, held June 10–13, 2012, in San Antonio, TX (pp. 25.480.1–25.480.13).
  • Cooper K. M., Ashley M., Brownell S. E. (2017a). A bridge to active learning: A summer bridge program helps students maximize their active-learning experiences and the active-learning experiences of others. CBE—Life Sciences Education, 16(1), ar17. http://dx.doi.org/10.1187/cbe.16-05-0161
  • Cooper K. M., Haney B., Krieg A., Brownell S. E. (2017b). What’s in a name? The importance of students perceiving that an instructor knows their names in a high-enrollment biology classroom. CBE—Life Sciences Education, 16(1), ar8. http://dx.doi.org/10.1187/cbe.16-08-0265
  • Cooper K. M., Soneral P., Brownell S. E. (2017c). Design your goals before you develop your CURE: A call to use backward design in planning course-based undergraduate research experiences. Journal of Microbiology and Biology Education, 18(2). http://dx.doi.org/10.1128/jmbe.v18i2.1287
  • CourseSource (2017). CourseSource home page. Retrieved February 13, 2017, from www.coursesource.org
  • Doerr H. M., Ärlebäck J. B., Staniec A. C. (2014). Design and effectiveness of modeling-based mathematics in a summer bridge program. Journal of Engineering Education, 103(1), 92–114. http://dx.doi.org/10.1002/jee.20037
  • Elias M. J., Zins J. E., Graczyk P. A., Weissberg R. P. (2003). Implementation, sustainability, and scaling up of social-emotional and academic innovations in public schools. School Psychology Review, 32(3), 303–319.
  • Fletcher S. L., Newell D. C., Anderson-Rowland M. R., Newton L. D. (2001a). The Women in Applied Science and Engineering Summer Bridge Program: Easing the transition for first-time female engineering students. In 31st Annual Frontiers in Education Conference, held October 10–13, 2001, in Reno, NV (Vol. 3).
  • Fletcher S. L., Newell D. C., Newton L. D., Anderson-Rowland M. R. (2001b). The WISE Summer Bridge Program: Assessing student attrition, retention, and program effectiveness. In American Society for Engineering Education 2001 Annual Conference, held June 24–27, 2001, in Albuquerque, NM (pp. 6.1053.1–6.1053.7).
  • Gerdes H., Mallinckrodt B. (1994). Emotional, social, and academic adjustment of college students: A longitudinal study of retention. Journal of Counseling and Development, 72(3), 281–288. http://dx.doi.org/10.1002/j.1556-6676.1994.tb00935.x
  • Gilmer T. C. (2007). An understanding of the improved grades, retention and graduation rates of STEM majors at the Academic Investment in Math and Science (AIMS) Program of Bowling Green State University (BGSU). Journal of STEM Education: Innovations and Research, 8(1/2), 11–21.
  • Gleason J., Boykin K., Johnson P., Bowen L., Whitaker K., Micu C., Slappy C. (2010). Integrated engineering math-based summer bridge program for student retention. Advances in Engineering Education, 2(2), 1–17.
  • Graham K. J., McIntee E. J., Armbrister P. M. (2013). NSF S-STEM scholarship and support mechanisms: A cohort-based summer bridge program in chemistry. Abstracts of Papers of the American Chemical Society, 245, 46.
  • Graham K. J., McIntee E. J., Raigoza A. F., Fazal M. A., Jakubowski H. V. (2017). Activities in an S-STEM program to catalyze early entry into research. Journal of Chemical Education, 94(2), 177–182. http://dx.doi.org/10.1021/acs.jchemed.6b00338
  • Grimm M. J. (2005). Work in progress—an engineering bridge program—the foundation for success for academically at-risk students. In 35th Annual Frontiers in Education Conference, held October 19–22, 2005, in Indianapolis, IN (pp. S2C-8–S2C-9).
  • Harkins M. (2016). Engineering boot camp: A broadly based online summer bridge program for engineering freshmen. In American Society for Engineering Education 2016 Annual Conference, held June 26–29, 2016, in New Orleans, LA.
  • Hotchkiss J. L., Moore R. E., Pitts M. M. (2006). Freshman learning communities, college performance, and retention. Education Economics, 14(2), 197–210.
  • Hrabowski F. A., Maton K. I. (1995). Enhancing the success of African-American students in the sciences: Freshmen year outcomes. School Science and Mathematics, 95(1), 18–27. http://dx.doi.org/10.1111/j.1949-8594.1995.tb15719.x
  • Hurtado S., Eagan K., Figueroa T., Hughes B. (2014). Reversing underrepresentation: The impact of undergraduate research programs on enrollment in STEM graduate programs. Los Angeles: Higher Education Research Institute, University of California–Los Angeles.
  • Intemann K. (2009). Why diversity matters: Understanding and applying the diversity component of the National Science Foundation’s broader impacts criterion. Social Epistemology, 23(3–4), 249–266. http://dx.doi.org/10.1080/02691720903364134
  • Kezar A. (2000). Summer bridge programs: Supporting all students (Report No. BBB32577). Washington, DC: George Washington University, Graduate School of Education and Human Development. Retrieved from ERIC (ED442421).
  • Klingner J. K., Ahwee S., Pilonieta P., Menendez R. (2003). Barriers and facilitators in scaling up research-based practices. Exceptional Children, 69(4), 411–429. https://doi.org/10.1177/001440290306900402
  • Kopec R. L., Blair D. A. (2014). Community for Achievement in Science, Academics, and Research: The CASAR Project. In 6th Annual First Year Engineering Experience Conference, held August 7–8, 2014, in College Station, TX (pp. F1A-1–F1A-3).
  • Kuh G. D., Hu S. (2001). The effects of student–faculty interaction in the 1990s. Review of Higher Education, 24(3), 309–332. http://dx.doi.org/10.1353/rhe.2001.0005
  • Lamport M. A. (1993). Student–faculty interaction and the effect on college student outcomes: A review of the literature. Adolescence, 28(112), 971–990.
  • Lenaburg L., Aguirre O., Goodchild F., Kuhn J. U. (2012). Expanding pathways: A summer bridge program for community college STEM students. Community College Journal of Research and Practice, 36(3), 153–168. http://dx.doi.org/10.1080/10668921003609210
  • Maton K. I., Beason T. S., Godsay S., Domingo M. R. S., Bailey T. C., Sun S., Hrabowski F. A. (2016). Outcomes and processes in the Meyerhoff Scholars Program: STEM PhD completion, sense of community, perceived program benefit, science identity, and research self-efficacy. CBE—Life Sciences Education, 15(3), ar48. http://dx.doi.org/10.1187/cbe.16-01-0062
  • Maton K. I., Domingo M. R. S., Stolle-McAllister K. E., Zimmerman J. L., Hrabowski F. A. (2009). Enhancing the number of African-Americans who pursue STEM PhDs: Meyerhoff Scholarship Program outcomes, processes, and individual predictors. Journal of Women and Minorities in Science and Engineering, 15(1), 15–37. http://dx.doi.org/10.1615/JWomenMinorScienEng.v15.i1.20
  • Maton K. I., Hrabowski F. A., Schmitt C. L. (2000). African American college students excelling in the sciences: College and postcollege outcomes in the Meyerhoff Scholars Program. Journal of Research in Science Teaching, 37(7), 629–654. http://dx.doi.org/10.1002/1098-2736(200009)37:7<629::AID-TEA2>3.0.CO;2-8
  • Maton K. I., Pollard S. A., McDougall Weise T. V., Hrabowski F. A. (2012). Meyerhoff Scholars Program: A strengths-based, institution-wide approach to increasing diversity in science, technology, engineering, and mathematics. Mount Sinai Journal of Medicine, 79(5), 610–623. http://dx.doi.org/10.1002/msj.21341
  • Moore L. J. (2013). Igniting an interest in chemistry: Expanding on renewable energy laboratories in a three-week summer bridge program. Abstracts of Papers of the American Chemical Society, 245, 744.
  • Moore L. J., Fasano C. G., Downing W. (2016). Developing a culture of undergraduate research with incoming students. CUR Quarterly, 36(4), 45. http://dx.doi.org/10.18833/curq/36/4/13
  • National Academy of Sciences, National Academy of Engineering, and Institute of Medicine (2010). Rising above the gathering storm, revisited: Rapidly approaching category 5. Washington, DC: National Academies Press. https://doi.org/10.17226/12999
  • National Science Foundation (2017). Women, minorities, and persons with disabilities in science and engineering: 2017. Arlington, VA: National Center for Science and Engineering Statistics.
  • Nite S. B., Capraro R. M., Capraro M. M., Allen G. D., Pilant M., Morgan J. (2015). A bridge to engineering: A personalized precalculus (bridge) program. In 45th Annual Frontiers in Education Conference, held October 21–24, 2015, in El Paso, TX (pp. 2053–2058).
  • Pascarella E. T. (1980). Student–faculty informal contact and college outcomes. Review of Educational Research, 50(4), 545–595. https://doi.org/10.3102/00346543050004545
  • Persaud A., Freeman A. L. (2005). Creating a successful model for minority students’ success in engineering: The PREF Summer Bridge Program. In 2005 Women in Engineering ProActive Network/National Association of Multicultural Engineering Program Advocates Joint Conference, held April 10–13, 2005, in Las Vegas, NV (pp. 1–7).
  • Pintrich P. R., De Groot E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33–40. http://dx.doi.org/10.1037/0022-0663.82.1.33
  • President’s Council of Advisors on Science and Technology (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC.
  • Pritchard T. J., Perazzo J. D., Holt J. A., Fishback B. P., McLaughlin M., Bankston K. D., Glazer G. (2016). Evaluation of a summer bridge: Critical component of the Leadership 2.0 Program. Journal of Nursing Education, 55(4), 196–202. http://dx.doi.org/10.3928/01484834-20160316-03
  • Raines J. M. (2012). FirstSTEP: A preliminary review of the effects of a summer bridge program on pre-college STEM majors. Journal of STEM Education: Innovations and Research, 13(1), 22–29.
  • Reisel J. R., Jablonski M., Hosseini H., Munson E. (2012). Assessment of factors impacting success for incoming college engineering students in a summer bridge program. International Journal of Mathematical Education in Science and Technology, 43(4), 421–433. http://dx.doi.org/10.1080/0020739X.2011.618560
  • Reyes M. A., Anderson-Rowland M. R., McCartney M. A. (1998). Freshman introductory engineering seminar course: Coupled with bridge program equals academic success and retention. In 28th Annual Frontiers in Education Conference, held November 4–7, 1998, in Tempe, AZ (Vol. 1, pp. 505–510).
  • Reyes M. A., Gotes M. A., McNeill B., Anderson-Rowland M. R. (1999). MEP Summer Bridge Program: A model curriculum project. In American Society for Engineering Education 1999 Annual Conference, held June 20–23, 1999, in Charlotte, NC (pp. 4.380.1–4.380.8).
  • Rodger S., Tremblay P. K. (2003). The effects of a peer mentoring program on academic success among first year university students. Canadian Journal of Higher Education, 33(3), 1–18.
  • Rothstein H. R., Sutton A. J., Borenstein M. (2005). Publication bias in meta-analysis: Prevention, assessment and adjustments. West Sussex, UK: Wiley.
  • Russell S. H. (2006a). Evaluation of NSF support for undergraduate research opportunities: Synthesis report. Arlington, VA: National Science Foundation.
  • Russell S. H. (2006b). Evaluation of NSF support for undergraduate research opportunities: Follow-up survey of undergraduate NSF program participants: Draft final report. Arlington, VA: National Science Foundation.
  • Russomanno D. J., Best R., Ivey S., Haddock J. R., Franceschetti D., Hairston R. J. (2010). MemphiSTEP: A STEM talent expansion program at the University of Memphis. Journal of STEM Education: Innovations and Research, 11(1), 69–81.
  • Sablan J. R. (2014). The challenge of summer bridge programs. American Behavioral Scientist, 58(8), 1035–1050. https://doi.org/10.1177/0002764213515234
  • Schunk D. H., Pajares F. (2009). Self-efficacy theory. In Wentzel K. R., Wigfield A. (Eds.), Handbook of motivation at school. New York: Routledge.
  • Seymour E., Hewitt N. (1997). Talking about leaving: Why undergraduates leave the sciences. Boulder, CO: Westview.
  • Shields N., Grodsky H. R., Darby W. P. (1996). Access to engineering: A description and an evaluation of a pre-collegiate program for minorities and women. In American Society for Engineering Education 1996 Annual Conference, held June 23–26, 1996, in Washington, DC (pp. 1.52.1–1.52.16).
  • Snow S. G. (1973). Correlates of faculty–student interaction. Sociology of Education, 46(4), 489–498. http://dx.doi.org/10.2307/2111902
  • Stemler S. (2004). A comparison of consensus, consistency, and measurement approaches to estimating interrater reliability. Practical Assessment, Research, and Evaluation, 9(4), 1–11.
  • Stolle-McAllister K. (2011). The case for summer bridge: Building social and cultural capital for talented Black STEM students. Science Educator, 20(2), 12–22.
  • Stolle-McAllister K., Domingo M. R. S., Carrillo A. (2011). The Meyerhoff way: How the Meyerhoff Scholarship Program helps Black students succeed in the sciences. Journal of Science Education and Technology, 20(1), 5–16. http://dx.doi.org/10.1007/s10956-010-9228-5
  • Summers M. F., Hrabowski F. A. (2006). Preparing minority scientists and engineers. Science, 311(5769), 1870–1871. http://dx.doi.org/10.1126/science.1125257
  • Thayer P. B. (2000). Retention of students from first generation and low income backgrounds. Opportunity Outlook, 3(1), 2–8.
  • Thompson M. K., Consi T. R. (2007). Engineering outreach through college pre-orientation programs: MIT Discover Engineering. Journal of STEM Education: Innovations and Research, 8(3/4), 75–82.
  • Tinto V. (1975). Dropouts from higher education: A theoretical synthesis of recent literature. Review of Educational Research, 45(1), 89–125. https://doi.org/10.3102/00346543045001089
  • Tinto V. (1987). Leaving college: Rethinking the causes and cures of student attrition. Chicago, IL: University of Chicago Press.
  • Tomasko D. L., Ridgway J. S., Olesik S. V., Waller R. J., McGee M. M., Barclay L. A., Upton J. (2013). Impact of summer bridge programs on STEM retention at the Ohio State University. In Proceedings of the 2013 ASEE North-Central Section Conference, held April 5–6, 2013, in Columbus, OH (pp. 1–13).
  • Tomasko D. L., Ridgway J. S., Waller R. J., Olesik S. V. (2016). Association of summer bridge program outcomes with STEM retention of targeted demographic groups. Journal of College Science Teaching, 45(4), 90–99.
  • Trujillo G., Tanner K. D. (2014). Considering the role of affect in learning: Monitoring students’ self-efficacy, sense of belonging, and science identity. CBE—Life Sciences Education, 13(1), 6–15. http://dx.doi.org/10.1187/cbe.13-12-0241
  • Wheatland J. A. (2000). The relationship between attendance at a summer bridge program and academic performance and retention status of first-time freshmen science, engineering, and mathematics students at Morgan State University, an historically Black university (Doctoral dissertation). Baltimore, MD: Morgan State University.
  • Wheeler E. R., Wischusen S. M. (2014). Developing self-regulation and self-efficacy: A cognitive mechanism for success of biology boot camps. Electronic Journal of Science Education, 18(1), 1–16.
  • Wiggins G., McTighe J. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.
  • Windsor A., Russomanno D. J., Bargagliotti A., Best R., Franceschetti D., Haddock J., Ivey S. (2015). Increasing retention in STEM: Results from a STEM talent expansion program at the University of Memphis. Journal of STEM Education: Innovations and Research, 16(2), 11–19.
  • Wischusen S. M., Wischusen E. W. (2007). Biology Intensive Orientation for Students (BIOS): A biology “boot camp.” CBE—Life Sciences Education, 6(2), 172–178. http://dx.doi.org/10.1187/cbe.06-08-0184
  • Wischusen S. M., Wischusen E. W., Pomarico S. M. (2011). Impact of a short pre-freshman program on retention. Journal of College Student Retention: Research, Theory and Practice, 12(4), 429–441. http://dx.doi.org/10.2190/CS.12.4.c