
General Essays and Articles

Factors Predicting the Extent to which STEM Students Value Cross-Disciplinary Skills: A Study across Four Institutions

    Published Online: https://doi.org/10.1187/cbe.22-06-0101

    Abstract

    Expectancy-value theory of motivation (EVT) suggests that students’ values influence their likelihood of putting in the effort required to learn, and that these values can be shaped by student characteristics, such as their experiences, sociodemographics, and disciplinary norms. To understand the extent to which these characteristics relate to students’ values, we surveyed 1162 graduating science, technology, engineering, and mathematics (STEM) students across four universities using the previously developed and validated Survey of Teaching Beliefs and Practices for Undergraduates (STEP-U). The STEP-U survey included Likert-scale questions to capture students’ values of 27 cross-disciplinary skills and the frequency with which they experienced 27 instructional methods thought to develop particular skills. Exploratory factor analyses (EFA) yielded an interpretable factor structure for both students’ perceived value of cross-disciplinary skills and the frequency of their classroom experiences. Using multiple regression, we identified differences in values that were associated with classroom experiences, STEM discipline, participation in undergraduate research, and student sociodemographics. Findings were generalizable across institutions and disciplines. The theoretical framework (EVT), the broad data collection (four institutions and multiple disciplines), and the types of data analyses used (e.g., EFA) provide theoretical, methodological, and practical contributions and suggest directions for future research.

    INTRODUCTION

    Within the last decade, there have been sustained efforts to grow undergraduate enrollment in science, technology, engineering, and mathematics (STEM) disciplines (President’s Council of Advisors on Science and Technology, 2012). However, there are concerns about graduates’ preparedness for the workforce (Hart Research Associates, 2015; McGunagle and Zizka, 2020). As such, more research is required across STEM disciplines to broaden what and how we think about student outcomes as they relate to postgraduate student success. Student success within undergraduate STEM programs and in postgraduate careers relies not only on gaining content knowledge but also on a variety of skills such as communication, critical thinking, and lab techniques (American Association for the Advancement of Science, 2011; American Chemical Society, 2015; Heron and McNeil, 2016). For the purposes of this study, we use the term “cross-disciplinary skills” to describe a set of skills that includes those deemed most important across students, faculty, and employers, as described in the Literature Review. From a motivational perspective, it is vital to understand the degree to which students value these cross-disciplinary skills, as student values influence their likelihood of dedicating the necessary effort toward acquiring these skills (Wigfield and Eccles, 2000).

    Previous work has explored the extent to which students value a variety of skills (e.g., Demaria et al., 2018; Marbach-Ad et al., 2019); however, there are two gaps in this research. First, although a body of research suggests that students’ values of cross-disciplinary skills are shaped by the experiences they have before graduation (e.g., Gilmore et al., 2015; Demaria et al., 2018; McGunagle and Zizka, 2020; Lavi et al., 2021), these studies are situated at single institutions, so it is unclear whether the findings are generalizable beyond their original contexts. There has also been little work teasing apart the distinct impacts of these experiences (i.e., research experiences, classroom experiences, and disciplinary affiliation) on student values. Second, while conceptual models have been developed and cross-disciplinary skills have been ranked in importance, there is currently limited empirical research on how these skills can be meaningfully organized. The lack of an overarching organizational framework for skills important to STEM majors limits researchers’ ability to synthesize, compare, and extend previous work in this area.

    In the present study, we use a large, cross-institutional and cross-disciplinary sample to measure the extent to which graduating undergraduate STEM students value a variety of cross-disciplinary skills and identify an organizational structure for these skills using exploratory factor analysis (EFA). Based on this factor structure, we explore the influences of student research experience, experience with specific classroom instructional practices, and disciplinary affiliation on the extent to which students value skills important for the workplace.

    LITERATURE REVIEW

    In this review, we first describe the characteristics of motivation theory and its relationship to cross-disciplinary skills. We then provide an overview of categorizations of cross-disciplinary skills and values important in STEM disciplines and of the research describing factors that may relate to students’ skill development. Finally, we bring these ideas together to present a conceptual framework for understanding cross-disciplinary skill-based outcomes for undergraduate STEM majors.

    Motivation and Cross-Disciplinary Skills-Based Outcomes

    If effort is required for learning then it follows that motivation is also required, because students will not make that effort unless they are motivated to do so. (Palmer, 2005, p. 1855)

    To understand the importance of students’ values for their skill development, we use the expectancy-value theory (EVT) framework of motivation (Wigfield and Eccles, 2000), defined as “individuals’ choice, persistence, and performance [which] can be explained by their beliefs about how well they will do on the activity and the extent to which they value the activity” (Wigfield and Eccles, 2000, p. 68). There are two main components of motivation in the EVT framework: expectancy and value. Expectancy beliefs are defined as beliefs about one’s ability to complete a task and include self-beliefs such as ability beliefs and self-efficacy.1 Value is defined as one’s belief in the importance of a task.

    A plethora of research has sought to understand the importance of these motivational constructs for student performance (e.g., Van Dinther et al., 2011; Cerasoli et al., 2014; Ferrell et al., 2016). These studies suggest that student interest, also called intrinsic motivation, is the most important motivational characteristic predicting student performance. Other studies have identified intervention strategies for improving motivation (e.g., Hulleman et al., 2010; Curry et al., 2020), which include emphasizing the value of various tasks or allowing students to discover that value for themselves. While the findings are complex and nuanced, these studies suggest that instructional practices that prompt students to seek value in learning, as opposed to the instructor telling students the value of the learning, enhance its value. Other work suggests that motivation and its impact on learning may differ based on student sociodemographic characteristics such as gender and race (e.g., MacPhee et al., 2013; Roksa and Whitley, 2018). These studies suggest that women and students who are historically underrepresented in higher education are differentially affected by motivational constructs, to the detriment of their performance and persistence. In other words, historically underrepresented students and female students may benefit less from being motivated than their majority counterparts.

    Together, the research on motivation suggests that: 1) the extent to which students value learning is the most important component of motivation to promote learning, 2) opportunities for students to discover why learning something is important can improve the value of learning for them, and 3) other variables may influence students’ motivation for learning. In the context of the present study, we seek to measure undergraduate STEM students’ value of cross-disciplinary skills and the characteristics that may relate to these values, such as course and research experiences, as well as sociodemographic characteristics. We do not presume to identify the different types of values (e.g., cost, intrinsic) students have toward these skills. Rather, we are using EVT to demonstrate the importance of these values for learning. In the following section, we review the research on skill-based outcomes, values, and characteristics that are associated with these skills.

    Cross-Disciplinary Skills-Based Outcomes for Undergraduate STEM Majors

    Previous literature has identified a plethora of skills important for STEM undergraduate students to develop and has sought to organize them conceptually around various constructs. The broadest conceptualization of these skills is 21st-century skills, defined as “a broad set of knowledge, skills, work habits, and character traits that are vital to the success in the future world” (McGunagle and Zizka, 2020, p. 592). Twenty-first-century skills were originally organized by Binkley and colleagues (2012) into four main categories: ways of thinking, ways of working, tools for working, and living in the world (p. 18–19). Ways of thinking include skills such as creativity, critical thinking, innovation, and problem solving. Ways of working are characterized by skills related to communication and collaboration. Tools for working focus on information literacy and evaluating sources and evidence. Finally, living in the world includes skills related to personal and social responsibility.

    Others have defined skills more narrowly based on what is needed for STEM employment, called employability skills or workforce skills. For example, Siekmann (2016) organized employability skills into technical skills (e.g., STEM and non-STEM skills), cognitive skills (e.g., creativity, critical thinking), and socioemotional skills (e.g., curiosity, empathy). Rayner and Papakonstantinou (2015) organized employability skills into vocational skills (e.g., ability to acquire and apply disciplinary knowledge), generic skills (e.g., problem solving, critical thinking, communication), and interpersonal skills (e.g., teamwork, confidence, ethics). Viskupic and colleagues (2021) organized workforce skills into data skills (e.g., apply knowledge, analyze data, evaluate and interpret data), disciplinary skills (e.g., spatial thinking, temporal thinking), communication skills (e.g., teamwork, oral communication), and systems thinking (e.g., build models, describe and analyze systems).

    Cross-disciplinary skills are similar in scope to employability skills, and the two terms are often used synonymously. Marbach-Ad et al. (2016) organized cross-disciplinary skills into two main groups: retention skills (e.g., skills for acquiring facts, memorization) and transfer skills (e.g., applying knowledge, problem solving, critical thinking). Even more narrow than either employability or cross-disciplinary skills are STEM research–based skills, which can be organized into communication skills (e.g., written and oral communication), literature skills (e.g., identify and read research articles), data skills (e.g., collect and interpret data), and general lab skills (e.g., conduct experimental procedures; Adedokun et al., 2013). In the only study to organize skills empirically, Lavi and colleagues (2021) used EFA to group 14 different 21st-century skills into three categories: domain-general skills (e.g., critical thinking), soft skills (e.g., collaboration, communication), and STEM-specific skills (e.g., experimentation, systems thinking).

    These studies demonstrate that there is no consensus on how to categorize the different skills that undergraduate students need to acquire, and while there are many overlaps among the conceptually organized skills, there are also groupings that appear to differ between studies (Table 1). For example, the characteristics that define Binkley et al.’s (2012) ways of thinking parallel the characteristics of Siekmann’s (2016) cognitive skills; however, Rayner and Papakonstantinou’s (2015) generic skills are a mix of these ways of thinking/cognitive skills and of communication skills like those of Adedokun et al. (2013). Thus, there is no common approach to defining or organizing these various skills. This lack of commonality across studies makes it challenging to synthesize the literature and fully understand what and how these skills are important for undergraduate students’ postgraduate success.

    TABLE 1. Overview of various terms and organizing frameworks for skills-based undergraduate outcomes

    Authors | Skills term | Organizing frameworks
    Binkley et al. (2012, pp. 18–19) | 21st-century skills

    Ways of thinking
    • Creativity and innovation

    • Critical thinking, problem solving, decision making

    • Learning to learn, metacognition

    Ways of working
    • Communication

    • Collaboration (teamwork)

    Tools for working
    • Information literacy

    • ICT literacy

    Living in the world
    • Citizenship—local and global

    • Life and career

    • Personal and social responsibility—including cultural awareness and competence

    Heron and McNeil (2016, pp. 20–21) | 21st-century skills

    Scientific and technical skills
    • Solve complex, ambiguous problems in real-world contexts

    • Show how results obtained relate to the original problem

    • Instrumentation competency

    • Software competency

    • Coding competency

    • Data analytics competency

    Communication skills
    • Communicate with different audiences, understand each audience’s needs, and make communication relevant and maximally impactful

    • Obtain information and evaluate its accuracy and relevance

    • Articulate one's understanding and persuasively communicate the worth to others

    • Organize and communicate ideas in multiple ways

    • Teach a complex idea to others, use feedback to evaluate learning, and develop revised strategies

    Professional and workplace skills
    • Work collegially and collaboratively in diverse teams

    • Identify what must be understood and learn it

    • Generate new ideas

    • Obtain knowledge about existing technology resources

    • Demonstrate familiarity with basic workplace concepts

    • Disciplinary awareness of career opportunities

    • Awareness of practices for résumés and interviews

    • Critical and professional life skills

    [Physics-specific] knowledge
    • Apply fundamental, cross-cutting themes and basic laws of physics

    • Represent basic concepts in multiple ways

    • Solve problems within and across disciplines

    • Knowledge of how basic concepts can be used to solve applied problems

    Lavi et al. (2021, p. 4) | 21st-century skills

    Domain-general skills
    • Complex problem-solving

    • Critical thinking

    • Individual learning

    • Question posing

    Soft skills
    • Written and oral communication

    • Intercultural communication

    • Creativity

    • Collaboration

    • Entrepreneurship

    STEM-specific skills
    • Engineering design

    • Experimenting and testing

    • STEM knowledge application

    • Systems thinking

    Siekmann (2016) | Employability skills

    Technical skills
    • Coding

    • Design

    Cognitive skills
    • Creativity

    • Critical thinking

    Socio-emotional skills
    • Curiosity

    • Empathy

    • Resilience

    Foundational literacies
    • Numeracy

    Rayner and Papakonstantinou (2015, p. 103) | Employability skills

    Vocational skills
    • Discipline-related knowledge

    • Apply discipline knowledge

    • Develop discipline knowledge

    Generic skills
    • Problem solving

    • Critical thinking

    • Written and oral communication

    • Numeracy and quantitative skills

    Interpersonal skills
    • Personal planning, organization

    • Teamwork

    • Ethics

    • Flexibility and adaptability

    • Self-confidence and independence

    Viskupic et al. (2021, p. 31) | Workforce skills

    Data skills
    • Reasoning and synthesis

    • Applying skills in new scenarios

    • Quantitative skills

    • Manage uncertainty

    • Evaluation of data quality

    [Geoscience] skills
    • Temporal thinking

    • Spatial thinking

    • Field skills

    Communication
    • Work as part of a team

    • Written and oral communication

    • Evaluation of literature

    Systems thinking
    • Systems thinking

    Societal relevance
    • Societal relevance

    Marbach-Ad et al. (2016, p. 5) | Cross-disciplinary skills

    Retention skills
    • Memorize some basic facts

    • Remember formulas, structures, procedures

    Transfer skills
    • Work in groups

    • Scientific writing

    • Acquire major scientific concepts

    • Learn basic sets of lab skills

    • Understand the dynamic nature of science

    • Understand how science applies to everyday life

    • Apply quantitative reasoning

    • Problem solving

    • Develop information literacy

    • Develop creativity and innovation

    • Develop understanding of interdisciplinary nature of science

    • Decision making based on evidence

    Adedokun et al. (2013, p. 946) | STEM research skills

    Communication skills
    • Writing results of experiments

    • Documenting research procedures

    • Orally communicating research findings

    • Writing a paper for publication

    • Preparing a research poster

    Literature-based skills
    • Conduct literature searches

    • Literature reviews

    • Understand a journal article

    Data skills
    • Observe and collect data

    • Organize and enter data into spreadsheets

    • Conduct statistical analysis using computer software

    • Interpret data

    • Relate results to “the bigger picture” in their field

    General lab skills
    • Following research/experimental procedures

    • Working independently on research projects

    In addition to various approaches for conceptually organizing these skills, previous studies have sought to empirically rank order skills important for undergraduate STEM majors from the perspectives of students, faculty, and employers. For example, Demaria and colleagues (2018) asked 197 senior biomedical science majors to rank three employability skills they perceived as most important to their future career. Students perceived communication skills as most important, followed by critical thinking and teamwork. Further, students perceived these skills as significantly more important to their future employment than disciplinary content knowledge and disciplinary skills. In a study of 145 STEM faculty and 2345 graduating STEM students’ perceptions of the importance of 10 cross-disciplinary skills, Marbach-Ad and colleagues (2019) found that both faculty and students held similar perceptions. The most important skills for both groups were the ability to problem solve, apply quantitative reasoning, acquire major concepts, and make decisions based on evidence. The least important skills were being able to memorize facts and the ability to work in groups.

    The literature on STEM employers uses similar methods for identifying the most desired skills. For example, McGunagle and Zizka (2020) asked 250 manufacturing employers to rank 16 employability skills that were most important to STEM undergraduate success in the workplace. The top four ranked skills were the ability to be a team player, self-motivation, verbal communication, and problem solving. In a study of 118 STEM employers, Rayner and Papakonstantinou (2015) asked participants to rank a set of 10 employability skills and found the four most important were the ability to apply relevant knowledge, develop relevant knowledge, problem solve, and think critically. Sarkar and colleagues (2016) surveyed 167 recent STEM graduates and 53 employers and asked them to rate the importance of 20 different employability skills. Recent STEM graduates identified communication skills, information retrieval, ability to independently learn, and time management as most important. In contrast, employers in the study identified adaptability, problem solving, critical thinking, being self-driven, and teamwork as the most important skills.

    While previous studies suggest that employer, faculty, and student perceptions about cross-disciplinary skills are sometimes divergent (e.g., Imafuku et al., 2018), there is consensus on the importance of some skills. For example, students and faculty identified knowledge acquisition and applying quantitative reasoning as the two most important skills (Marbach-Ad et al., 2019), and employers perceived that development and application of knowledge were skills vital for student success in the workplace (Rayner and Papakonstantinou, 2015). Across studies, there appear to be six common skills universally perceived as important: knowledge acquisition, knowledge application, problem solving, critical thinking/decision making, communication, and collaboration/teamwork (Table 2). These commonly valued skills cut across the various frameworks, encompassing both general skills and research-based skills that span disciplines.

    TABLE 2. Relative importance of six common skills identified in the literature^a

    Participant type | Study | Knowledge acquisition | Knowledge application | Problem solving | Critical thinking/decision making | Communication | Collaboration/teamwork
    Student | Demaria et al. (2018) | — | — | — | 2 | 1 | 3
    Student | Marbach-Ad et al. (2019)^b | 3 | 2 | 1 | 4 | — | —
    Student | Sarkar et al. (2016)^b | 2 | — | 5 | 6 | 1 | —
    Faculty | Marbach-Ad et al. (2019)^b | 4 | 2 | 1 | 3 | — | —
    Employer | McGunagle and Zizka (2020) | — | — | 4 | 6 | 3 (oral), 9 (written) | 1
    Employer | Rayner and Papakonstantinou (2015) | 2 | 1 | 3 | 4 | 5 | 8
    Employer | Sarkar et al. (2016) | — | — | 2 | 3 | — | 5

    ^a Values in the table represent the order of importance for these particular skills relative to other skills on the survey. A missing value (—) indicates that another skill not included in the table held that ranking.

    ^b Authors of these studies provided percentages of participants who identified the skills as important, and these percentages were used to derive the rankings.

    Despite these various conceptual frameworks and rankings of the importance of a plethora of skills, there is no consensus on the terms used to describe these skills, nor is there an empirically identified framework for organizing them. Further, only a few studies, to our knowledge, have attempted to understand the relationships between the various skills (Marbach-Ad et al., 2019; Koçak and Göksu, 2020; Lavi et al., 2021). One goal of this study is to use an adapted version of a previously validated survey, the Survey of Teaching Beliefs and Practices for Undergraduates (STEP-U; Marbach-Ad et al., 2016), to measure students’ perceived importance of a set of 27 cross-disciplinary skills across four different institutions. Gathering data across the institutions provides us with a sample size sufficient to conduct a factor analysis to understand the ways in which these skills can be grouped and allows us to evaluate the generalizability of our findings.

    Characteristics Related to Students’ Values and Development of Cross-Disciplinary Skills

    It is important to consider not only the types of cross-disciplinary skills necessary for students’ postgraduation success, but also the characteristics that may be associated with students’ ability to develop and attain these skills. Previous research suggests there are a variety of contextual influences on students’ development and valuing of cross-disciplinary skills, including research experience, classroom instruction, and disciplinary context. Further, students’ sociodemographic characteristics, such as gender and race, may also affect the extent to which they value cross-disciplinary skills. In the following sections, we review the literature on these various characteristics and identify the limitations of the current work.

    Research Experience

    Undergraduate research experiences (UREs) have been widely studied to understand their impact on students (e.g., Gilmore et al., 2015; Linn et al., 2015). Within this body of literature are studies demonstrating differences in students’ acquisition and value of cross-disciplinary skills by URE experience. In a study examining first-year STEM graduate students’ measured research skills, Gilmore and colleagues (2015) found that students with UREs performed significantly better on almost all research skills than those without UREs. Using propensity-score matching, Carter et al. (2016) found that undergraduate engineering students with UREs tended to report stronger teamwork, communication, and leadership skills than those without UREs. Even when controlling for institutional, curricular, and demographic variables, URE students still reported stronger communication skills than non-URE students.

    While there are many studies like these two that document the importance of UREs for student skill development, very few studies seek to understand the association between UREs and the extent to which students value these skills. In fact, only Marbach-Ad and colleagues’ (2016, 2019) studies explore the association between research experience and values. They found that prior research experience was one of the most important predictors of values pertaining to knowledge acquisition, evidence-based decision making, quantitative reasoning, and scientific writing skills. Given that values are a critical determinant of student persistence and motivation (Wigfield and Eccles, 2000), the present study aims to address this gap in the literature to better understand the relationship between research experience and values of cross-disciplinary skills that are organized in an empirically developed framework.

    Classroom Instruction

    Instructional practices can also influence the ways in which students value and learn cross-disciplinary skills. Demaria and colleagues (2018) assessed students’ perceptions of employability skills in relation to their experience in a biomedical science capstone course that integrated active learning and skills-based activities/assessments. Open-ended responses from students most often indicated that group oral presentations and group assignments were important activities for their development of communication, teamwork, and critical-thinking skills. Marbach-Ad and colleagues (2016) found that exposure to specific cross-disciplinary skills in the classroom was a significant predictor of the extent to which graduating STEM students valued these skills. For example, students who more frequently reported experience with inquiry-based instruction valued problem-solving more highly. In a subsequent study, Marbach-Ad and colleagues’ (2019) interview data suggested that the experiences students had in courses could have influenced their perceived value of a cross-disciplinary skill. For example, a student who highly valued application of science to everyday life attributed this to their experience with real-world examples in an anatomy course, while a student who placed little importance on collaboration attributed this to their negative experience with group work assignments.

    In a study of 827 Israeli graduate and undergraduate STEM students’ perceptions of skills and experiences in courses, Lavi and colleagues (2021) found that course assignments, projects, and research were most often associated with helping students develop most skills. Sarkar et al. (2016) sought to assess postgraduates’ perceptions of useful skills and the extent to which they perceived their undergraduate degree programs as helping them develop these skills. They found that students felt more “overdeveloped” (i.e., less value and more preparation) in knowledge acquisition and application skills and more “underdeveloped” (i.e., more value and less preparation) in skills related to communication, collaboration, and problem solving. Using national survey data and Bayesian statistics, Viskupic and colleagues (2021) mapped a set of workforce skills, described in Table 1, onto various courses in geoscience curricula. The authors found that the three most highly reported skills geoscience major students were exposed to in their courses included data skills (e.g., making inferences from observations), disciplinary skills (e.g., applying disciplinary knowledge), and communication skills (e.g., working in teams). The least frequently experienced skills for these students were systems thinking and societal relevance.

    In combination, these studies suggest that the types of instructional practices that students experience in their undergraduate courses and the frequency with which they experience these practices may influence their values and development of cross-disciplinary skills. While informative, these studies provide qualitative or descriptive information about these relationships or provide inferential statistics for individual skills and experiences. There is clearly more work needed to understand the relationships between a variety of instructional practices and students’ values of cross-disciplinary skills across institutions, which the present study aims to address.

    Disciplinary Norms and Practices

    Disciplinary contexts and norms can also play an important role in students’ values and development of cross-disciplinary skills. McGunagle and Zizka (2020) found that the importance of 16 cross-disciplinary skills, as rank-ordered by STEM employers, varied among STEM disciplines. The top three ranked skills for aerospace and defense employers were adaptability, problem solving, and ability to gather data, while for manufacturing employers, the most important skills were being a team player, self-motivation, and verbal communication. Marbach-Ad and colleagues (2019) similarly found disciplinary differences in student values for several skills (e.g., applying quantitative reasoning, acquiring major scientific concepts, scientific writing, working in groups). Based on anecdotal evidence from interviews, these differences could be attributed to the more quantitative nature of certain disciplines, different classroom experiences in courses for the major, and other aspects of the nature of the discipline. They observed no disciplinary differences in student values for some skills, such as problem solving, evidence-based decision making, and creativity.

    While there are more studies exploring disciplinary differences of students’ experiences in general (e.g., Pike and Killian, 2001; Rainey et al., 2018), the two studies described previously illustrate that different values of cross-disciplinary skills exist across disciplines. These findings align with the research on organizational change arguing that STEM departmental cultures and disciplinary norms should be accounted for (Reinholz et al., 2019). However, Reinholz and colleagues (2019) also acknowledge that the institutional context plays a role in how departments at the local level implement and institute teaching and learning. Thus, what is missing from the current research on disciplinary differences in student values of cross-disciplinary skills is an understanding of the relative contributions of local contextual factors and disciplinary norms. By studying student perceptions across STEM disciplines and institutions, the present study aims to address this gap and provide a survey tool that can be used in future studies at other institutions.

    Student Sociodemographic Characteristics

    A plethora of recent research examines the importance of race and gender for students’ self-efficacy, belonging, performance, and persistence in undergraduate STEM (e.g., Gayles and Ampaw, 2014; MacPhee et al., 2013; Rainey et al., 2018; Witherspoon and Schunn, 2019). Despite the demonstrated inequities that exist between students of differing gender and race, only a few studies that explore students’ values and acquisition of cross-disciplinary skills account for demographic variables. For example, researchers have controlled for students’ gender and/or race when exploring students’ values of cross-disciplinary skills (Adedokun et al., 2013; Marbach-Ad et al., 2019). Only one study, to our knowledge, identifies whether students’ cross-disciplinary skill values differ based on sociodemographic characteristics. Using regression modeling, Marbach-Ad and colleagues (2016) found that only gender was a significant predictor of students’ values, and only for one of the five cross-disciplinary skills explored: application of science to everyday life. Race (categorized as a binary variable: underrepresented minority [URM] and non-URM) was not a significant predictor for any of the cross-disciplinary skill values. The lack of research on the role of students’ sociodemographic characteristics in their values of cross-disciplinary skills is a limitation of the current literature and warrants further exploration.

    In summary, there are various important characteristics that should be considered when understanding students’ values and acquisition of cross-disciplinary skills (Figure 1). Students’ values influence the effort they put into learning those skills (blue), and those values may be influenced by student characteristics (orange) and external characteristics (green). In this study, we aim to explore the components of this conceptual framework to better understand their relationships with one another and with students’ values of cross-disciplinary skills.


    FIGURE 1. Conceptual framework for understanding student values of cross-disciplinary skills in STEM. Dotted arrows indicate the potential influence of students’ values on their choice of courses and research experiences. These relationships are not the focus of the present study.

    PURPOSE

    Despite the previous research on cross-disciplinary skills and students’ values regarding these skills, to our knowledge, no study has sought to empirically identify an organizing framework for the skills. Further, to our knowledge, no study has disentangled the characteristics that are associated with students’ values of cross-disciplinary skills across institutions. We used a revised version of the STEP-U survey tool, described in the Methods, across four institutions to answer the following research questions:

    1. What distinct factors of student values of and classroom experiences with cross-disciplinary skills can be identified across the four institutions using the STEP-U survey?

    2. How are students’ values of cross-disciplinary skills related to research experience, reported classroom experiences, STEM discipline, sociodemographic characteristics, and institution?

    METHODS

    Participants and Context

    In Spring 2019, we collected survey data from 1162 of 2542 (46% response rate) graduating STEM students at four institutions in the Mid-Atlantic (see Table 3 for institutional characteristics). Across the different institutions, biology majors predominated, although there was variability across institutions in the majors sampled (Table 4). See Appendix A in the Supplemental Material for response rates by major.

    TABLE 3. Institutional characteristics and overall student population demographics

    Characteristic | University 1 | University 2 | University 3 | University 4
    Undergraduate enrollment | 27,000 | 16,000 | 40,000 | 20,000
    Carnegie classification | Very high research activity | Very high research activity | Very high research activity | Doctoral/professional university
    % Female undergraduate students (2018–2019) | 47% | 55% | 48% | 59%
    % White undergraduate students (2018–2019) | 49% | 57% | 62% | 54%

    TABLE 4. Participant characteristics at each institution

    Characteristic | University 1 | University 2 | University 3 | University 4^a
    Sample size | 295 | 342 | 484 | 41
    Overall response rate | 25% | 82% | 70% | 18%
    Majors (% sample):
     Biology | 33% | 81% | 34% | 55%
     Biochemistry | 6% | — | 10% | —
     Chemistry | 2% | 19% | 5% | 14%
     Computer science | 35% | — | 2% | —
     Math | 8% | — | 25% | 16%
     Physics | 5% | — | 4% | 9%
     Other^b | 12% | — | 20% | 7%
    Research experience | 46% | 39% | 59% | 44%
    Gender:
     Female | 42% | 64% | 52% | 74%
     Male | 58% | 36% | 48% | 26%
    Ethnicity:
     Asian | 27% | 21% | 8% | <1%
     Black | 8% | 2% | 4% | 2%
     Latino | 10% | 8% | 5% | <1%
     White | 49% | 57% | 61% | 65%
     American Indian/Alaska Native | <1% | — | — | —
     Native Hawaiian/Pacific Islander | <1% | — | — | —
     Multiple | 6% | 6% | 3% | <1%
     International (no ethnicity information given) | — | 1% | 15% | <1%
     Unknown | 1% | 4% | 3% | —

    ^a Demographic information from University 4 was obtained as total counts and includes all participants who responded to the survey (n = 43), although two surveys were later excluded from regression analyses due to missing data.

    ^b “Other” includes STEM majors for which the number of responses was ∼10 or fewer: environmental science, astronomy, data science, engineering, and science (an interdisciplinary major intended to provide a broad overview of science).

    Data Collection

    We used an adapted version of the STEP-U survey originally developed in 2011 by the University of Maryland (UMD) College of Computer, Mathematical, and Natural Sciences Teaching and Learning Center. The survey was designed to explore self-reported educational values and experiences of graduating students in biology and chemistry. We acknowledge the potential limitations with self-reported data; however, this approach aligns with previous studies measuring students’ development and value of skills that also use self-reporting (e.g., Carter et al., 2016; Lavi et al., 2021). The original instrument was validated and refined in an iterative process as the surveyed population was expanded to include graduating seniors from additional STEM disciplines, including computer science, physics, and mathematics (Marbach-Ad et al., 2012, 2014, 2016). Response process validity (Reeves and Marbach-Ad, 2016) of the survey was established through individual cognitive interviews with 25 senior students from five STEM disciplines. It was determined that the instrument provided valid and reliable measures when used with the undergraduate student population at UMD (Marbach-Ad et al., 2019).

    In 2018, we created a Regional Consortium for Change in Undergraduate STEM Education as a first step in exploring the generalizability of the STEP-U survey beyond one institution. We held a series of meetings with representatives of the four universities to establish content validity. During these meetings, we aimed to reach consensus in adapting the survey items to match the constructs they were intended to measure. We first considered focus group and cognitive interview feedback received from students and faculty members at UMD (Marbach-Ad et al., 2019). For example, some of the original items in the STEP-U included the word “science.” We learned from UMD faculty members and students that this term may have been distracting for students in mathematics and computer science, who may not view themselves as scientists. We revised these items to be more applicable across multiple STEM majors (e.g., “scientific writing” became “writing for a scholarly or professional audience,” and “understanding how science applies to the real world” became “understanding how your discipline applies to the real world”).

    The revised STEP-U survey used here consisted of 27 items asking students about the extent to which they valued specific cross-disciplinary skills (denoted as Values questions) and 27 items asking about their classroom experiences (denoted as Experiences questions). Of the 27 Values items, two (Computer programming; Using software appropriate for my discipline) were not included in the final data analysis. Their inclusion in the original factor model produced an uninterpretable factor structure, as these two items consistently loaded onto two factors. Thus, they were excluded from the final analysis, resulting in a more interpretable factor structure. This exclusion was also considered conceptually logical, as these items were substantively different from the other items and were not widely applicable to all majors. The Values questions asked students to “Rate the following skills in terms of importance to you in your undergraduate education,” where 1 = “not important” and 5 = “extremely important.” The Experiences questions were divided into two sections. One section included 12 items that asked students “In how many courses for your primary major did your instructors use these methods?” The other section included 15 items and asked students “In how many courses for your primary major were you asked to engage in the following?” Both sections used the same scale, where 1 = “never” and 5 = “in all of my courses.” In addition to the Values and Experiences questions, the survey also contained questions about students’ undergraduate research experiences and postgraduation plans (see Appendix B in the Supplemental Material for the full survey). Student sociodemographic information (e.g., race, gender) was obtained and connected to the data for Universities 1, 2, and 3. Sociodemographic information was provided for University 4 but was anonymized and could not be connected to the data.

    Surveys were administered during Spring 2019 to graduating students at each institution (Table 4). The authors obtained Institutional Review Board (IRB) approval for the study at their individual institutions and administered surveys using established mechanisms, which varied slightly depending upon institutional context and conventions. For example, at University 1, the survey was part of a long-standing graduation survey administered to science, mathematics, and computer science majors. The survey was administered online via Qualtrics. Students were recruited to take the survey via a link on the college commencement page, postings in the student newsletter, and direct emails. At University 2, the survey was administered online via Qualtrics at the department level for only biology and chemistry majors as a requirement for graduating seniors (although respondents had the ability to opt out of having their data used for research purposes). At University 3, the survey was administered online via Qualtrics and was included in the annual survey that students across the college complete as part of the graduation checkout process. University 4 did not have an existing culture of graduation surveys, so the online survey was sent to graduating STEM seniors via an email with a link to a Google Form. This was a voluntary opt-in survey, which, along with the newness of such a survey, may have negatively impacted the response rate.

    Data Analysis

    All analyses were conducted in R (R Core Team, 2020). To answer the research questions, we used EFA and multiple regression. Regressions were conducted using base R, while cluster-robust standard errors were computed with the sandwich package and variance explained by the predictors was computed with the relaimpo package (Zeileis, 2004; Grömping, 2006). EFAs were conducted using the psych package (Revelle, 2020). Separate EFAs were used to create composite scales for the Values and Experiences questions. Factors were extracted using principal axis factoring, and an oblimin rotation was used to determine the final factor structure. Scree plots were examined to give an idea of the total number of factors to extract. Based on the plots, we considered multiple numbers of factors to extract for both the Values (i.e., four-, five-, and six-factor solutions) and Experiences (i.e., two- and three-factor solutions). The final number of factors extracted was based on the solution that produced a clear simple structure and had factors with the strongest interpretability. While there are several methods for determining an appropriate number of factors in an EFA, this method is both straightforward to implement and results in an accurate number of factors that are also interpretable (Costello and Osborne, 2005).
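    To make this procedure concrete, the sketch below illustrates the extraction and rotation choices described above using the psych package. It is an illustration only: the data frame and item names (e.g., values_items) are assumptions made for this example, not the authors’ actual code.

```r
# Illustrative EFA workflow following the choices described in the text.
# "values_items" and the item names below are hypothetical placeholders.
library(psych)

# values_items: data frame of the retained Likert-type Values items
scree(values_items)                       # scree plot to suggest candidate factor counts

# Candidate solutions were compared (four-, five-, and six-factor models for Values)
efa_values <- fa(values_items,
                 nfactors = 4,            # final solution retained for interpretability
                 fm = "pa",               # principal axis factoring
                 rotate = "oblimin")      # oblique rotation; factors may correlate

print(efa_values$loadings, cutoff = 0.3)  # suppress loadings below 0.30

# Internal consistency (Cronbach's alpha) for one composite scale
alpha(values_items[, c("memorize_facts", "remember_formulas",
                       "remember_procedures", "memorize_large")])
```

    An analogous call with nfactors = 2 (or 3) would reproduce the comparison described for the Experiences items.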

    Once factors were finalized, we calculated mean scores on each factor for each student, which were used as the basis for the multiple regression analyses. Four multiple regression analyses were run, one per Values factor, to explore the relationship between each Values factor and the Experiences factors, research experience, major, sociodemographic characteristics, and university. Predictor variables included the students’ mean scores on the two Experiences factors, research experience (a dichotomous indicator of whether the student participated in an on- or off-campus research experience), primary STEM major (dummy coded with biology as the reference group), gender identity (a dichotomous indicator with male as the reference group), ethnicity (dummy coded with white as the reference group), and university (dummy coded with University 1 as the reference group). We ran the models using data from Universities 1–3 with sociodemographic variables, as sociodemographic data from University 4 were anonymous and could not be connected to the data. We then compared these findings to analyses using the full data set, excluding sociodemographic variables. Ultimately, we chose to report the results of the models containing the sociodemographic variables due to their theoretical importance, and thus data from University 4 are excluded from the reported results (see Appendix C in the Supplemental Material for the results of the model containing all data).
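    The sketch below shows how one of these four models (here, the Research and Writing outcome) could be specified. The data frame and variable names are assumptions made for illustration; only the coding choices (reference groups, mean factor scores as predictors) come from the text.

```r
# Sketch of one Values regression; object and variable names are hypothetical.
library(relaimpo)

dat$major  <- relevel(factor(dat$major),  ref = "Biology")       # biology as reference group
dat$ethnic <- relevel(factor(dat$ethnic), ref = "White")         # white as reference group
dat$univ   <- relevel(factor(dat$univ),   ref = "University 1")  # University 1 as reference

m_rw <- lm(values_research_writing ~            # mean score on Research and Writing factor
             exp_interactive + exp_procedural + # mean scores on the Experiences factors
             research_exp +                     # 0/1 participation in research
             female + ethnic + major + univ,    # sociodemographics, major, institution
           data = dat)
summary(m_rw)

# One common way to apportion explained variance among predictors
calc.relimp(m_rw, type = "lmg")
```

    The lmg metric averages a predictor’s contribution to R² over all orderings of the predictors; relaimpo offers it as one of several options for the kind of per-variable variance decomposition reported in the Results.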

    A potential concern with these data is that students are nested within universities, which can affect model standard errors if not addressed. We chose to use cluster-robust standard errors, which adjust the estimated regression standard errors to account for potential heteroskedasticity and the clustering of students within universities (Cameron et al., 2011). While a random effects model could also be used to improve standard error estimation, including fixed effects allows us to control for unobserved university-level characteristics, which we believe to be theoretically important. Additionally, we sought to examine how much variability in the outcome each predictor explained; this is straightforward under a standard ordinary least-squares framework but is not well developed for random effects models. Linearity and normality assumptions were checked graphically and appeared to be met. The data were analyzed with a complete case analysis. From the original sample of 1190, 16 respondents were removed from the Values EFA due to missing data, leaving a sample of 1174. For the Experiences EFA and regression analyses, another 12 respondents were removed due to missing data, leaving the final sample size of 1162. The regression models using data from Universities 1–3 had a total sample size of 1126 for all four models after incomplete cases were removed (all University 4 data were likewise excluded).
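    Continuing the hypothetical sketch above, the cluster-robust adjustment can be computed with vcovCL() from the sandwich package; lmtest (not named in the text) is used here only as one convenient way to print the adjusted coefficient table.

```r
# Cluster-robust standard errors, clustering students within universities.
library(sandwich)
library(lmtest)

vc <- vcovCL(m_rw, cluster = ~ univ)  # cluster-robust covariance matrix
coeftest(m_rw, vcov = vc)             # coefficient table with adjusted SEs
```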

    Finally, we created four plots to visually explore the descriptive differences in Values and Experiences by STEM discipline and institution. Because all four institutions included biology major participants, we plotted means across institutions for that discipline only, with one plot for the four Values factors and one for the two Experiences factors. We also plotted overall means for each discipline for each of the four Values as well as the two Experiences factors.
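    A minimal sketch of one such plot (biology majors’ mean Values factor scores by institution) is shown below; the tidyverse-based approach and all column names are assumptions, as the text does not specify how the figures were produced.

```r
# Sketch of one descriptive means plot; names are hypothetical placeholders.
library(dplyr)
library(tidyr)
library(ggplot2)

bio_means <- dat %>%
  filter(major == "Biology") %>%
  group_by(univ) %>%
  summarise(across(starts_with("values_"), ~ mean(.x, na.rm = TRUE))) %>%
  pivot_longer(-univ, names_to = "values_factor", values_to = "mean_score")

ggplot(bio_means, aes(x = values_factor, y = mean_score,
                      colour = univ, group = univ)) +
  geom_line() +
  geom_point() +
  labs(x = "Values factor",
       y = "Mean rating (1 = not important, 5 = extremely important)")
```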

    RESULTS

    Below we present the EFA findings to answer research question 1, then discuss descriptive findings and the linear regression models to answer research question 2.

    Factor Structure for Students’ Cross-Disciplinary Skill Values and Classroom Experiences

    We identified a four-factor structure for the Values questions, which accounted for 49.99% of the variance (Table 5). All factors had high internal consistency reliability (Cronbach’s α > 0.8). The grouping of the questions into these factors also made logical sense and was indicative of larger constructs. While we name each factor in the results for clarity, we explain the reasons for these names in the discussion. Factor 1, which we term Research and Writing, represents how much students valued skills related to designing research studies, finding literature, and writing about it. Factor 2, Memorization, represents how much students valued basic memorization. Conceptual and Data Application, Factor 3, represents values related to deriving understanding from data. Finally, Factor 4, which we are calling the Nature of the Discipline, represents how much students value a variety of skills that are central to the practice of the various STEM disciplines, such as collaboration, communication, and creativity.

    TABLE 5. Factor loadings for student values of cross-disciplinary skills^a

    Item | Factor 1 | Factor 2 | Factor 3 | Factor 4 | Mean | SD
    Factor 1: Research and Writing | | | | | 3.87 | 0.95
     Writing for a scholarly or professional audience | 0.54 | | | | 3.76 | 1.16
     Learning basic sets of laboratory skills | 0.67 | | | | 3.85 | 1.27
     Evaluating credibility of sources in your discipline | 0.70 | | | | 3.93 | 1.10
     Locating credible primary sources | 0.84 | | | | 3.86 | 1.15
     Understanding information presented in primary sources | 0.74 | | | | 4.13 | 1.00
     Designing research studies | 0.44 | | | | 3.65 | 1.17
    Factor 2: Memorization | | | | | 3.36 | 1.03
     Memorizing some basic facts | | 0.69 | | | 3.69 | 1.17
     Remembering formulas | | 0.84 | | | 2.93 | 1.25
     Remembering procedures or steps | | 0.78 | | | 3.48 | 1.16
     Memorizing large quantities of information | | 0.77 | | | 3.33 | 1.33
    Factor 3: Conceptual and Data Application | | | | | 4.40 | 0.54
     Acquiring major concepts in your discipline | | | 0.51 | | 4.56 | 0.64
     Applying quantitative reasoning | | | 0.60 | | 4.27 | 0.81
     Solving problems | | | 0.63 | | 4.54 | 0.67
     Drawing conclusions based on reason and evidence | | | 0.50 | | 4.41 | 0.81
     Analyzing data | | | 0.65 | | 4.38 | 0.81
     Interpreting data | | | 0.72 | | 4.36 | 0.82
     Decision-making based on evidence^b | | | 0.36 | | 4.25 | 0.85
    Factor 4: Nature of the Discipline | | | | | 3.87 | 0.69
     Developing entrepreneurial thinking | | | | 0.70 | 3.05 | 1.30
     Working in groups | | | | 0.51 | 3.52 | 1.13
     Developing creativity and innovation | | | | 0.72 | 3.89 | 1.02
     Understanding how your discipline applies to the real world | | | | 0.37 | 4.40 | 0.83
     Understanding the evolving nature of your discipline | | | | 0.31 | 4.15 | 0.89
     Developing an understanding that your discipline connects with other disciplines | | | | 0.56 | 4.11 | 0.93
     Developing oral communication skills | | | | 0.67 | 3.87 | 1.13
     Collaborating with peers | | | | 0.55 | 3.98 | 1.01
    Extracted sums of squared loadings | 3.68 | 3.28 | 3.04 | 2.50 | |
    Percentage of variance | 14.71 | 13.12 | 12.16 | 9.99 | |
    Construct reliability (α) | 0.87 | 0.85 | 0.83 | 0.84 | |

    ^a n = 1174 (includes Universities 1–4). Sample size is larger than in other analyses due to fewer missing data for these items. Likert-scale responses ranged from 1 = not important to 5 = extremely important.

    ^b Loaded above 0.30 on another factor and was treated as loading solely on the factor on which it had the highest loading.

    We identified a two-factor structure for the Experiences questions, which accounted for 31.91% of the variance (Table 6). The factors had moderate to high internal consistency reliability (Cronbach’s α > 0.75). Similar to the factor structure for the Values questions, the organization of the Experiences questions made logical sense. Factor 1, which we call Interactive/Evidence-Based Experiences, represents activities that have research support for improving student learning in the classroom, such as working in groups during class time. Factor 2, Procedural and Quantitative Experiences, represents common STEM classroom experiences that require the use of procedures or the application of information learned in a course. Three items did not have loadings above 0.3 on either factor and were removed from the analysis (Extensive lecturing, Emphasizing major concepts or theories, Answering questions from individual students in class).

    TABLE 6. Factor loadings for student-reported frequency of classroom experiences^a

    Item | Factor 1 | Factor 2 | Mean | SD
    Factor 1: Interactive/Evidence-Based Experiences | | | 3.08 | 0.66
     Working in groups during class time | 0.59 | | 2.66 | 0.91
     Teaching with an approach that emphasizes that your discipline connects with other disciplines | 0.60 | | 3.07 | 1.06
     Using evidence to support ideas | 0.65 | | 3.67 | 1.08
     Interpreting data | 0.58 | | 3.66 | 1.04
     Emphasizing the evolving nature of your discipline | 0.58 | | 3.43 | 1.08
     Analyzing data | 0.61 | | 3.67 | 1.02
     Engaging with content during class through non-lecture activities | 0.58 | | 2.88 | 1.03
     Communicating course goals and objectives to students | 0.34 | | 4.11 | 0.91
     Administering a pretest at the beginning of the semester to assess your prior knowledge | 0.48 | | 2.17 | 1.04
     Reading primary sources | 0.69 | | 3.25 | 1.08
     Designing research studies | 0.61 | | 2.43 | 1.02
     Completing assignments/activities that require creativity and innovation^b | 0.47 | | 2.99 | 1.08
     Oral presentations | 0.59 | | 2.41 | 0.93
     Writing assignments (reflective writing, journals, essays, reports) | 0.61 | | 2.79 | 1.15
     Relating course material to the real world | 0.59 | | 3.62 | 1.00
     Working in groups outside of class time | 0.49 | | 2.70 | 0.93
     Discussing and exchanging ideas with classmates during class time | 0.62 | | 2.86 | 0.96
    Factor 2: Procedural and Quantitative Experiences | | | 3.06 | 0.66
     Computer programming | | 0.70 | 2.36 | 1.29
     Requiring you to memorize large quantities of information | | −0.37 | 2.08 | 1.01
     Solving problems | | 0.35 | 4.30 | 0.84
     Taking exams that allow you to bring notes or a formula sheet | | 0.63 | 2.16 | 1.01
     Applying quantitative reasoning | | 0.35 | 3.83 | 0.96
     Assigning homework that counts toward final grade | | 0.49 | 3.76 | 1.00
     Using software appropriate for your discipline | | 0.70 | 2.91 | 1.16
    Extracted sums of squared loadings | 5.97 | 2.64 | |
    Percentage of variance | 22.12 | 9.79 | |
    Construct reliability (α) | 0.89 | 0.75 | |

    ^a n = 1162 (includes Universities 1–4). Likert-scale responses ranged from 1 = never to 5 = in all of my courses. Any loadings below 0.3 were treated as 0 and ignored.

    ^b Loaded above 0.30 on another factor and was treated as loading solely on the factor on which it had the highest loading.

    Descriptive Relationships between Experiences and Values Factors

    We were interested in exploring surface-level descriptive differences between universities across one major and between several majors across all four universities. When comparing descriptive differences in participants’ Values and Experiences between universities for biology majors, there are some notable similarities. Regardless of the magnitude of undergraduate biology majors’ perceived value at an institution, the relative ranking of the Values factors followed a similar trend across all institutions (Figure 2). Biology majors valued Memorization the least relative to other cross-disciplinary skills, although the mean value they placed on it varied substantially between institutions (range = 0.67). Conversely, Conceptual and Data Application was valued most highly across institutions, and students were consistent in the value they placed on these skills relative to other skills (range = 0.28), regardless of institution. Similar to perceived values, biology majors, irrespective of institution, reported more frequent Interactive/Evidence-Based Experiences than Procedural and Quantitative Experiences (Figure 3). There were also relatively consistent ranges in reported Interactive/Evidence-Based Experiences (range = 0.51) and Procedural and Quantitative Experiences (range = 0.44) across institutions. In other words, while there are differences in the actual mean values for biology majors’ reported values and experiences across institutions, the trends were nearly identical regardless of institution.


    FIGURE 2. Means plot for biology student values of cross-disciplinary skills across institutions. Total n = 569. Likert-scale responses ranged from 1 = not important to 5 = extremely important.


    FIGURE 3. Means plot for biology student experiences with cross-disciplinary skills across institutions. Total n = 569. Likert-scale responses ranged from 1 to 5, with 1 = none, 3 = half of my classes, and 5 = all of my classes.

    There were similar descriptive trends in the mean Values and Experiences across institutions by discipline. With the exception of computer science and math majors, students valued Memorization the least relative to their other skills, with large variation in how they valued this skill across disciplines (range = 0.94; Figure 4). In absolute terms, those majoring in computer science and physics rated Memorization the lowest, and those in the majors comprising the “other” category rated it the highest (see note in Table 4 for majors included in this category). All students, regardless of major, valued Conceptual and Data Application more than the other skills, with little variation between disciplines (range = 0.49). Computer science majors valued Conceptual and Data Application the lowest of all disciplines, with biology majors the second lowest. Finally, the value of Research and Writing skills varied most across disciplines (range = 1.59), with computer science and math majors valuing these skills much less than all other disciplines.


    FIGURE 4. Means plot for student values of cross-disciplinary skills across majors. Total n = 1162. Likert-scale responses ranged from 1 = not important to 5 = very important.

    There were no consistent trends across disciplines in the frequency of reported classroom experiences (Figure 5). There was a large range in the frequency of Procedural and Quantitative Experiences across disciplines (range = 1.10), with biology students reporting the lowest frequency and physics and computer science students reporting the highest. Participants across disciplines were more similar in their reported frequency of Interactive/Evidence-Based Experiences (range = 0.63), with computer science students reporting the lowest frequency and “other” students reporting the highest. Interestingly, the relative frequency of these two classroom experience factors differed depending on discipline. Biochemistry students, biology students, and those in the majors comprising the “other” category reported more frequent Interactive/Evidence-Based Experiences compared with Procedural and Quantitative Experiences, whereas computer science, math, and physics students reported the opposite. Chemistry students were the only disciplinary group who reported similar frequencies of Interactive/Evidence-Based Experiences and Procedural and Quantitative Experiences.


    FIGURE 5. Means plot for student experiences with cross-disciplinary skills across majors. Total n = 1162. Likert-scale responses ranged from 1 to 5, with 1 = none, 3 = half of my classes, and 5 = all of my classes.

    Factors Predicting Students’ Values Regarding Cross-Disciplinary Skills

    In addition to descriptively examining how the Experiences factors, research experience, university, and major relate to the Values factors, we ran regression analyses to further explore these relationships. Overall, students’ scores on the four Values factors were significantly related to classroom experiences, research experience, major, ethnicity, gender, and university (Table 7). Though each of the four models was significant, they explained different amounts of variability in their respective outcomes (from 15% to 45%). In particular, the Research and Writing skills regression model was significant, F(18, 1107) = 51.27, p < 0.0001, and accounted for 45% of the variance. Examining the unique variability explained by each variable provides a sense of how important that variable is in explaining the outcome when controlling for all other variables in the model. Students who valued research and writing skills tended to report more Interactive/Evidence-Based Experiences in the classroom and fewer Procedural and Quantitative Experiences, and these variables uniquely explained 12% and 0.3% of the variance in the outcome, respectively. Having a research experience was positively associated with valuing research and writing skills and accounted for 1% of the variability in the outcome. Gender identity and ethnicity were weak predictors of valuing research and writing skills, with gender explaining essentially none of the variability and ethnicity explaining 0.4% of the variability in the outcome. Differences across majors and universities both explained some variability in the outcome, with major explaining 7% overall and university explaining 1% overall. Thus, students’ reports of their Interactive/Evidence-Based Experiences and disciplinary differences were the two most important factors in explaining differences in values related to research and writing, while research experience, gender identity, ethnicity, and university were weakly predictive of this value.
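    For clarity (the formula itself is not printed in our methods), the “unique variability explained” reported here is the squared semi-partial correlation, computed as the drop in model fit when a predictor (or set of predictors) $X_j$ is removed from the full model:

$$ sr^2_{X_j} = R^2_{\text{full}} - R^2_{\text{full} \setminus X_j} $$

    For example, dropping the Interactive/Evidence-Based Experiences factor from the Research and Writing model reduces $R^2$ by 0.12, the value reported in Table 7.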

    TABLE 7. Regression model coefficients for Values factors outcomesa

    Entries are B (SE); the semi-partial r2 follows in square brackets where reported (for major, university, and race/ethnicity it is reported once for the set; see note a).

    Predictor variables | Research and Writing | Memorization | Conceptual and Data Application | Nature of the Discipline
    Intercept | 2.25*** (0.11) | 1.57*** (0.23) | 3.20*** (0.27) | 2.43*** (0.13)
    Experiences:
     Interactive | 0.62*** (0.07) [0.12] | 0.53*** (0.06) [0.07] | 0.28*** (0.05) [0.06] | 0.46*** (0.03) [0.10]
     Procedural | −0.12 (0.05) [0.003] | −0.07 (0.06) [0.001] | 0.12* (0.06) [0.01] | 0.02 (0.04) [0]
    Research experience | 0.22*** (0.04) [0.01] | −0.02 (0.07) [0] | 0.07*** (0.01) [0.003] | −0.003 (0.03) [0]
    Major:
     Other | −0.16 (0.11) [0.07] | 0.29* (0.14) [0.02] | 0.13*** (0.03) [0.04] | 0.09*** (0.01) [0.01]
     Biochemistry | 0.29*** (0.08) | −0.07 (0.04) | 0.19*** (0.01) | 0.19*** (0.03)
     Chemistry | 0.28*** (0.03) | −0.20 (0.13) | 0.19*** (0.03) | 0.17*** (0.04)
     Computer science | −0.82*** (0.12) | −0.07 (0.18) | −0.29** (0.10) | −0.11 (0.06)
     Math | −0.54** (0.17) | 0.34*** (0.06) | 0.12*** (0.01) | −0.10*** (0.01)
     Physics | 0.12 (0.07) | −0.16 (0.18) | 0.12 (0.06) | −0.12*** (0.03)
    Female | 0.02*** (0.004) [0] | 0.05 (0.06) [0] | 0.03*** (0.01) [0.001] | 0.08*** (0.01) [0.003]
    Race/ethnicity:
     Asian | 0.09 (0.08) [0.004] | 0.30*** (0.06) [0.02] | −0.03 (0.05) [0.02] | 0.07 (0.17) [0.003]
     Black | 0.03 (0.04) | 0.31*** (0.06) | −0.07 (0.08) | 0.05 (0.06)
     International | 0.14** (0.05) | 0.26** (0.09) | −0.26*** (0.06) | −0.10 (0.06)
     Latino | −0.09 (0.07) | 0.19 (0.17) | −0.18*** (0.01) | 0.004 (0.03)
     Multiracial | −0.001 (—) | 0.12* (0.05) | −0.06 (0.05) | 0.02 (0.14)
     Other | −0.13 (0.21) | 0.20** (0.06) | −0.20 (0.12) | −0.03 (0.09)
    University:
     University 2 | −0.09 (0.06) [0.01] | 0.34*** (0.04) [0.01] | −0.11*** (0.02) [0.01] | −0.21*** (0.02) [0.01]
     University 3 | 0.16*** (0.02) | 0.16*** (0.02) | −0.10*** (0.01) | −0.01 (0.04)
    Adjusted R2 | 0.45 | 0.15 | 0.21 | 0.21

    a For all models, n = 1126 (includes Universities 1–3 and complete data sets only). Standard errors are heteroskedasticity-consistent standard errors. Biology was the reference category for major. University 1 was the reference category for university. White is the reference category for race; the “other” race category includes those with unknown backgrounds and American Indian and Pacific Islanders (due to low counts). For major, university, and race/ethnicity, the semi-partial r2 is computed for the set of all categories as a whole and listed with the first category only.

    *p < 0.05.

    **p < 0.01.

    ***p < 0.001.
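    As a minimal sketch of how models like those in Table 7 can be fit, the R code below pairs ordinary least squares with heteroskedasticity-consistent standard errors (the estimator class cited via Zeileis, 2004) and computes a semi-partial r2 by the drop-in-R2 method. The data frame `d`, the column names, and the specific "HC3" variant are assumptions for illustration, not our exact specification.

```r
library(sandwich)  # heteroskedasticity-consistent covariance (Zeileis, 2004)
library(lmtest)    # coeftest() for tests with a user-supplied vcov

# Full model for one Values outcome (hypothetical column names)
full <- lm(research_writing ~ interactive + procedural + research_exp +
             major + female + ethnicity + university, data = d)

# Coefficients with heteroskedasticity-consistent SEs, as in Table 7
coeftest(full, vcov = vcovHC(full, type = "HC3"))

# Semi-partial r2 for a predictor (set): refit without it, compare R2
reduced <- update(full, . ~ . - interactive)
summary(full)$r.squared - summary(reduced)$r.squared  # ~0.12 reported above
```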

    The Memorization regression model was significant, F(18, 1107) = 11.73, p < 0.0001, and accounted for 15% of the variance. Students who valued Memorization tended to report more frequent Interactive/Evidence-Based Experiences, and this factor uniquely explained 7% of the variability in memorization; however, there were no significant relationships between valuing memorization and reported frequency of Procedural and Quantitative Experiences. There was also no relationship between research experience or gender identity and the extent to which students valued memorization skills. Ethnicity, disciplinary differences, and differences across the universities all uniquely explained a small proportion of the overall variability in the memorization factor. Overall, this set of predictors did not explain much variability in the outcome compared with the other models.

    The regression model for students’ values of Conceptual and Data Application was also statistically significant, F(18, 1107) = 17.42, p < 0.0001, and accounted for 21% of the variance in responses. There was a positive relationship between students’ values and the frequency of their Interactive/Evidence-Based Experiences and Procedural and Quantitative Experiences, although Interactive/Evidence-Based Experiences uniquely explained more variability than Procedural and Quantitative Experiences (6% vs. 1%). While the relationship between values of Conceptual and Data Application and students’ research experiences was statistically significant, research experience was not a strong predictor and explained only 0.3% of the variability in the outcome. However, there were differences in these values based on major. Differences across majors represented the second-most important predictor of valuing Conceptual and Data Application skills, with these differences accounting for 5% of the variability in the outcome. Differences in gender identity did not represent an important predictor, explaining only 0.1% of the variability in the outcome, and differences in ethnicity accounted for a small proportion of the total variability (2%). Differences across universities uniquely accounted for only a slight portion of the overall variability (1%).

    Finally, the Nature of the Discipline regression model was significant, F(18, 1107) = 17.23, p < 0.0001, and accounted for 21% of the variance in the outcome. Students who reported more frequent Interactive/Evidence-Based Experiences tended to value the skills related to the nature of STEM disciplines more highly. This factor uniquely accounted for most of the variability in the outcome (10%). Differences across majors and differences across universities uniquely accounted for only a small portion of the variance (both 1%), while gender identity and ethnicity were both even weaker predictors, both explaining 0.3% of the variability in the outcome. More frequent Procedural and Quantitative Experiences and reporting a research experience were not important predictors.

    DISCUSSION

    In the following sections, we discuss the results according to the research questions (RQ).

    RQ1: What Distinct Factors of Student Values and Classroom Experiences Can Be Identified across the Four Institutions Using the STEP-U Survey?

    We used a revised version of the STEP-U instrument across four different institutions to measure the extent to which graduating STEM students valued cross-disciplinary skills and the frequency with which they reported experiencing classroom activities related to those skills. Using EFA, we found a four-factor structure for participants’ Values (Research and Writing, Memorization, Conceptual and Data Application, and Nature of the Discipline) and a two-factor structure for participants’ Experiences (Interactive/Evidence-Based Experiences and Procedural and Quantitative Experiences). The EFA for both the Values and Experiences questions resulted in a simple, interpretable structure, with reasonable variance explained and adequate internal consistency reliability.

    Factor analysis of “Values” data obtained with an earlier version of the survey indicated a two-factor structure, with two items grouped under a factor referred to as “Retention” and the remaining 12 items grouped together under a second factor labeled “Transfer” (Marbach-Ad et al., 2016). While the “Retention” factor was easily interpretable, the “Transfer” factor served as a catchall for many different skills related to the generation and application of scholarly knowledge. The current study, which used an expanded set of 27 values items, represents a refinement of these earlier results. The new Memorization factor is comparable to the previous “Retention” factor, with the only difference being that one triple-barreled item from the previous survey (“Remember formulas, structures, and procedures”) was split into three separate items in the revised survey. The items that previously comprised the “Transfer” factor were expanded and refined, which resulted in identification of three new, more easily interpretable factors in this study. This new factor structure facilitates future investigations that seek to gain deeper insight into the relationship between student attitudes, experiences, and career trajectories. We also acknowledge that some potentially informative items (i.e., Computer programming, Using software appropriate for my discipline) were not encompassed by this factor structure and were omitted from the analyses reported here, but may be of interest to future researchers.

    While some previous studies have used EFA to create factors for cross-disciplinary skills (e.g., Marbach-Ad et al., 2016; Lavi et al., 2021), this is the first study, to our knowledge, to do so for both students’ perceived values and their reported frequency of related classroom experiences. Plotting Values and Experiences factor means for biology majors across institutions and for all majors demonstrated similar trends across the factors, providing preliminary evidence that these factors describe constructs that are viewed similarly across majors and institutions. In the next sections, we elaborate on each factor and explore why specific skills/experiences clustered under a unique factor, situating the discussion in the literature as well as in our own interpretations.

    Interpreting the Values Factors

    The Research and Writing factor represents skills related to designing research studies (Learning basic sets of laboratory skills, Designing research studies), finding literature (Locating credible primary sources, Understanding information presented in primary sources, Evaluating credibility of sources in your discipline), and communicating findings in writing (Writing for a scholarly or professional audience). The need to develop research and writing skills in undergraduate education for workforce preparation has received increasing attention in recent years. Historically, workers were expected to hold skills related to developing, distributing, and consuming products; however, with growing access to resources through technology, workers need to develop skills such as accessing, managing, integrating, evaluating, and creating information (Griffin et al., 2012). Griffin and colleagues (2012) also suggest that, in the 21st century, education systems should aim to prepare workers with a different set of skills in order to adapt to the worldwide move from an industrial-based to an information-based economy.

    The Memorization and Conceptual and Data Application factors align with Mayer’s (2002) differentiation between knowledge acquisition (i.e., retention) and the use of knowledge in a variety of new situations (i.e., transfer). Building on Bloom’s taxonomy, Mayer proposed that retention is “the ability to remember material at some later time in much the same way it was presented during instruction” (p. 226), which aligns with skills grouped in the Memorization factor related to simple knowledge acquisition or retention (Memorizing some basic facts, Remembering formulas, Remembering procedures or steps, and Memorizing large quantities of information). Mayer (2002) defined transfer as “the ability to use what was learned to solve new problems, answer new questions, or facilitate learning new subject matter” (p. 226), which aligns closely with factor 3, Conceptual and Data Application. This factor represents values related to conceptual knowledge (Drawing conclusions based on reason and evidence, Applying quantitative reasoning, Solving problems, Acquiring major concepts in your discipline, Decision-making based on evidence) and data application (Analyzing data, Interpreting data).

    The Nature of the Discipline factor represents skills related to understanding the dynamic nature of the discipline (Understanding the evolving nature of your discipline, Understanding how your discipline applies to the real world, Developing an understanding that your discipline connects with other disciplines, Developing entrepreneurial thinking, Developing creativity and innovation) as well as skills that enable students to successfully adapt to the workplace (Working in groups, Developing oral communication skills, Collaborating with peers). These skills align closely with nature of science, or science as a way of knowing (Lederman et al., 2002; Wheeler et al., 2019), as well as the 21st-century skills described by Binkley et al. (2012; Table 8).

    TABLE 8. Alignment of Nature of Discipline Factor with the literature

    Factor 4: Nature of the Discipline | 21st-century skillsa | Nature of Scienceb
    Developing entrepreneurial thinking | Ways of thinking | Science is creative and inferential.
    Working in groups | Ways of working | Science is collaborative.
    Developing creativity and innovation | Ways of thinking | Science is creative and inferential.
    Understanding how your discipline applies to the real world | — | Science is socially and culturally embedded.
    Understanding the evolving nature of your discipline | — | Science is tentative and revisionary.
    Developing an understanding that your discipline connects with other disciplines | — | Scientific knowledge is gained through a variety of methods.
    Developing oral communication skills | Ways of working | —
    Collaborating with peers | Ways of working | Science is collaborative.

    aAs organized by Binkley et al. (2012).

    bAs characterized by Wheeler et al. (2019).

    When evaluating the Values factors overall in light of the organizational structures proposed by previous researchers (Table 1), we see that all the studies referred in some way to skills related to research and writing, but they included them under different categories. For example, Binkley et al. (2012) included Information Literacy and Information and Communication Technology (ICT) Literacy skills under the category Tools of Working, arguing that, in the 21st century, information literacy will be a necessary work tool. Other researchers grouped writing or research skills under the category Communication Skills (Adedokun et al., 2013; Heron and McNeil, 2016) and sometimes even grouped oral and written communication as one item (Rayner and Papakonstantinou, 2015; Lavi et al., 2021; Viskupic et al., 2021).

    Both Adedokun and colleagues (2013) and Lavi and colleagues (2021) provided empirical evidence to group together oral and written communication skills; however, their focus was on students’ perceived development of cross-disciplinary skills, rather than student values. From a motivational perspective, while highly valuing cross-disciplinary skills may result in the development of these skills, expectancy also plays a role in the effort students put forth to learn a skill. Our factor analysis resulted in “Writing for a scholarly audience” falling into factor 1 (Research and Writing) and “oral communication” falling into factor 4 (Nature of the Discipline). We suspect that this difference may be attributed to the limited empirical evidence in the literature for organizing cross-disciplinary skills. To our knowledge, only three previous studies have gathered evidence to support the validity of their survey tools (Adedokun et al., 2013; Marbach-Ad et al., 2016; Lavi et al., 2021), and of these, only one (Marbach-Ad et al., 2016) sought to characterize the extent to which students value cross-disciplinary skills. However, the 14 survey items in Marbach-Ad et al. (2016) did not include oral communication skills. They only included the skill Scientific Writing, which was grouped under a broad Transfer Skills category. More research is needed to understand what written and oral communication skills are developed and how they are valued.

    It is noteworthy that, while we did not ask students to rank-order the values they attributed to various cross-disciplinary skills, we did observe that students across institutions and disciplines nearly always valued the Memorization factor less than the other three factors. Further, very few previous studies included simple knowledge acquisition in their organizational frameworks (Rayner and Papakonstantinou, 2015; Heron and McNeil, 2016; Marbach-Ad et al., 2016), and yet disciplinary knowledge is the foundation for all disciplinary and cross-disciplinary skill development (Bloom, Krathwohl, and Masia, 1956; Fink, 2013). Both Marbach-Ad et al. (2016) and this study support the value of knowledge acquisition as a skill that students need to develop, despite it not always being perceived as valuable. Students interviewed by Marbach-Ad et al. (2016) explained that they value simple memorization of facts, as it is a requisite skill for developing other skills (Marbach-Ad et al., 2019). Additional research is needed to better understand how students perceive memorization skills and how these skills relate to the development of more sophisticated cognitive skills.

    Interpreting the Experiences Factors

    In the present study, we used classroom experiences questions that aligned with various cross-disciplinary skills and identified a two-factor structure of Experiences that includes active-learning practices: Interactive/Evidence-Based Experiences and Procedural and Quantitative Experiences. The Interactive/Evidence-Based Experiences factor included the types of activities that engage students in the thinking process through communication (e.g., Oral presentations, Writing assignments), collaboration (e.g., Working in groups during class time), and application (e.g., Relating course material to the real world). The Procedural and Quantitative Experiences factor represented more foundational classroom activities that engage students in their learning process, especially through computational (e.g., Using software appropriate for your discipline) or quantitative activities (e.g., Applying quantitative reasoning).

    In the literature, classification schemes for learning activities are generally developed conceptually/theoretically (e.g., Walter et al., 2016) and either place practices on a continuum from student centered (e.g., group work) to instructor centered (e.g., extensive lecturing) or bifurcate them into active learning and traditional instruction. Previous research has also presented data from faculty (e.g., Dancy and Henderson, 2007) and students (e.g., Freeman et al., 2014; Cavanagh et al., 2018) that further support framing instructional practices around students’ active engagement in the learning process. In the present study, active-learning classroom experiences were found under both factors. The more passive instructional practices (sometimes referred to as instructor-centered practices or traditional instruction), such as extensive lecturing and answering questions from individual students in class, did not load strongly on either of the Experiences factors and so were excluded. Additional research is needed to better understand how students’ classroom experiences related to the development of cross-disciplinary skills align with various approaches to organizing instructional practices.

    RQ2: How Are Students’ Values Related to Institution, STEM Discipline, Research Experience, Sociodemographic Characteristics, and Reported Classroom Experiences?

    In addition to identifying an empirically based structure for students’ perceived Values and Experiences related to cross-disciplinary skills, we also ran four linear regression models to explore relationships between students’ values, classroom experiences, undergraduate research experience, STEM discipline, gender, ethnicity, and institution.

    Controlling for research experience and for institutional, gender, ethnicity, and disciplinary differences, we found that the Interactive/Evidence-Based Experiences factor was related to each of the Values factors and consistently explained the most variability in the outcomes. In other words, the more frequently students reported experiencing classroom activities such as designing research studies, giving oral presentations, and working in groups, the more likely they were to report valuing all of the cross-disciplinary skills factors. This finding aligns with previous work demonstrating a positive relationship between evidence-based classroom experiences and cross-disciplinary skills such as communication, collaboration, and critical thinking (e.g., Marbach-Ad et al., 2016; Demaria et al., 2018). It further extends that work by demonstrating the importance of evidence-based classroom activities for students’ value of cross-disciplinary skills related to research practices.

    The Procedural and Quantitative Experiences factor was positively associated with the Conceptual and Data Application factor, which suggests that students who experience classroom activities related to problem solving and applying quantitative reasoning also value these skills. However, this relationship was weaker than that of Interactive/Evidence-Based Experiences in explaining differences in reported Conceptual and Data Application values. There was also no relationship between the Procedural and Quantitative Experiences factor and the Research and Writing, Memorization, or Nature of the Discipline values factors, suggesting that students’ more foundational classroom experiences may not relate to these particular cross-disciplinary values.

    From an EVT of motivation perspective, our findings support a conceptual model for how classroom experiences can influence the extent to which students value, and therefore might be motivated to develop, a broad range of cross-disciplinary skills. In particular, the Interactive/Evidence-Based Experiences factor, which includes experiences such as non–lecture based in-class activities, group work, and opportunities to relate course material to the real world, may allow students to discover value in their learning and improve motivation (e.g., Curry et al., 2020). In contrast, the Procedural and Quantitative Experiences factor, which includes experiences such as solving problems, graded homework assignments, and computer programming, appears less strongly related to student values, perhaps because these instructional practices can be more rote in nature and their value less readily apparent to students. Previous research has demonstrated a relationship between students’ beliefs and motivation for learning (e.g., Paulsen and Feldman, 2005; Husain, 2014), so the lack of relationship between Procedural and Quantitative Experiences and cross-disciplinary values may be related to students’ beliefs regarding these classroom practices. These motivation-related differences in instructional experiences provide a reasonable explanation for why the Interactive/Evidence-Based Experiences factor was more strongly related to, and accounted for more variance in, each Values factor compared with Procedural and Quantitative Experiences. Further research is warranted to identify how the nuances of specific instructional practices, and student beliefs regarding those practices, might influence the development of student values and their motivation to develop related cross-disciplinary skills.

    We also observed that students’ research experience was significantly related to their value of Research and Writing and, to a lesser extent, Conceptual and Data Application. This finding supports previous research and current recommendations that UREs can enhance students’ research and communication skills (e.g., Gilmore et al., 2015; Carter et al., 2016). However, research experience was not related to the other two Values factors. Given that UREs provide students with authentic scientific experiences, it is surprising that research experience was not related to students’ value of the Nature of the Discipline. This lack of relationship further supports the aforementioned claim that student beliefs may play a role in how they value the Nature of the Discipline. Alternatively, students who engage in research experiences may already hold similar values or beliefs about the nature of the discipline. It would be beneficial to further tease out which components of research experiences (e.g., mentoring, laboratory work, collaboration) relate to the cross-disciplinary skills that students value.

    When controlling for all other variables, we observed that students’ sociodemographic characteristics (i.e., gender, race/ethnicity) were significant predictors for many of the values but accounted for minimal variance in the outcomes (0 to 0.4%). This finding is surprising, given the body of literature demonstrating differences in undergraduate STEM students’ experiences and outcomes based on their race and gender (e.g., Rainey et al., 2018; Witherspoon and Schunn, 2019) and the motivational literature (e.g., Roksa and Whitley, 2018). However, sociodemographic characteristics are not why outcome differences exist; rather, they are a proxy for sociocultural variables that may be attributed to students with different racial and gender identities (Eddy and Brownell, 2016). In the present study, we found that students’ cross-disciplinary values were not outcomes that could be explained by students’ sociodemographic characteristics, which may provide additional insight into understanding when, and for what outcome measures, sociodemographic variables can be proxies for sociocultural variables. Recent research has found that students’ intersectional identities (e.g., Black female, white male) may be more important when examining outcomes in STEM undergraduate education (Rainey et al., 2018; Van Dusen and Nissen, 2020). Thus, the limited explanatory power of students’ sociodemographic variables in the present study may be a result of treating students’ race and gender as separate variables in the regression models. Further research on how race and gender relate to students’ cross-disciplinary skill values is warranted.
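    If future work were to model the intersectional identities discussed above, one option is an interaction specification rather than separate main effects. A sketch under the same hypothetical naming as before:

```r
library(sandwich)
library(lmtest)

# Race-by-gender interaction in place of separate main effects only
intersectional <- lm(research_writing ~ interactive + procedural +
                       research_exp + major + university +
                       female * ethnicity, data = d)
coeftest(intersectional, vcov = vcovHC(intersectional, type = "HC3"))
```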

    Finally, we observed that both the disciplinary context and the institution were related to students’ values of the four cross-disciplinary skills factors. The largest variation in the relationship between discipline and values was for Research and Writing values, where differences across majors represented the second most important variable for explaining differences in reported values. Descriptively, we also observed similar trends in cross-disciplinary values for biology majors across institutions, although students at University 2 consistently valued cross-disciplinary skills less than students at the other institutions (Figure 2). These data add to a growing understanding of the similarities and differences that exist across STEM disciplines (e.g., Reinholz et al., 2019) and institutions. Further exploration of students’ perceptions and values of cross-disciplinary skills and their relationship to disciplinary structures, people, symbols, and power (Reinholz et al., 2019) may help confirm which values span disciplines, regardless of institution, and which are specific to institutional types.

    Limitations

    This cross-disciplinary, cross-institutional study allowed for more generalizability than single-discipline and single-institution studies; however, there are a few limitations. First, because the survey was not mandatory at all of the institutions, the sample is composed largely of self-selected respondents. Additionally, students from some universities are disproportionately represented in the sample, potentially because of those universities’ culture of distributing exit surveys. Both of these sampling limitations may have yielded findings representative of only certain types of students. Second, students reported their experiences retrospectively, just before graduating, rather than contemporaneously over the course of their undergraduate studies. This retrospective self-reporting could introduce bias, although there is evidence that retrospective reports can be accurate, given students’ opportunity to reflect on their experiences as a whole (Volkwein et al., 2007). Third, while the study was conducted across four institutions, these institutions were all predominantly white institutions of high research activity in the Mid-Atlantic, which may limit the generalizability of our findings to institutions with differing characteristics. Further research extending to historically Black colleges and universities, Hispanic-serving institutions, and community colleges would help generalize the present study’s findings. Finally, the retrospective design captures student values at only one point in the educational trajectory (graduation), which makes it impossible to assess how various facets of students’ characteristics and experiences are causally related to their values. Student values likely change over the course of their undergraduate studies and during their initial steps toward postgraduate employment or graduate education. A deeper investigation into this process would require longitudinal data fine grained enough to allow proper temporal ordering of the variables of interest, as well as a rich set of covariates to rule out confounding.

    CONCLUSION

    Recent calls for undergraduate STEM instructional reforms suggest that there is a gap between the skills students have and the skills employers desire (Jang, 2016). To better prepare students for their postgraduate careers, there is a need for more research on the association between students’ experiences and the skill levels that they develop through these experiences. In this study, we measured students’ values of cross-disciplinary skills, which are important for their success in future careers. Here, we summarize the theoretical, methodological, and practical contributions of this study and suggest additional directions for future research.

    Theoretical Contribution

    Our study aimed to increase the generalizability of prior work by including multiple disciplines and institutions. The data analysis shows that, while it is possible to conduct research across multiple disciplines, it is important to consider nuanced differences that may be present. This is also the case for institutional differences, as institutional types and cultures could influence students’ values. Future studies should gather information from larger populations from varied institutions and disciplines. This would increase the ability to probe for potential interaction effects. For example, our model only tested main effects of classroom experiences on values. It is possible that a particular type of experience has differing effects on students majoring in different disciplines at different types of institutions (e.g., liberal arts, research-intensive, community college). Additionally, these differences could be examined more thoroughly through qualitative methods such as interviews and focus groups.
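    As a sketch of the kind of interaction test proposed here (hypothetical column names, as before), one could compare a main-effects model against one that lets the effect of classroom experiences vary by discipline:

```r
# Does the effect of Interactive/Evidence-Based Experiences differ by major?
main_only  <- lm(research_writing ~ interactive + major + university, data = d)
with_inter <- lm(research_writing ~ interactive * major + university, data = d)
anova(main_only, with_inter)  # F-test for the added interaction terms
```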

    Methodological Contribution

    Research studies exploring perceptions of skills by students, faculty, and/or employers suggest there are differences in the relative value placed on these skills (e.g., Jang, 2016; Imafuku et al., 2018). Here, we revised a previously validated survey tool, the STEP-U, which can be used to probe perceived values and experiences not only with undergraduate students but also, potentially, with faculty, graduate students, and employers. It could also be used longitudinally to examine the development of student values over time by measuring values at additional time points, such as matriculation, postgraduation, and in the workplace. Finally, the tool could be used to further explore nuances in the relationship between experiences and values by examining interactions between classroom experiences and STEM discipline.

    Practical Contribution

    The results of this study could be shared with faculty to spur conversations about teaching and learning. Research shows that faculty find data about their own students’ thoughts, values, and understandings compelling (Marbach-Ad et al., 2010, 2019), perhaps more so than published findings from other institutions. Therefore, department-level conversations around relevant STEP-U data could be particularly beneficial for promoting discussions about instructional methods. In addition, evidence regarding associations between research experience and student values could help promote opportunities for undergraduate research and other practical experiences. The STEP-U could then be used to evaluate whether the implementation of these additional opportunities impacts students’ values.

    In summary, our findings demonstrate the relationships between students’ values of cross-disciplinary skills and their classroom experiences, research experiences, and contextual experiences within their disciplines and institutions. This provides validity evidence that the four values factors represent somewhat distinct values that could be enhanced by different types of experiences.

    FOOTNOTES

    1There is some disagreement on whether ability beliefs and self-efficacy are subsumed into expectancy beliefs. For more details, see Wigfield and Eccles (2000), Pajares (1996), and Husain (2014).

    ACKNOWLEDGMENTS

    The preparation of this article was supported in part by a grant from NSF (#1524832) to APLU’s Network of STEM Education Centers (NSEC). This work has been approved by the Institutional Review Boards of the four universities. We thank the students who participated in this study.

    REFERENCES

  • Adedokun, O. A., Bessenbacher, A. B., Parker, L. C., Kirkham, L. L., & Burgess, W. D. (2013). Research skills and STEM undergraduate research students’ aspirations for research careers: Mediating effects of research self-efficacy. Journal of Research in Science Teaching, 50(8), 940–951. https://doi.org/10.1002/tea.21102
  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved December 10, 2021, from www.visionandchange.org/VC_report.pdf
  • American Chemical Society. (2015). ACS guidelines and evaluation procedures for bachelor’s degree programs. Retrieved December 10, 2021, from www.acs.org/content/dam/acsorg/about/governance/committees/training/2015-acs-guidelines-for-bachelors-degree-programs.pdf
  • Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble, M. (2012). Defining twenty-first century skills. In Griffin, P., McGaw, B., & Care, E. (Eds.), Assessment and teaching of 21st century skills (pp. 17–66). Dordrecht, Netherlands: Springer. https://doi.org/10.1007/978-94-007-2324-5_2
  • Bloom, B. S., Krathwohl, D. R., & Masia, B. B. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, NY: Longman.
  • Cameron, A. C., Gelbach, J. B., & Miller, D. L. (2011). Robust inference with multiway clustering. Journal of Business & Economic Statistics, 29(2), 238–249. https://doi.org/10.1198/jbes.2010.07136
  • Carter, D. F., Ro, H. K., Alcott, B., & Lattuca, L. R. (2016). Co-curricular connections: The role of undergraduate research experiences in promoting engineering students’ communication, teamwork, and leadership skills. Research in Higher Education, 57(3), 363–393. https://doi.org/10.1007/s11162-015-9386-7
  • Cavanagh, A. J., Chen, X., Bathgate, M., Frederick, J., Hanauer, D. I., & Graham, M. J. (2018). Trust, growth mindset, and student commitment to active learning in a college science course. CBE—Life Sciences Education, 17(1), 1–8. https://doi.org/10.1187/cbe.17-06-0107
  • Cerasoli, C. P., Nicklin, J. M., & Ford, M. T. (2014). Intrinsic motivation and extrinsic incentives jointly predict performance: A 40-year meta-analysis. Psychological Bulletin, 140(4), 980–1008. https://doi.org/10.1037/a0035661
  • Costello, A. B., & Osborne, J. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research, and Evaluation, 10(7), 1–9.
  • Curry, K. W., Spencer, D., Pesout, O., & Pigford, K. (2020). Utility value interventions in a college biology lab: The impact on motivation. Journal of Research in Science Teaching, 57(2), 232–252. https://doi.org/10.1002/tea.21592
  • Dancy, M., & Henderson, C. (2007). Framework for articulating instructional practices and conceptions. Physical Review Special Topics—Physics Education Research, 3(1), 010103.
  • Demaria, M., Hodgson, Y., & Czech, D. (2018). Perceptions of transferable skills among biomedical science students in the final year of their degree: What are the implications for graduate employability? International Journal of Innovation in Science and Mathematics Education, 26(7), 11–24.
  • Eddy, S. L., & Brownell, S. E. (2016). Beneath the numbers: A review of gender disparities in undergraduate education across science, technology, engineering, and math disciplines. Physical Review Physics Education Research, 12, 1–20. https://doi.org/10.1103/PhysRevPhysEducRes.12.020106
  • Ferrell, B., Phillips, M. M., & Barbera, J. (2016). Connecting achievement motivation to performance in general chemistry. Chemistry Education Research and Practice, 17, 1054–1066. https://doi.org/10.1039/C6RP00148C
  • Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to designing college courses. Hoboken, NJ: Wiley.
  • Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, USA, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111
  • Gayles, J. G., & Ampaw, F. (2014). The impact of college experiences on degree completion in STEM fields at four-year institutions: Does gender matter? Journal of Higher Education, 85(4), 439–468. https://doi.org/10.1353/jhe.2014.0022
  • Gilmore, J., Vieyra, M., Timmerman, B., Feldon, D., & Maher, M. (2015). The relationship between undergraduate research participation and subsequent research performance of early career STEM graduate students. Journal of Higher Education, 86(6), 834–863. https://doi.org/10.1080/00221546.2015.11777386
  • Griffin, P., McGaw, B., & Care, E. (Eds.) (2012). The changing role of education and schools. In Assessment and teaching of 21st century skills (pp. 1–16). Dordrecht, Netherlands: Springer.
  • Grömping, U. (2006). Relative importance for linear regression in R: The package relaimpo. Journal of Statistical Software, 17(1), 1–27.
  • Hart Research Associates. (2015). Falling short? College learning and career success. Washington, DC.
  • Heron, P., & McNeil, L. (2016). Phys21: Preparing physics students for 21st-century careers. College Park, MD: American Physical Society and American Association of Physics Teachers.
  • Hulleman, C. S., Godes, O., Hendricks, B. L., & Harackiewicz, J. M. (2010). Enhancing interest and performance with a utility value intervention. Journal of Educational Psychology, 102(4), 880–895.
  • Husain, U. K. (2014). Relationship between self-efficacy and academic motivation. In International Conference on Economics, Education and Humanities (ICEEH’14), December 10–11, 2014, Bali, Indonesia (pp. 18–22). https://doi.org/10.15242/icehm.ed1214132
  • Imafuku, R., Yasuda, S., Hashimoto, K., Matsunaga, D., Ohashi, Y., Yamamoto, K., ... & Saiki, T. (2018). Exploring medical students’ and faculty’s perspectives on benefits of undergraduate research experience. Medical Science Educator, 28(3), 553–560. https://doi.org/10.1007/s40670-018-0593-7
  • Jang, H. (2016). Identifying 21st century STEM competencies using workplace data. Journal of Science Education and Technology, 25(2), 284–301.
  • Koçak, Ö., & Göksu, İ. (2020). Examining 21st century skill levels of students and the relationship between skills. Inonu University Journal of the Faculty of Education, 21(2), 772–784. https://doi.org/10.17679/inuefd.656784
  • Lavi, R., Tal, M., & Dori, Y. J. (2021). Perceptions of STEM alumni and students on developing 21st century skills through methods of teaching and learning. Studies in Educational Evaluation, 70, 101002. https://doi.org/10.1016/j.stueduc.2021.101002
  • Lederman, N. G., Abd-El-Khalick, F., Bell, R. L., & Schwartz, R. (2002). Views of nature of science questionnaire (VNOS): Toward valid and meaningful assessment of learners’ conceptions of nature of science. Journal of Research in Science Teaching, 39(6), 497–521. https://doi.org/10.1002/tea.10034
  • Linn, M. C., Palmer, E., Baranger, A., Gerard, E., & Stone, E. (2015). Undergraduate research experiences: Impacts and opportunities. Science, 347(6222), 1261757. https://doi.org/10.1126/science.1261757
  • Macphee, D., Farro, S., & Canetto, S. S. (2013). Academic self-efficacy and performance of underrepresented STEM majors: Gender, ethnic, and social class patterns. Analyses of Social Issues and Public Policy, 13(1), 347–369. https://doi.org/10.1111/asap.12033
  • Marbach-Ad, G., Hunt, C., & Thompson, K. V. (2019). Exploring the values undergraduate students attribute to cross-disciplinary skills needed for the workplace: An analysis of five STEM disciplines. Journal of Science Education and Technology, 28(5), 452–469. https://doi.org/10.1007/s10956-019-09778-8
  • Marbach-Ad, G., McAdams, K. C., Benson, S., Briken, V., Cathcart, L., Chase, M., ... & Smith, A. C. (2010). A model for using a concept inventory as a tool for students’ assessment and faculty professional development. CBE—Life Sciences Education, 9(4), 408–416.
  • Marbach-Ad, G., Rietschel, C., & Thompson, K. V. (2016). Validation and application of the Survey of Teaching Beliefs and Practices for Undergraduates (STEP-U): Identifying factors associated with valuing important workplace skills among biology students. CBE—Life Sciences Education, 15(4). https://doi.org/10.1187/cbe.16-05-0164
  • Marbach-Ad, G., Schaefer, K. L., Kumi, B. C., Friedman, L. A., Thompson, K. V., & Doyle, M. P. (2012). Development and evaluation of a prep course for chemistry graduate teaching assistants at a research university. Journal of Chemical Education, 89(7), 865–872. https://doi.org/10.1021/ed200563b
  • Marbach-Ad, G., Schaefer, K. L., & Thompson, K. V. (2012). Faculty teaching philosophies, reported practices, and concerns inform the design of professional development activities of a disciplinary teaching and learning center. Journal on Centers for Teaching and Learning, 4, 119–137.
  • Marbach-Ad, G., Ziemer, K. S., Orgler, M., & Thompson, K. V. (2014). Science teaching beliefs and reported approaches within a research university: Perspectives from faculty, graduate students, and undergraduates. International Journal of Teaching and Learning in Higher Education, 26(2), 232–250.
  • Mayer, R. E. (2002). Rote versus meaningful learning. Theory Into Practice, 41(4), 226–232.
  • McGunagle, D., & Zizka, L. (2020). Employability skills for 21st-century STEM students: The employers’ perspective. Higher Education, Skills and Work-Based Learning, 10(3), 591–606. https://doi.org/10.1108/HESWBL-10-2019-0148
  • Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66(4), 543–578. https://doi.org/10.3102/00346543066004543
  • Palmer, D. (2005). A motivational view of constructivist-informed teaching. International Journal of Science Education, 27(15), 1853–1881. https://doi.org/10.1080/09500690500339654
  • Paulsen, M. B., & Feldman, K. A. (2005). The conditional and interaction effects of epistemological beliefs on the self-regulated learning of college students: Motivational strategies. Research in Higher Education, 46(7), 731–768. https://doi.org/10.1007/s11162-004-6224-8
  • Pike, G. R., & Killian, T. S. (2001). Reported gains in student learning: Do academic disciplines make a difference? Research in Higher Education, 42(4), 429–454. https://doi.org/10.1023/A:1011054825704
  • President’s Council of Advisors on Science and Technology. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Washington, DC: U.S. Government Office of Science and Technology.
  • R Core Team (2020). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing.
  • Rainey, K., Dancy, M., Mickelson, R., Stearns, E., & Moller, S. (2018). Race and gender differences in how sense of belonging influences decisions to major in STEM. International Journal of STEM Education, 5(1). https://doi.org/10.1186/s40594-018-0115-6
  • Rayner, G., & Papakonstantinou, T. (2015). Employer perspectives of the current and future value of STEM graduate skills and attributes: An Australian study. Journal of Teaching and Learning for Graduate Employability, 6(1), 100. https://doi.org/10.21153/jtlge2015vol6no1art576
  • Reeves, T. D., & Marbach-Ad, G. (2016). Contemporary test validity in theory and practice: A primer for discipline-based education researchers. CBE—Life Sciences Education, 15(1), rm1.
  • Reinholz, D. L., Ngai, C., Quan, G., Pilgrim, M. E., Corbo, J. C., & Finkelstein, N. (2019). Fostering sustainable improvements in science education: An analysis through four frames. Science Education, 103(5), 1125–1150. https://doi.org/10.1002/sce.21526
  • Revelle, W. (2020). psych: Procedures for personality and psychological research. Evanston, IL: Northwestern University. Retrieved December 15, 2021, from https://cran.r-project.org/package=psych
  • Roksa, J., & Whitley, S. E. (2018). Fostering academic success of first-year students: Exploring the roles of motivation, race, and faculty. Journal of College Student Development, 58(3), 333–348.
  • Sarkar, M., Overton, T., Thompson, C., & Rayner, G. (2016). Graduate employability: Views of recent science graduates and employers. International Journal of Innovation in Science and Mathematics Education, 24(3), 31–48.
  • Siekmann, G. (2016). What is STEM? The need for unpacking its definitions and applications. National Centre for Vocational Education Research. Retrieved December 10, 2021, from http://www.p21.org/our-work/p21-framework
  • Van Dinther, M., Dochy, F., & Segers, M. (2011). Factors affecting students’ self-efficacy in higher education. Educational Research Review, 6(2), 95–108. https://doi.org/10.1016/j.edurev.2010.10.003
  • Van Dusen, B., & Nissen, J. (2020). Equity in college physics student learning: A critical quantitative intersectionality investigation. Journal of Research in Science Teaching, 57(1), 33–57.
  • Viskupic, K., Egger, A. E., McFadden, R. R., & Schmitz, M. D. (2021). Comparing desired workforce skills and reported teaching practices to model students’ experiences in undergraduate geoscience programs. Journal of Geoscience Education, 69(1), 27–42. https://doi.org/10.1080/10899995.2020.1779568
  • Volkwein, J. F., Lattuca, L. R., Harper, B. J., & Domingo, R. J. (2007). Measuring the impact of professional accreditation on student experiences and learning outcomes. Research in Higher Education, 48(2), 251–282.
  • Walter, E. M., Henderson, C. R., Beach, A. L., & Williams, C. T. (2016). Introducing the Postsecondary Instructional Practices Survey (PIPS): A concise, interdisciplinary, and easy-to-score survey. CBE—Life Sciences Education, 15(4), 1–11. https://doi.org/10.1187/cbe.15-09-0193
  • Wheeler, L. B., Mulvey, B. K., Maeng, J. L., Librea-Carden, M. R., & Bell, R. L. (2019). Teaching the teacher: Exploring STEM graduate students’ nature of science conceptions in a teaching methods course. International Journal of Science Education, 41(14), 1905–1925. https://doi.org/10.1080/09500693.2019.1647473
  • Wigfield, A., & Eccles, J. S. (2000). Expectancy–value theory of achievement motivation. Contemporary Educational Psychology, 25(1), 68–81. https://doi.org/10.1006/ceps.1999.1015
  • Witherspoon, E. B., & Schunn, C. D. (2019). Locating and understanding the largest gender differences in pathways to science degrees. Science Education, 104(2), 144–163. https://doi.org/10.1002/sce.21557
  • Zeileis, A. (2004). Econometric computing with HC and HAC covariance matrix estimators. Journal of Statistical Software, 11(10), 1–17. https://doi.org/10.18637/jss.v011.i10